
Installation
- cd IpServer && npm install && npm run build && npm run start
- Install a local MongoDB server and serve it at mongodb://127.0.0.1:27017 (one way to do this is sketched after this list).
- git clone git@github.com:danny-avila/LibreChat.git && cd LibreChat && mv .env.example .env && npm install && npm run frontend && npm run backend
- Add the following configuration to your librechat.yaml file:
  mcpServers:
    ipServer:
      # type: sse # type can optionally be omitted
      url: http://localhost:3000/sse
      timeout: 60000 # 1 minute timeout for this server, this is the default timeout for MCP servers.

  endpoints:
    custom:
      - name: "Ollama"
        apiKey: "ollama"
        # use 'host.docker.internal' instead of localhost if running LibreChat in a docker container
        baseURL: "http://localhost:11434/v1/chat/completions"
        models:
          default:
            [
              "qwen2.5:3b-instruct-q4_K_M",
              "mistral:7b-instruct-q4_K_M",
              "gemma:7b-instruct-q4_K_M",
            ]
          # fetching list of models is supported but the `name` field must start
          # with `ollama` (case-insensitive), as it does in this example.
          fetch: true
        titleConvo: true
        titleModel: "current_model"
        summarize: false
        summaryModel: "current_model"
        forcePrompt: false
        modelDisplayLabel: "Ollama"
- Download and run Ollama, pull a model from https://ollama.ai/models/, and serve Ollama at http://localhost:11434/ (see the sketch after this list).
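
The MongoDB and Ollama steps above can be done in several ways; the commands below are one minimal sketch, assuming Docker is available for MongoDB and the ollama CLI is installed. The container name is hypothetical and the model tag is just the first one listed in the librechat.yaml above; adjust both to your setup.

  # Run a local MongoDB on the default port (Docker is an assumption; a native install works too)
  docker run -d --name librechat-mongo -p 27017:27017 mongo

  # Start the Ollama server (skip if Ollama is already running as a service),
  # then pull one of the models referenced in librechat.yaml
  ollama serve &
  ollama pull qwen2.5:3b-instruct-q4_K_M

  # Quick check that the IpServer SSE endpoint from librechat.yaml is reachable
  curl -N http://localhost:3000/sse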
Usage
- Visit http://localhost:3080/ to see the LibreChat UI.
- Create a new agent named "Ollama", select Ollama as the model provider, and select a model.
- Click the Add Tools button below and add the get-external-ip, get-local-ip-v6, get-external-ip-v6, and get-local-ip tools.
- Ask the agent "what's my local ip address?", "what's my external ip address?", "what's my external ipv6 address?", or "what's my internal ipv6 address?"
- The agent should invoke your tools and return the results (you can cross-check the answers with the commands after this list).
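
To sanity-check what the agent returns, you can query a public IP-echo service directly; this is only a comparison aid, the service URLs are an assumption on my part, and the IpServer tools may resolve addresses differently.

  # Cross-check the agent's answers (ipify is an assumed helper, not part of the IpServer tools)
  curl https://api.ipify.org         # external IPv4
  curl -6 https://api64.ipify.org    # external IPv6 (requires IPv6 connectivity)
  ip addr show                       # local addresses on Linux; use ifconfig/ipconfig elsewhere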
Reviews

user_tYRk86fC
I recently discovered the mcp-server-demo by dev-johnny-gh and I am thoroughly impressed! It's a powerful and well-documented server application that showcases robust functionality and easy deployment. The GitHub repository is well-organized and the welcome information is clear, making the setup process seamless. I highly recommend checking out mcp-server-demo at https://github.com/dev-johnny-gh/mcp-server-demo for an excellent example of server-side development.