
Dive AI Agent 🤿 🤖
Dive is an open-source MCP Host Desktop Application that seamlessly integrates with any LLMs supporting function calling capabilities. ✨
Features 🎯
- 🌐 Universal LLM Support: Compatible with ChatGPT, Anthropic, Ollama and OpenAI-compatible models
- 💻 Cross-Platform: Available for Windows, MacOS, and Linux
- 🔄 Model Context Protocol: Enables seamless MCP AI agent integration in both stdio and SSE modes
- 🌍 Multi-Language Support: Traditional Chinese, Simplified Chinese, English, Spanish with more coming soon
- ⚙️ Advanced API Management: Multiple API keys and model switching support
- 💡 Custom Instructions: Personalized system prompts for tailored AI behavior
- 🔄 Auto-Update Mechanism: Automatically checks for and installs the latest application updates
Recent Updates (2025/3/14)
- 🌍 Spanish Translation: Added Spanish language support
- 🤖 Extended Model Support: Added Google Gemini and Mistral AI models integration
Download and Install ⬇️
Get the latest version of Dive:
For Windows users: 🪟
- Download the .exe version
- Python and Node.js environments are pre-installed
For MacOS users: 🍎
- Download the .dmg version
- You need to install the Python and Node.js environments (with `npx` and `uvx` available) yourself
- Follow the installation prompts to complete setup
For Linux users: 🐧
- Download the .AppImage version
- You need to install the Python and Node.js environments (with `npx` and `uvx` available) yourself
- Run `chmod +x` on the downloaded file to make the AppImage executable
- For Ubuntu/Debian users:
  - You may need to add the `--no-sandbox` parameter, or modify system settings to allow sandboxing
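Concretely, the Linux steps above look like this (the AppImage filename below is illustrative; substitute the name of the file you actually downloaded):

```shell
# Stand-in for the real download, so these commands can be tried as-is
touch Dive-linux-x86_64.AppImage

# Make the AppImage executable
chmod +x Dive-linux-x86_64.AppImage

# Launch it; on Ubuntu/Debian, add --no-sandbox if startup fails with a sandbox error
# ./Dive-linux-x86_64.AppImage --no-sandbox
```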
MCP Tips
While the system comes with a default echo MCP Server, your LLM can access more powerful tools through MCP. Here's how to get started with some beginner-friendly tools: Fetch, Filesystem, and yt-dlp.
Quick Setup
Add this JSON configuration to your Dive MCP settings to enable these tools:

```json
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": [
        "mcp-server-fetch",
        "--ignore-robots-txt"
      ],
      "enabled": true
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/files"
      ],
      "enabled": true
    },
    "youtubedl": {
      "command": "npx",
      "args": [
        "@kevinwatt/yt-dlp-mcp"
      ],
      "enabled": true
    }
  }
}
```
Using SSE Server for MCP
You can also connect to an external MCP server via SSE (Server-Sent Events). Add this configuration to your Dive MCP settings:
```json
{
  "mcpServers": {
    "MCP_SERVER_NAME": {
      "enabled": true,
      "transport": "sse",
      "url": "YOUR_SSE_SERVER_URL"
    }
  }
}
```
Additional Setup for yt-dlp-mcp
yt-dlp-mcp requires the yt-dlp package. Install it based on your operating system:
Windows:

```shell
winget install yt-dlp
```

MacOS:

```shell
brew install yt-dlp
```

Linux:

```shell
pip install yt-dlp
```
Build 🛠️
See BUILD.md for more details.
Connect With Us 🌐
- 💬 Join our Discord
- 🐦 Follow us on Twitter/X
- ⭐ Star us on GitHub
- 🐛 Report issues on our Issue Tracker