
LocalMind

LocalMind is a local LLM chat app fully compatible with the Model Context Protocol. It uses Azure OpenAI as its LLM backend, and you can connect it to any MCP server out there.

Local Development

Create a .env file in the backend folder:

APP_CONFIG_FILE_PATH=config.yaml
AZURE_OPENAI_API_KEY=x
AZURE_OPENAI_DEPLOYMENT=x
AZURE_OPENAI_ENDPOINT=https://x.openai.azure.com
AZURE_OPENAI_API_VERSION=2024-07-01-preview
AZURE_OPENAI_CHAT_MODEL=gpt-4o
AZURE_OPENAI_EMBEDDINGS_MODEL=embedding
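
For illustration, here is a minimal sketch of how these variables could be consumed, assuming python-dotenv and the official openai package; this is only an assumption about the setup, not necessarily how the LocalMind backend is implemented:

import os
from dotenv import load_dotenv
from openai import AzureOpenAI

load_dotenv()  # reads the .env file from the current working directory

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version=os.environ["AZURE_OPENAI_API_VERSION"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

# On Azure OpenAI, the "model" argument is the deployment name.
response = client.chat.completions.create(
    model=os.environ["AZURE_OPENAI_DEPLOYMENT"],
    messages=[{"role": "user", "content": "Hello from LocalMind"}],
)
print(response.choices[0].message.content)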

Create a config.yaml file in your backend folder:

server:
- name: [SERVER_NAME]
  command: [SERVER_COMMAND]
  args:
  - [SERVER_ARGS]
[...]
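
As a concrete illustration, a hypothetical filled-in entry for a Node-based MCP server launched via npx might look like the following; the server name, package, and path are examples, not part of LocalMind:

server:
- name: filesystem
  command: npx
  args:
  - "-y"
  - "@modelcontextprotocol/server-filesystem"
  - "/path/to/allowed/directory"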

To work on the frontend in the browser with the Python backend up and running:

./dev.sh frontend-dev

To run the Tauri app in development mode with the Python backend:

./dev.sh app-dev

RAG MCP Server

If you would like to use or work on the RAG MCP Server, first create a .env file in the rag folder:

AZURE_OPENAI_API_KEY=x
AZURE_OPENAI_DEPLOYMENT=x
AZURE_OPENAI_ENDPOINT=https://x.openai.azure.com
AZURE_OPENAI_API_VERSION=2024-07-01-preview
AZURE_OPENAI_CHAT_MODEL=gpt-4o
AZURE_OPENAI_EMBEDDINGS_MODEL=embedding

Create a venv and install the dependencies:

cd rag
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

Then add the following config entry to your config.yaml in your backend folder:

server:
- name: rag
  command: [ABSOLUTE_PATH]/rag/.venv/bin/python3
  args:
  - [ABSOLUTE_PATH]/rag/main.py
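
To show what such an entry launches, here is a minimal sketch of an MCP server entry point, assuming the official mcp Python SDK (FastMCP); it is only an illustration under that assumption, not the actual rag/main.py:

# main.py - hypothetical stdio MCP server skeleton
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("rag")

@mcp.tool()
def search_documents(query: str) -> str:
    """Hypothetical tool: return chunks relevant to the query."""
    # A real RAG server would embed the query and search a vector store here.
    return f"Results for: {query}"

if __name__ == "__main__":
    mcp.run()  # defaults to stdio transport, which the backend spawns via the command/args above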

Important

LocalMind currently works only with the Azure OpenAI Service.
