2025-02-19


LocalMind

LocalMind is a local LLM chat app fully compatible with the Model Context Protocol. It uses Azure OpenAI as its LLM backend, and you can connect it to any MCP server out there.

Local Development

Create a .env file in the backend folder:

APP_CONFIG_FILE_PATH=config.yaml
AZURE_OPENAI_API_KEY=x
AZURE_OPENAI_DEPLOYMENT=x
AZURE_OPENAI_ENDPOINT=https://x.openai.azure.com
AZURE_OPENAI_API_VERSION=2024-07-01-preview
AZURE_OPENAI_CHAT_MODEL=gpt-4o
AZURE_OPENAI_EMBEDDINGS_MODEL=embedding
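For reference, here is a minimal sketch (an assumption, not taken from the LocalMind backend code) of how these variables can be loaded and handed to the Azure OpenAI client from the `openai` package:

```python
def load_env(path: str) -> dict:
    """Parse simple KEY=VALUE lines from a .env file into a dict."""
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blanks, comments, and malformed lines.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env

def make_client(env: dict):
    # Requires `pip install openai`. These three values are the ones the
    # AzureOpenAI client consumes; the deployment, chat model, and
    # embeddings model names are passed per request instead.
    from openai import AzureOpenAI
    return AzureOpenAI(
        api_key=env["AZURE_OPENAI_API_KEY"],
        api_version=env["AZURE_OPENAI_API_VERSION"],
        azure_endpoint=env["AZURE_OPENAI_ENDPOINT"],
    )
```

The exact wiring in LocalMind may differ; this only illustrates which variables matter to the client itself.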

Create a config.yaml file in your backend folder:

server:
- name: [SERVER_NAME]
  command: [SERVER_COMMAND]
  args:
  - [SERVER_ARGS]
[...]
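For illustration, a hypothetical filled-in entry using the community filesystem MCP server (the server choice and the directory path are assumptions, not part of LocalMind):

```yaml
server:
- name: filesystem
  command: npx
  args:
  - -y
  - "@modelcontextprotocol/server-filesystem"
  - /Users/me/Documents
```

Each entry gives the backend a display name plus the command and arguments used to spawn that MCP server process.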

To work on the frontend in the browser with the Python backend up and running:

./dev.sh frontend-dev

To run the Tauri app in development mode with the Python backend:

./dev.sh app-dev

RAG MCP Server

If you would like to use or work on the RAG MCP Server, first create a .env file in the rag folder:

AZURE_OPENAI_API_KEY=x
AZURE_OPENAI_DEPLOYMENT=x
AZURE_OPENAI_ENDPOINT=https://x.openai.azure.com
AZURE_OPENAI_API_VERSION=2024-07-01-preview
AZURE_OPENAI_CHAT_MODEL=gpt-4o
AZURE_OPENAI_EMBEDDINGS_MODEL=embedding

Create a venv and install dependencies:

cd rag
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

Then add the following config entry to your config.yaml in your backend folder:

server:
- name: rag
  command: [ABSOLUTE_PATH]/rag/.venv/bin/python3
  args:
  - [ABSOLUTE_PATH]/rag/main.py

Important

Currently, LocalMind only works with the Azure OpenAI Service.

Related Recommendations

  • NiKole Maxwell
  • I craft unique cereal names, stories, and ridiculously cute Cereal Baby images.

  • Joshua Armstrong
  • Confidential guide on numerology and astrology, based of GG33 Public information

  • https://suefel.com
  • Latest advice and best practices for custom GPT development.

  • Callycode Limited
  • A geek-themed horoscope generator blending Bitcoin prices, tech jargon, and astrological whimsy.

  • Emmet Halm
  • Converts Figma frames into front-end code for various mobile frameworks.

  • Elijah Ng Shi Yi
  • Advanced software engineer GPT that excels through nailing the basics.

  • https://maiplestudio.com
  • Find Exhibitors, Speakers and more

  • apappascs
  • Discover the most complete and up-to-date collection of MCP servers on the market. This repository serves as a centralized hub, offering an extensive catalog of open-source and proprietary MCP servers, complete with features, documentation links, and contributors.

  • ShrimpingIt
  • MicroPython-based I2C manipulation of the MCP-series GPIO expander, derived from AdaFruit_MCP230xx

  • OffchainLabs
  • Ethereum proof-of-stake implementation

  • huahuayu
  • A unified API gateway for integrating multiple Etherscan-like blockchain explorer APIs, with Model Context Protocol (MCP) support for AI assistants.

  • deemkeen
  • Control your MBOT2 with a power combo: MQTT+MCP+LLM

    Reviews

    2 (1)
    user_wFKyLCMR
    2025-04-16

    Localmind by timosur is a fantastic MCP application focusing on dynamic local knowledge sharing. Its intuitive interface and seamless integration make it a must-have tool for anyone looking to gain insights and connect with locals in realtime. The detailed documentation and supportive community enhance its overall usability. Highly recommend exploring this innovative product!