
LocalMind
LocalMind is a local LLM chat app fully compatible with the Model Context Protocol. It uses Azure OpenAI as an LLM backend, and you can connect it to any MCP server out there.
Local Development
Create a .env file in the backend folder:
APP_CONFIG_FILE_PATH=config.yaml
AZURE_OPENAI_API_KEY=x
AZURE_OPENAI_DEPLOYMENT=x
AZURE_OPENAI_ENDPOINT=https://x.openai.azure.com
AZURE_OPENAI_API_VERSION=2024-07-01-preview
AZURE_OPENAI_CHAT_MODEL=gpt-4o
AZURE_OPENAI_EMBEDDINGS_MODEL=embedding
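As a rough illustration of how a backend might consume these variables, here is a minimal, hypothetical sketch (the function name and error handling are assumptions, not LocalMind's actual code):

```python
# Hypothetical sketch: collect the Azure OpenAI settings from the
# environment and fail early if any are missing. LocalMind's actual
# config loading may differ.
import os

AZURE_KEYS = [
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_DEPLOYMENT",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_API_VERSION",
    "AZURE_OPENAI_CHAT_MODEL",
    "AZURE_OPENAI_EMBEDDINGS_MODEL",
]

def load_azure_settings() -> dict:
    """Return the Azure OpenAI settings, raising if any are unset."""
    missing = [k for k in AZURE_KEYS if not os.environ.get(k)]
    if missing:
        raise RuntimeError(f"Missing required settings: {', '.join(missing)}")
    return {k: os.environ[k] for k in AZURE_KEYS}
```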
Create a config.yaml file in your backend folder:
server:
  - name: [SERVER_NAME]
    command: [SERVER_COMMAND]
    args:
      - [SERVER_ARGS]
[...]
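As an illustration, a config.yaml wired up to the reference filesystem MCP server (run via npx) might look like the sketch below; the server name and the directory path are example values, not part of LocalMind:

```yaml
server:
  - name: filesystem
    command: npx
    args:
      - "-y"
      - "@modelcontextprotocol/server-filesystem"
      - "/path/to/allowed/dir"
```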
To work on the frontend in the browser with the Python backend up and running:
./dev.sh frontend-dev
To run the Tauri app in development mode with the Python backend:
./dev.sh app-dev
RAG MCP Server
If you would like to use or work on the RAG MCP Server, first create a .env file in the rag folder:
AZURE_OPENAI_API_KEY=x
AZURE_OPENAI_DEPLOYMENT=x
AZURE_OPENAI_ENDPOINT=https://x.openai.azure.com
AZURE_OPENAI_API_VERSION=2024-07-01-preview
AZURE_OPENAI_CHAT_MODEL=gpt-4o
AZURE_OPENAI_EMBEDDINGS_MODEL=embedding
Create a venv and install the dependencies:
cd rag
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
Then add the following config entry to your config.yaml in your backend folder:
server:
  - name: rag
    command: [ABSOLUTE_PATH]/rag/.venv/bin/python3
    args:
      - [ABSOLUTE_PATH]/rag/main.py
Important
Currently only works with Azure OpenAI Service.