
LocalMind
LocalMind is a local LLM chat app fully compatible with the Model Context Protocol (MCP). It uses Azure OpenAI as its LLM backend, and you can connect it to any MCP server.
Local Development
Create a .env file in the backend folder:
APP_CONFIG_FILE_PATH=config.yaml
AZURE_OPENAI_API_KEY=x
AZURE_OPENAI_DEPLOYMENT=x
AZURE_OPENAI_ENDPOINT=https://x.openai.azure.com
AZURE_OPENAI_API_VERSION=2024-07-01-preview
AZURE_OPENAI_CHAT_MODEL=gpt-4o
AZURE_OPENAI_EMBEDDINGS_MODEL=embedding
Create a config.yaml file in your backend folder:
server:
  - name: [SERVER_NAME]
    command: [SERVER_COMMAND]
    args:
      - [SERVER_ARGS]
  [...]
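For example, a filled-in entry might launch an MCP server via npx; the server name, package, and allowed path below are purely illustrative and not part of LocalMind itself:
server:
  # illustrative example; replace with the MCP server you actually want to run
  - name: filesystem
    command: npx
    args:
      - -y
      - "@modelcontextprotocol/server-filesystem"
      - /path/to/allowed/files
Each entry in the server list is started with the given command and args, so any MCP server that can be launched from the command line can be listed here.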
To work on the frontend in the browser with the Python backend up and running:
./dev.sh frontend-dev
To run the Tauri app in development mode with the Python backend:
./dev.sh app-dev
RAG MCP Server
If you would like to use or work on the RAG MCP Server, first create a .env file in the rag folder:
AZURE_OPENAI_API_KEY=x
AZURE_OPENAI_DEPLOYMENT=x
AZURE_OPENAI_ENDPOINT=https://x.openai.azure.com
AZURE_OPENAI_API_VERSION=2024-07-01-preview
AZURE_OPENAI_CHAT_MODEL=gpt-4o
AZURE_OPENAI_EMBEDDINGS_MODEL=embedding
Create a venv and install dependencies:
cd rag
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
Then add the following config entry to the config.yaml in your backend folder:
server:
  - name: rag
    command: [ABSOLUTE_PATH]/rag/.venv/bin/python3
    args:
      - [ABSOLUTE_PATH]/rag/main.py
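If your config.yaml already lists other servers, the rag entry is simply appended to the same server list. A combined file might look like this (the filesystem entry and paths are illustrative):
server:
  # illustrative existing server entry
  - name: filesystem
    command: npx
    args:
      - -y
      - "@modelcontextprotocol/server-filesystem"
      - /path/to/allowed/files
  # RAG MCP Server from this repository
  - name: rag
    command: [ABSOLUTE_PATH]/rag/.venv/bin/python3
    args:
      - [ABSOLUTE_PATH]/rag/main.py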
Important
LocalMind currently works only with the Azure OpenAI Service.