MCP Server: Memory


This project is derived from the original @modelcontextprotocol/server-memory package, modified for my specific needs.

An MCP (Model Context Protocol) server providing persistent memory capabilities for AI models through a knowledge graph. This allows models like Claude to retain and recall information across interactions.

Overview

This server implements the Model Context Protocol and acts as a bridge between an AI model and a persistent knowledge graph stored locally. It allows the model to:

  • Create and manage entities (people, places, concepts, etc.).
  • Define relationships between entities.
  • Store observations or facts associated with entities.
  • Search and retrieve information from the knowledge graph.
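
As a rough illustration of how a client drives these operations, the request below sketches an MCP tools/call that creates a single entity with an attached observation. The tool name create_entities and the entities/entityType/observations fields follow the upstream server-memory package and are assumptions here; this fork may expose different tool names or schemas.

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "create_entities",
    "arguments": {
      "entities": [
        {
          "name": "Alice",
          "entityType": "person",
          "observations": ["Prefers concise answers"]
        }
      ]
    }
  }
}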

By default, the knowledge graph is stored in a knowledge_graph.json file in the current working directory where the server is run.
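
For orientation, a minimal sketch of what that file might contain is shown below, assuming a straightforward serialization of the entity/relation/observation model described above; the exact schema used by this fork is an assumption and may differ.

{
  "entities": [
    {
      "name": "Alice",
      "entityType": "person",
      "observations": ["Prefers concise answers", "Works at Acme Corp"]
    },
    {
      "name": "Acme Corp",
      "entityType": "organization",
      "observations": []
    }
  ],
  "relations": [
    { "from": "Alice", "to": "Acme Corp", "relationType": "works_at" }
  ]
}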

Usage with Claude Desktop

This server is primarily designed to be used with MCP-compatible clients like the Claude Desktop application. You configure it within the client's settings.

Example mcpServers Configuration:

{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "mcp-server-memory"],
      "env": {
        "MEMORY_FILE_PATH": "/path/to/your/custom_memory.json"
      }
    }
  }
}
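
In this configuration, the -y flag lets npx run the published package without an interactive install prompt, which matters because Claude Desktop launches the command in the background. The MEMORY_FILE_PATH environment variable (shown with a placeholder path) is presumably how this fork overrides the default knowledge_graph.json location described above; omitting the env block should fall back to that default. For a quick standalone check, the same command can also be run from a terminal, e.g. MEMORY_FILE_PATH=/path/to/your/custom_memory.json npx -y mcp-server-memory, after which the server waits for an MCP client on standard input/output (the usual transport for servers launched this way, though this fork's transport is assumed here).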

...[rest of existing content remains unchanged...]

