mcp-pkm-logseq MCP server

An MCP server for interacting with your Logseq Personal Knowledge Management (PKM) system using custom instructions

Components

Resources

  • logseq://guide - Initial instructions on how to interact with this knowledge base

Tools

  • get_personal_notes_instructions() - Get instructions on how to use the personal notes tool
  • get_personal_notes(topics, from_date, to_date) - Retrieve personal notes from Logseq that are tagged with the specified topics
  • get_todo_list(done, from_date, to_date) - Retrieve the todo list from Logseq
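
As a rough sketch of how these tools can be exercised outside of a chat client, the example below uses the MCP Python SDK (the mcp package) to launch the server over stdio and call get_personal_notes. The use of uvx and the "project-x" topic are illustrative assumptions; adapt them to your own setup.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server the same way the Claude Desktop config does (via uvx).
    params = StdioServerParameters(
        command="uvx",
        args=["mcp-pkm-logseq"],
        env={
            "LOGSEQ_API_TOKEN": "your-logseq-api-token",  # your real token here
            "LOGSEQ_URL": "http://localhost:12315",
        },
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "project-x" is a placeholder topic; use tags that exist in your graph.
            result = await session.call_tool(
                "get_personal_notes", arguments={"topics": ["project-x"]}
            )
            print(result.content)


asyncio.run(main())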

Configuration

The following environment variables can be configured:

  • LOGSEQ_API_KEY: API key for authenticating with Logseq (default: "this-is-my-logseq-mcp-token")
  • LOGSEQ_URL: URL where the Logseq HTTP API is running (default: "http://localhost:12315")
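
As a minimal sketch of how these defaults behave (this is not the server's actual source, just an illustration of the documented behaviour):

import os

# Documented defaults from this section; the server's internal handling may differ.
api_key = os.environ.get("LOGSEQ_API_KEY", "this-is-my-logseq-mcp-token")
logseq_url = os.environ.get("LOGSEQ_URL", "http://localhost:12315")

print(f"Logseq HTTP API at {logseq_url}, authenticating with token {api_key!r}")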

Quickstart

Install

Claude Desktop and Cursor

On macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Published Servers Configuration
"mcpServers": {
  "mcp-pkm-logseq": {
    "command": "uvx",
    "args": [
      "mcp-pkm-logseq"
    ],
    "env": {
      "LOGSEQ_API_TOKEN": "your-logseq-api-token",
      "LOGSEQ_URL": "http://localhost:12315"
    }
  }
}

Claude Code

claude mcp add mcp-pkm-logseq uvx mcp-pkm-logseq

Start Logseq server

Logseq's HTTP API is an interface that runs within your desktop Logseq application. When enabled, it starts a local HTTP server (default port 12315) that allows programmatic access to your Logseq knowledge base. The API supports querying pages and blocks, searching content, and potentially modifying content through authenticated requests.

To enable the Logseq HTTP API server:

  1. Open Logseq and go to Settings (upper right corner)
  2. Navigate to Advanced
  3. Enable "Developer mode"
  4. Enable "HTTP API Server"
  5. Set your API token (this should match the LOGSEQ_API_KEY value in the MCP server configuration)

For more detailed instructions, see: https://logseq-copilot.eindex.me/doc/setup
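
Once the API server is enabled, it can help to sanity-check the endpoint and token before wiring up the MCP server. The sketch below uses only the Python standard library and assumes the usual Logseq HTTP API shape (a POST to /api with a JSON body naming a plugin API method); logseq.App.getCurrentGraph is just one read-only example method.

import json
import urllib.request

LOGSEQ_URL = "http://localhost:12315"
TOKEN = "your-logseq-api-token"  # the token you set in Logseq's settings

# Call a simple read-only method to confirm the server is up and the token works.
payload = json.dumps({"method": "logseq.App.getCurrentGraph", "args": []}).encode()
req = urllib.request.Request(
    f"{LOGSEQ_URL}/api",
    data=payload,
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode())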

Create MCP PKM Logseq Page

Create a page named "MCP PKM Logseq" in your Logseq graph to serve as the guide for AI assistants. Add the following content:

  • Description of your tagging system (e.g., which tags represent projects, areas, resources)
  • List of frequently used tags and what topics they cover
  • Common workflows you use to organize information
  • Naming conventions for pages and blocks
  • Instructions on how you prefer information to be retrieved
  • Examples of useful topic combinations for searching
  • Any context about your personal knowledge management approach

The content of this page is surfaced to the AI assistant whenever it decides it needs to understand you and how your knowledge base is organized.
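
For example, the page might look something like this (the tags, conventions, and workflows below are purely illustrative; describe your own instead):

- My tags follow PARA: #project/* for active projects, #area/* for ongoing responsibilities, #resource/* for reference material.
- Frequently used tags: #reading (books and articles), #meeting (meeting notes), #idea (unprocessed thoughts).
- TODOs live in daily journal entries; long-form notes live on project pages.
- Page names use Title Case; block-level tags are lowercase.
- When searching, prefer combining a project tag with a resource tag (e.g. #project/home-lab plus #resource/networking).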

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update lockfile:
uv sync
  2. Build package distributions:
uv build

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:
uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags:

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory /<parent-directories>/mcp-pkm-logseq run mcp-pkm-logseq

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.

Add Development Servers Configuration to Claude Desktop

"mcpServers": {
  "mcp-pkm-logseq": {
    "command": "uv",
    "args": [
      "--directory",
      "/<parent-directories>/mcp-pkm-logseq",
      "run",
      "mcp-pkm-logseq"
    ],
    "env": {
      "LOGSEQ_API_TOKEN": "your-logseq-api-token",
      "LOGSEQ_URL": "http://localhost:12315"
    }
  }
}

相关推荐

  • Joshua Armstrong
  • Confidential guide on numerology and astrology, based of GG33 Public information

  • https://suefel.com
  • Latest advice and best practices for custom GPT development.

  • Emmet Halm
  • Converts Figma frames into front-end code for various mobile frameworks.

  • Elijah Ng Shi Yi
  • Advanced software engineer GPT that excels through nailing the basics.

  • lumpenspace
  • Take an adjectivised noun, and create images making it progressively more adjective!

  • https://maiplestudio.com
  • Find Exhibitors, Speakers and more

  • tomoyoshi hirata
  • Sony α7IIIマニュアルアシスタント

  • Yusuf Emre Yeşilyurt
  • I find academic articles and books for research and literature reviews.

  • Carlos Ferrin
  • Encuentra películas y series en plataformas de streaming.

  • apappascs
  • Descubra la colección más completa y actualizada de servidores MCP en el mercado. Este repositorio sirve como un centro centralizado, que ofrece un extenso catálogo de servidores MCP de código abierto y propietarios, completos con características, enlaces de documentación y colaboradores.

  • ShrimpingIt
  • Manipulación basada en Micrypthon I2C del expansor GPIO de la serie MCP, derivada de AdaFruit_MCP230xx

  • jae-jae
  • Servidor MCP para obtener contenido de la página web con el navegador sin cabeza de dramaturgo.

  • ravitemer
  • Un poderoso complemento Neovim para administrar servidores MCP (protocolo de contexto del modelo)

  • patruff
  • Puente entre los servidores Ollama y MCP, lo que permite a LLM locales utilizar herramientas de protocolo de contexto del modelo

  • JackKuo666
  • 🔍 Habilitar asistentes de IA para buscar y acceder a la información del paquete PYPI a través de una interfaz MCP simple.

  • pontusab
  • La comunidad de cursor y windsurf, encontrar reglas y MCP

  • av
  • Ejecute sin esfuerzo LLM Backends, API, frontends y servicios con un solo comando.

  • Mintplex-Labs
  • La aplicación AI de escritorio todo en uno y Docker con trapo incorporado, agentes de IA, creador de agentes sin código, compatibilidad de MCP y más.

    Reviews

    2 (1)
    Avatar
    user_dJYDWd1R
    2025-04-17

    As a dedicated user of mcp, I've found mcp-pkm-logseq to be an indispensable tool for personal knowledge management. Developed by ruliana, it seamlessly integrates with Logseq, making information organization intuitive and efficient. Highly recommend it to anyone looking to streamline their PKM process! Check it out on GitHub.