
Chronulus AI

MCP Server for Chronulus

Chat with Chronulus AI Forecasting & Prediction Agents in Claude

Quickstart: Claude for Desktop

Install

Claude for Desktop is currently available on macOS and Windows.

Install Claude for Desktop here

Configuration

Follow the general instructions here to configure the Claude desktop client.

You can find your Claude config at one of the following locations:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
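
If the file does not exist yet, you can create an empty one at the expected location before editing it. A minimal macOS sketch (using the path above):

# Create the Claude config directory and an empty config file if they are missing.
mkdir -p "$HOME/Library/Application Support/Claude"
touch "$HOME/Library/Application Support/Claude/claude_desktop_config.json"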

Then choose whichever of the following methods best suits your needs and add the corresponding entry to your claude_desktop_config.json.

Using pip

(Option 1) Install release from PyPI

pip install chronulus-mcp

(Option 2) Install from GitHub

git clone https://github.com/ChronulusAI/chronulus-mcp.git
cd chronulus-mcp
pip install .
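
Either way, you can sanity-check the install by confirming the package is importable from the same interpreter your Claude config will launch (chronulus_mcp is the module name used in the config below):

# Should print the confirmation message if the install succeeded.
python -c "import chronulus_mcp; print('chronulus_mcp is importable')"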
Then add an entry for the server to your claude_desktop_config.json, pointing it at the installed module:

{
  "mcpServers": {
    "chronulus-agents": {
      "command": "python",
      "args": ["-m", "chronulus_mcp"],
      "env": {
        "CHRONULUS_API_KEY": "<YOUR_CHRONULUS_API_KEY>"
      }
    }
  }
}

Note: if you get an error like "MCP chronulus-agents: spawn python ENOENT", you most likely need to provide the absolute path to python, for example /Library/Frameworks/Python.framework/Versions/3.11/bin/python3 instead of just python.
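
One way to find that absolute path on macOS/Linux (on Windows, use where python) and then use it as the "command" value:

# Prints the full path of the interpreter on your PATH.
which python3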

Using docker

Here we build a Docker image called 'chronulus-mcp' that we can then reference in our Claude config.

git clone https://github.com/ChronulusAI/chronulus-mcp.git
cd chronulus-mcp
docker build . -t 'chronulus-mcp'
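
Before adding it to Claude, you can optionally confirm the image runs. Since MCP servers communicate over stdio, the container will simply wait for input; press Ctrl+C to stop it. Substitute your real API key:

# Smoke test that mirrors the args used in the Claude config below.
docker run -i --rm -e CHRONULUS_API_KEY="<YOUR_CHRONULUS_API_KEY>" chronulus-mcp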

In your Claude config, be sure that the final argument matches the name you gave the Docker image in the build command.

{
  "mcpServers": {
    "chronulus-agents": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "CHRONULUS_API_KEY", "chronulus-mcp"],
      "env": {
        "CHRONULUS_API_KEY": "<YOUR_CHRONULUS_API_KEY>"
      }
    }
  }
}

Using uvx

uvx will pull the latest version of chronulus-mcp from the PyPI registry, install it, and then run it.

{
  "mcpServers": {
    "chronulus-agents": {
      "command": "uvx",
      "args": ["chronulus-mcp"],
      "env": {
        "CHRONULUS_API_KEY": "<YOUR_CHRONULUS_API_KEY>"
      }
    }
  }
}

Note: if you get an error like "MCP chronulus-agents: spawn uvx ENOENT", you most likely need to either:

  1. install uv (see the commands below), or
  2. provide the absolute path to uvx, for example /Users/username/.local/bin/uvx instead of just uvx
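
For reference, one way to install uv is its standalone installer (check the uv documentation for the current instructions), after which you can locate uvx for the "command" field:

# Install uv (which provides uvx), then print the absolute path to uvx.
curl -LsSf https://astral.sh/uv/install.sh | sh
which uvx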

Additional Servers (Filesystem, Fetch, etc)

In our demo, we use third-party servers like fetch and filesystem.

For details on installing and configuring third-party servers, please refer to the documentation provided by the server maintainers.

Below is an example of how to configure filesystem and fetch alongside Chronulus in your claude_desktop_config.json:

{
  "mcpServers": {
    "chronulus-agents": {
      "command": "uvx",
      "args": ["chronulus-mcp"],
      "env": {
        "CHRONULUS_API_KEY": "<YOUR_CHRONULUS_API_KEY>"
      }
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/AIWorkspace"
      ]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
} 

Claude Preferences

To streamline your experience using Claude across multiple sets of tools, it is best to add your preferences under Claude Settings.

You can update your Claude preferences in a couple of ways:

  • From Claude Desktop: Settings -> General -> Claude Settings -> Profile (tab)
  • From claude.ai/settings: Profile (tab)

Preferences are shared across both Claude for Desktop and Claude.ai (the web interface), so your instructions need to work across both experiences.

Below are the preferences we used to achieve the results shown in our demos:

## Tools-Dependent Protocols
The following instructions apply only when tools/MCP Servers are accessible.

### Filesystem - Tool Instructions
- Do not use 'read_file' or 'read_multiple_files' on binary files (e.g., images, pdfs, docx).
- When working with binary files (e.g., images, pdfs, docx) use 'get_info' instead of 'read_*' tools to inspect a file.

### Chronulus Agents - Tool Instructions
- When using Chronulus, prefer to use input field types like TextFromFile, PdfFromFile, and ImageFromFile over scanning the files directly.
- When plotting forecasts from Chronulus, always include the Chronulus-provided forecast explanation below the plot and label it as Chronulus Explanation.
