2025-02-26

An MCP server for WolframAlpha's LLM API that returns structured knowledge and solves math problems.

GitHub: 2 watches · 8 forks · 27 stars

WolframAlpha LLM MCP Server

Logo: assets/wolfram-llm-logo.png

A Model Context Protocol (MCP) server that provides access to WolframAlpha's LLM API (https://products.wolframalpha.com/llm-api/documentation).

Screenshot: assets/readme-screen-1.png

Screenshot: assets/readme-screen-2.png

Features

  • Query WolframAlpha's LLM API with natural language questions
  • Answer complicated mathematical questions
  • Query facts about science, physics, history, geography, and more
  • Get structured responses optimized for LLM consumption
  • Support for simplified answers and detailed responses with sections

Available Tools

  • ask_llm: Ask WolframAlpha a question and get a structured, LLM-friendly response (see the sketch below)
  • get_simple_answer: Get a simplified answer
  • validate_key: Validate the WolframAlpha API key
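
For illustration, here is a minimal sketch of driving these tools from a standalone MCP client built with the TypeScript SDK. The server path, the client name, and the argument name query are assumptions for the example; check the server's tool schema (e.g. via a tools/list request) for the exact parameter names.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the built server over stdio; the path and app ID mirror the config shown later.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/wolframalpha-mcp-server/build/index.js"],
  env: { WOLFRAM_LLM_APP_ID: "your-api-key-here" },
});

const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// "query" is an assumed argument name; confirm it against the server's tool schema.
const result = await client.callTool({
  name: "ask_llm",
  arguments: { query: "integrate x^2 sin(x) dx" },
});
console.log(result.content);

await client.close();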

Installation

git clone https://github.com/Garoth/wolframalpha-llm-mcp.git
cd wolframalpha-llm-mcp
npm install

Configuration

  1. Get your WolframAlpha API key from developer.wolframalpha.com

  2. Add it to your Cline MCP settings file inside VS Code's settings (e.g. ~/.config/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json):

{
  "mcpServers": {
    "wolframalpha": {
      "command": "node",
      "args": ["/path/to/wolframalpha-mcp-server/build/index.js"],
      "env": {
        "WOLFRAM_LLM_APP_ID": "your-api-key-here"
      },
      "disabled": false,
      "autoApprove": [
        "ask_llm",
        "get_simple_answer",
        "validate_key"
      ]
    }
  }
}
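
Before wiring the server into Cline, you can sanity-check the app ID by calling the LLM API endpoint directly, a check similar in spirit to what the validate_key tool provides. A minimal sketch for Node 18+ (built-in fetch), with the endpoint and parameters taken from the LLM API documentation linked above:

// Query the WolframAlpha LLM API directly to confirm the app ID is accepted.
const appId = process.env.WOLFRAM_LLM_APP_ID ?? "your-api-key-here";
const url = new URL("https://www.wolframalpha.com/api/v1/llm-api");
url.searchParams.set("input", "What is the speed of light?");
url.searchParams.set("appid", appId);

const res = await fetch(url);
if (!res.ok) {
  // Missing or invalid app IDs typically come back as an error status.
  throw new Error(`WolframAlpha returned ${res.status}: ${await res.text()}`);
}
console.log(await res.text()); // plain-text, LLM-oriented answer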

Development

Setting Up Tests

The tests make real API calls to ensure responses stay accurate; a minimal example in this style is sketched after the steps below. To run the tests:

  1. Copy the example environment file:

    cp .env.example .env
    
  2. Edit .env and add your WolframAlpha API key:

    WOLFRAM_LLM_APP_ID=your-api-key-here
    

    Note: The .env file is gitignored to prevent committing sensitive information.

  3. Run the tests:

    npm test
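
For reference only (the repository's actual test files may be organized differently), a live-API test in this style might look like the following, assuming a Jest-style runner and dotenv for loading the key:

import "dotenv/config"; // loads WOLFRAM_LLM_APP_ID from .env

describe("WolframAlpha LLM API (live)", () => {
  it("answers a basic math question", async () => {
    const appId = process.env.WOLFRAM_LLM_APP_ID;
    expect(appId).toBeDefined();

    // Real API call, no mocks; mirrors how the suite exercises live responses.
    const url = new URL("https://www.wolframalpha.com/api/v1/llm-api");
    url.searchParams.set("input", "2 + 2");
    url.searchParams.set("appid", appId!);

    const res = await fetch(url);
    expect(res.ok).toBe(true);
    expect(await res.text()).toContain("4");
  });
});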
    

Building

npm run build

License

MIT

