
# Allora MCP Server

This is a Model Context Protocol (MCP) server for fetching machine learning inferences from the Allora Network, providing access to Allora's prediction market data through the standardized MCP interface.
## Overview

The Allora MCP server allows AI systems and applications to access Allora prediction market data through the standardized Model Context Protocol (MCP), enabling seamless integration of prediction market data into AI workflows. The server provides direct access to Allora topics, market predictions, and inference data.
## Prerequisites

- Node.js 18 or higher
- An Allora API key (sign up at alloralabs.com)
## Setup

1. Clone the repository:

```bash
git clone https://github.com/your-username/allora-mcp.git
cd allora-mcp
```

2. Install dependencies:

```bash
npm install
```

3. Set up environment variables. Create a `.env` file in the project root (or copy from `.env.example`):

```
PORT=3001
ALLORA_API_KEY=your_api_key
```
## Development

```bash
npm run dev
```

## Building

```bash
npm run build
```

## Running

```bash
npm start
```
## Docker

### Building the Docker Image

```bash
docker build -t allora-mcp .
```

### Running with Docker

```bash
# Run the container
docker run -p 3001:3001 -e PORT=3001 -e ALLORA_API_KEY=your_api_key allora-mcp

# Or with environment variables in a file
docker run -p 3001:3001 --env-file .env allora-mcp
```
### Docker Compose (optional)

Create a `docker-compose.yml` file:

```yaml
version: '3'
services:
  allora-mcp:
    build: .
    ports:
      - "3001:3001"
    environment:
      - PORT=3001
      - ALLORA_API_KEY=your_api_key
```

Then run:

```bash
docker-compose up
```
## API Usage

Once the server is running, you can interact with it using any MCP client. The server exposes the following endpoints:

- `GET /sse` - SSE connection endpoint for MCP communications
- `POST /messages` - Message endpoint for MCP communications
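Under the hood, MCP messages are JSON-RPC 2.0: the client keeps the `GET /sse` stream open to receive responses and POSTs requests to `/messages`. As a rough sketch (method names follow the MCP specification; session negotiation over the SSE stream is omitted, and in practice you would use an MCP client library), the request body a client sends to discover this server's tools looks like:

```typescript
// Sketch of the JSON-RPC 2.0 request body an MCP client POSTs to /messages
// to list the server's tools ("tools/list" per the MCP spec). Transport and
// session handling are intentionally omitted.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

function buildToolsListRequest(id: number): JsonRpcRequest {
  return { jsonrpc: "2.0", id, method: "tools/list" };
}

const req = buildToolsListRequest(1);
console.log(JSON.stringify(req));
```

The server's reply (delivered over the SSE stream) carries the tool descriptions listed in the table below.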
## Available Tools

| Tool Name | Description | Parameters |
|---|---|---|
| `list_all_topics` | Fetch a list of all Allora topics | None |
| `get_inference_by_topic_id` | Fetch inference data for a specific topic | `topicID`: number |
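To invoke one of these tools, a client sends a `tools/call` request naming the tool and its arguments. A minimal sketch of the payload for `get_inference_by_topic_id` (the `topicID` value `1` is a placeholder, not a known Allora topic):

```typescript
// Sketch: JSON-RPC payload for calling the get_inference_by_topic_id tool.
// The topicID value (1) is a placeholder; substitute a real Allora topic ID,
// e.g. one returned by list_all_topics.
const callRequest = {
  jsonrpc: "2.0" as const,
  id: 2,
  method: "tools/call",
  params: {
    name: "get_inference_by_topic_id",
    arguments: { topicID: 1 },
  },
};

console.log(JSON.stringify(callRequest));
```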
## Example Usage with Claude

When connected to Claude or another MCP-compatible AI system, you can query Allora data with prompts such as:

> What topics are available in Allora?

Or request specific inference data:

> What is the current prediction for BTC price in 8 hours?
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License

This project is licensed under the MIT License - see the LICENSE file for details.