# FastAPI Hello World Application

A simple Hello World API built with FastAPI and MCP SSE support. (Originally a test repository, "MCP-FastAPI Learning", created with the GitHub MCP server.)
## Features

- Root endpoint that returns a Hello World message
- Dynamic greeting endpoint that takes a `name` parameter
- OpenAI integration with GPT-4o for AI-powered chat completions
- Automatic API documentation with Swagger UI
## Prerequisites

- Python 3.7+ (for local setup)
- pip (Python package installer)
- OpenAI API key (for the `/openai` endpoint)
- Docker (optional, for containerized setup)
## Setup Instructions

You can run this application either locally or using Docker.
### Local Setup

1. Clone the repository:

   ```bash
   git clone https://github.com/xxradar/mcp-test-repo.git
   cd mcp-test-repo
   ```

2. Create a virtual environment (optional but recommended):

   ```bash
   # On macOS/Linux
   python -m venv venv
   source venv/bin/activate

   # On Windows
   python -m venv venv
   venv\Scripts\activate
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Run the application:

   ```bash
   uvicorn main:app --reload
   ```

The application will start and be available at http://127.0.0.1:8000.

Alternatively, you can run the application directly with Python:

```bash
python main.py
```
### Docker Setup

1. Clone the repository:

   ```bash
   git clone https://github.com/xxradar/mcp-test-repo.git
   cd mcp-test-repo
   ```

2. Build the Docker image:

   ```bash
   docker build -t fastapi-hello-world .
   ```

3. Run the Docker container:

   ```bash
   docker run -p 8000:8000 fastapi-hello-world
   ```

The application will be available at http://localhost:8000.
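The repository's Dockerfile is not reproduced in this README. A typical Dockerfile for this kind of app looks like the sketch below (a sketch, assuming a `requirements.txt` at the repo root; the repo's actual file may differ):

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8000
# Bind to 0.0.0.0 so the -p 8000:8000 port mapping is reachable from the host
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```

Binding to `0.0.0.0` rather than the default `127.0.0.1` is the detail that matters inside a container: otherwise the server only listens on the container's loopback interface.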
## API Endpoints

- `GET /`: Returns a simple Hello World message
- `GET /hello/{name}`: Returns a personalized greeting with the provided name
- `GET /openai`: Returns a response from OpenAI's GPT-4o model (accepts an optional `prompt` query parameter)
- `GET /docs`: Swagger UI documentation
- `GET /redoc`: ReDoc documentation
## OpenAI Integration

The `/openai` endpoint uses OpenAI's GPT-4o model and requires an OpenAI API key to be set as an environment variable:

### Local Setup

```bash
# Set the OpenAI API key as an environment variable
export OPENAI_API_KEY=your_api_key_here

# Run the application
uvicorn main:app --reload
```

### Docker Setup

```bash
# Run the Docker container with the OpenAI API key
docker run -p 8000:8000 -e OPENAI_API_KEY=your_api_key_here fastapi-hello-world
```
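The repository most likely calls OpenAI through the official Python client; the sketch below instead shows the underlying REST call with only the standard library, so the moving parts (the environment variable, the request body, the model name) are visible. The helper names are illustrative, not the repo's actual code:

```python
import json
import os
import urllib.request

OPENAI_URL = "https://api.openai.com/v1/chat/completions"


def build_payload(prompt: str, model: str = "gpt-4o") -> dict:
    # Chat-completions request body: a single user message built from the
    # optional ?prompt= query parameter.
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}


def ask_openai(prompt: str = "Say hello!") -> str:
    # Reads the key from the environment, exactly as described above;
    # a missing OPENAI_API_KEY raises KeyError before any network call.
    api_key = os.environ["OPENAI_API_KEY"]
    req = urllib.request.Request(
        OPENAI_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The assistant's reply lives in the first choice's message content
    return body["choices"][0]["message"]["content"]
```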
## Example Usage

### Using curl

```bash
# Get Hello World message
curl http://127.0.0.1:8000/

# Get personalized greeting
curl http://127.0.0.1:8000/hello/John

# Get OpenAI chat completion with the default prompt
curl http://127.0.0.1:8000/openai

# Get OpenAI chat completion with a custom prompt
curl "http://127.0.0.1:8000/openai?prompt=Tell%20me%20a%20joke%20about%20programming"
```
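The same calls can be made from Python with only the standard library. One sketch, assuming the server is running locally; note that `urlencode` escapes the prompt automatically (spaces become `+`, which the server decodes the same way as the `%20` in the curl example):

```python
import json
import urllib.parse
import urllib.request

BASE = "http://127.0.0.1:8000"


def openai_url(prompt: str) -> str:
    # urlencode handles the escaping the last curl example does by hand
    return f"{BASE}/openai?" + urllib.parse.urlencode({"prompt": prompt})


def get_json(url: str) -> dict:
    # GET the URL and decode the JSON response body
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(get_json(f"{BASE}/"))            # Hello World message
    print(get_json(f"{BASE}/hello/John"))  # personalized greeting
    print(get_json(openai_url("Tell me a joke about programming")))
```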
### Using MCP

Connect to the MCP Inspector:

```bash
npx @modelcontextprotocol/inspector
```
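MCP over SSE means the inspector consumes a `text/event-stream` endpoint served by the app. This README does not state the stream's mount path, so none is assumed below; the sketch only shows the standard SSE framing the inspector relies on (lines of `field: value`, events separated by a blank line):

```python
def parse_sse(lines):
    # Minimal Server-Sent Events parser: accumulates "field: value" lines
    # and yields one dict per event; a blank line terminates an event.
    event = {}
    for line in lines:
        line = line.rstrip("\n")
        if not line:
            if event:
                yield event
                event = {}
            continue
        if line.startswith(":"):  # comment / keep-alive line, ignored
            continue
        field, _, value = line.partition(":")
        event[field] = value.lstrip(" ")
    if event:
        yield event


# Example framing as it would arrive over the wire:
stream = ["event: message\n", "data: {\"hello\": \"world\"}\n", "\n"]
events = list(parse_sse(stream))
```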
### Using a web browser

- Open http://127.0.0.1:8000/ for the Hello World message
- Open http://127.0.0.1:8000/hello/John for a personalized greeting
- Open http://127.0.0.1:8000/openai for a response from OpenAI with the default prompt
- Open http://127.0.0.1:8000/openai?prompt=What%20is%20FastAPI? for a response about FastAPI
- Open http://127.0.0.1:8000/docs for the Swagger UI documentation
## Development

To make changes to the application, edit the `main.py` file. The server will automatically reload if you run it with the `--reload` flag.