fal-api-mcp-server
A Model Context Protocol (MCP) server that provides image generation capabilities using fal.ai's FLUX.1 Pro model.
Components
Resources
This server does not provide any persistent resources as fal.ai is primarily a stateless model execution service.
Tools
The server implements one tool:
- generate_image: Generates images based on text prompts using fal.ai FLUX.1 Pro (see the example call after this list)
  - Required parameters:
    - prompt: The text prompt to generate the image from
  - Optional parameters:
    - image_size: The desired image size (default: "landscape_4_3"). Options: "square_hd", "square", "portrait_4_3", "portrait_16_9", "landscape_4_3", "landscape_16_9"
    - num_images: The number of images to generate (default: 1)
    - enable_safety_checker: Enable the safety checker (default: true)
    - safety_tolerance: Safety tolerance level, 1-6, where higher is more permissive (default: "2")
    - output_format: Output image format, "jpeg" or "png" (default: "jpeg")
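For illustration, a handler for this tool could forward these parameters to fal.ai through the official fal_client Python package. The sketch below is an assumption-heavy example: the "fal-ai/flux-pro" endpoint id, the function name, and the response handling are not taken from this server's source.

import fal_client  # official fal.ai Python client; reads FAL_KEY from the environment

def generate_image(prompt: str,
                   image_size: str = "landscape_4_3",
                   num_images: int = 1,
                   enable_safety_checker: bool = True,
                   safety_tolerance: str = "2",
                   output_format: str = "jpeg") -> list[str]:
    """Submit a FLUX.1 Pro request and block until the images are ready."""
    result = fal_client.subscribe(
        "fal-ai/flux-pro",  # assumed endpoint id for FLUX.1 Pro
        arguments={
            "prompt": prompt,
            "image_size": image_size,
            "num_images": num_images,
            "enable_safety_checker": enable_safety_checker,
            "safety_tolerance": safety_tolerance,
            "output_format": output_format,
        },
    )
    # The response typically contains a list of generated images with URLs.
    return [image["url"] for image in result.get("images", [])]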
Configuration
This server requires a fal.ai API key to function properly. You can obtain an API key by signing up at fal.ai.
The API key should be provided as an environment variable:
FAL_KEY=your_fal_ai_api_key
You can set this environment variable in your shell, or create a .env
file in the same directory as the server with the above content.
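A minimal sketch of how the key could be loaded at startup, assuming the python-dotenv package and a plain os.environ lookup (not necessarily how this server does it):

import os
from dotenv import load_dotenv  # python-dotenv, assumed here for .env support

load_dotenv()  # pick up FAL_KEY from a local .env file, if one exists
fal_key = os.environ.get("FAL_KEY")
if not fal_key:
    raise RuntimeError("FAL_KEY is not set; export it or add it to a .env file")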
Demo
https://github.com/user-attachments/assets/564a0fc3-9204-4399-b1ea-ab6a5c9f2d84
Quickstart
Install
Claude Desktop
On MacOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
Development/Unpublished Servers Configuration
"mcpServers": {
"fal-api-mcp-server": {
"command": "uv",
"args": [
"--directory",
"/path/to/fal-api-mcp-server",
"run",
"fal-api-mcp-server"
],
"env": {
"FAL_KEY": "your_fal_ai_api_key"
}
}
}
Published Servers Configuration
"mcpServers": {
"fal-api-mcp-server": {
"command": "uvx",
"args": [
"fal-api-mcp-server"
],
"env": {
"FAL_KEY": "your_fal_ai_api_key"
}
}
}
Usage
Once the server is configured and running, you can use it with Claude to generate images. Example prompts:
- "Generate an image of a mountain landscape at sunset"
- "Create a portrait of a cyberpunk character with neon lights"
- "Show me a futuristic cityscape with flying cars"
Claude will use the fal.ai FLUX.1 Pro model to generate the requested images.
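For readers curious how such a tool can be exposed over MCP, here is a hypothetical sketch using the MCP Python SDK's FastMCP helper together with the fal_client package. The tool name matches the one documented above, but the structure, endpoint id, and return value are assumptions rather than this server's actual code:

from mcp.server.fastmcp import FastMCP  # MCP Python SDK convenience server
import fal_client  # official fal.ai client; reads FAL_KEY from the environment

mcp = FastMCP("fal-api-mcp-server")

@mcp.tool()
def generate_image(prompt: str, image_size: str = "landscape_4_3") -> str:
    """Generate an image with fal.ai FLUX.1 Pro and return the first image URL."""
    result = fal_client.subscribe(
        "fal-ai/flux-pro",  # assumed endpoint id for FLUX.1 Pro
        arguments={"prompt": prompt, "image_size": image_size},
    )
    return result["images"][0]["url"]

if __name__ == "__main__":
    mcp.run()  # stdio transport, which is how Claude Desktop launches MCP servers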
Development
Building and Publishing
To prepare the package for distribution:
- Sync dependencies and update lockfile:
uv sync
- Build package distributions:
uv build
This will create source and wheel distributions in the dist/
directory.
- Publish to PyPI:
uv publish
Note: You'll need to set PyPI credentials via environment variables or command flags:
- Token: --token or UV_PUBLISH_TOKEN
- Or username/password: --username / UV_PUBLISH_USERNAME and --password / UV_PUBLISH_PASSWORD
Debugging
Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.
You can launch the MCP Inspector via npm
with this command:
npx @modelcontextprotocol/inspector uv --directory /path/to/fal-api-mcp-server run fal-api-mcp-server
Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.