
YouTube to LinkedIn MCP Server
A Model Context Protocol (MCP) server that automates the generation of LinkedIn post drafts from YouTube videos. It produces high-quality, editable content drafts based on YouTube video transcripts.
Features
- YouTube Transcript Extraction: Extract transcripts from YouTube videos using video URLs
- Transcript Summarization: Generate concise summaries of video content using OpenAI GPT
- LinkedIn Post Generation: Create professional LinkedIn post drafts with customizable tone and style
- Modular API Design: Clean FastAPI implementation with well-defined endpoints
- Containerized Deployment: Ready for deployment on Smithery
Setup Instructions
Prerequisites
- Python 3.8+
- Docker (for containerized deployment)
- OpenAI API Key
- YouTube Data API Key (optional, but recommended for better metadata)
Local Development
1. Clone the repository:

    git clone <repository-url>
    cd yt-to-linkedin

2. Create a virtual environment and install dependencies:

    python -m venv venv
    source venv/bin/activate   # On Windows: venv\Scripts\activate
    pip install -r requirements.txt

3. Create a .env file in the project root with your API keys:

    OPENAI_API_KEY=your_openai_api_key
    YOUTUBE_API_KEY=your_youtube_api_key

4. Run the application:

    uvicorn app.main:app --reload

5. Access the API documentation at http://localhost:8000/docs (a quick smoke test is sketched below).
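Once the server is up, a minimal smoke test can confirm it is reachable. This snippet is illustrative and not part of the repository; it assumes the default uvicorn host and port from step 4:

    import requests

    # Assumption: the server is running locally on the default port from step 4.
    resp = requests.get("http://localhost:8000/docs", timeout=10)
    resp.raise_for_status()
    print("API docs reachable, status:", resp.status_code)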
Docker Deployment
1. Build the Docker image:

    docker build -t yt-to-linkedin-mcp .

2. Run the container:

    docker run -p 8000:8000 --env-file .env yt-to-linkedin-mcp
Smithery Deployment
1. Ensure you have the Smithery CLI installed and configured.

2. Deploy to Smithery:

    smithery deploy
API Endpoints
1. Transcript Extraction
Endpoint: /api/v1/transcript
Method: POST
Description: Extract the transcript from a YouTube video
Request Body:
{
"youtube_url": "https://www.youtube.com/watch?v=VIDEO_ID",
"language": "en",
"youtube_api_key": "your_youtube_api_key" // Optional, provide your own YouTube API key
}
Response:
{
"video_id": "VIDEO_ID",
"video_title": "Video Title",
"transcript": "Full transcript text...",
"language": "en",
"duration_seconds": 600,
"channel_name": "Channel Name",
"error": null
}
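As a usage sketch (not part of the repository), this endpoint can be called from Python with the requests library. Field names follow the request and response bodies shown above; the base URL assumes the local setup described earlier:

    import requests

    # Request body mirrors the documented /api/v1/transcript schema.
    payload = {
        "youtube_url": "https://www.youtube.com/watch?v=VIDEO_ID",
        "language": "en",
        # "youtube_api_key": "your_youtube_api_key",  # optional, per the docs above
    }

    resp = requests.post("http://localhost:8000/api/v1/transcript", json=payload, timeout=60)
    resp.raise_for_status()
    data = resp.json()
    print(data["video_title"])
    print(data["transcript"][:200])  # preview the first 200 characters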
2. Transcript Summarization
Endpoint: /api/v1/summarize
Method: POST
Description: Generate a summary from a video transcript
Request Body:
{
"transcript": "Video transcript text...",
"video_title": "Video Title",
"tone": "professional",
"audience": "general",
"max_length": 250,
"min_length": 150,
"openai_api_key": "your_openai_api_key" // Optional, provide your own OpenAI API key
}
Response:
{
"summary": "Generated summary text...",
"word_count": 200,
"key_points": [
"Key point 1",
"Key point 2",
"Key point 3"
]
}
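A corresponding sketch for summarization, reusing the documented field names; in practice the transcript value would come from the previous endpoint:

    import requests

    payload = {
        "transcript": "Video transcript text...",  # typically the output of /api/v1/transcript
        "video_title": "Video Title",
        "tone": "professional",
        "audience": "general",
        "max_length": 250,
        "min_length": 150,
    }

    resp = requests.post("http://localhost:8000/api/v1/summarize", json=payload, timeout=120)
    resp.raise_for_status()
    result = resp.json()
    print(result["summary"])
    print(result["key_points"])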
3. LinkedIn Post Generation
Endpoint: /api/v1/generate-post
Method: POST
Description: Generate a LinkedIn post from a video summary
Request Body:
{
"summary": "Video summary text...",
"video_title": "Video Title",
"video_url": "https://www.youtube.com/watch?v=VIDEO_ID",
"speaker_name": "Speaker Name",
"hashtags": ["ai", "machinelearning"],
"tone": "professional",
"voice": "first_person",
"audience": "technical",
"include_call_to_action": true,
"max_length": 1200,
"openai_api_key": "your_openai_api_key" // Optional, provide your own OpenAI API key
}
Response:
{
"post_content": "Generated LinkedIn post content...",
"character_count": 800,
"estimated_read_time": "About 1 minute",
"hashtags_used": ["#ai", "#machinelearning"]
}
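A similar sketch for post generation, again assuming a local server and the request schema above:

    import requests

    payload = {
        "summary": "Video summary text...",  # typically the output of /api/v1/summarize
        "video_title": "Video Title",
        "video_url": "https://www.youtube.com/watch?v=VIDEO_ID",
        "hashtags": ["ai", "machinelearning"],
        "tone": "professional",
        "voice": "first_person",
        "audience": "technical",
        "include_call_to_action": True,
        "max_length": 1200,
    }

    resp = requests.post("http://localhost:8000/api/v1/generate-post", json=payload, timeout=120)
    resp.raise_for_status()
    post = resp.json()
    print(post["post_content"])
    print(post["character_count"], post["hashtags_used"])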
4. Output Formatting
Endpoint: /api/v1/output
Method: POST
Description: Format the LinkedIn post for output
Request Body:
{
"post_content": "LinkedIn post content...",
"format": "json"
}
Response:
{
"content": {
"post_content": "LinkedIn post content...",
"character_count": 800
},
"format": "json"
}
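Taken together, the intended flow is transcript extraction, then summarization, then post generation, then output formatting. The following end-to-end sketch chains the four endpoints under the same assumptions as the examples above (local server, current schemas); it is illustrative rather than a packaged client:

    import requests

    BASE = "http://localhost:8000/api/v1"

    def draft_linkedin_post(youtube_url: str) -> dict:
        """Chain the four documented endpoints into one draft-generation run."""
        transcript = requests.post(
            f"{BASE}/transcript",
            json={"youtube_url": youtube_url, "language": "en"},
            timeout=60,
        ).json()

        summary = requests.post(
            f"{BASE}/summarize",
            json={
                "transcript": transcript["transcript"],
                "video_title": transcript["video_title"],
                "tone": "professional",
                "audience": "general",
                "max_length": 250,
                "min_length": 150,
            },
            timeout=120,
        ).json()

        post = requests.post(
            f"{BASE}/generate-post",
            json={
                "summary": summary["summary"],
                "video_title": transcript["video_title"],
                "video_url": youtube_url,
                "hashtags": ["ai", "machinelearning"],
                "tone": "professional",
                "voice": "first_person",
                "audience": "technical",
                "include_call_to_action": True,
                "max_length": 1200,
            },
            timeout=120,
        ).json()

        return requests.post(
            f"{BASE}/output",
            json={"post_content": post["post_content"], "format": "json"},
            timeout=30,
        ).json()

    if __name__ == "__main__":
        print(draft_linkedin_post("https://www.youtube.com/watch?v=VIDEO_ID"))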
Environment Variables
| Variable | Description | Required |
|---|---|---|
| OPENAI_API_KEY | OpenAI API key for summarization and post generation | No (can be provided in requests) |
| YOUTUBE_API_KEY | YouTube Data API key for fetching video metadata | No (can be provided in requests) |
| PORT | Port to run the server on (default: 8000) | No |
Note: While environment variables for API keys are optional (as they can be provided in each request), it's recommended to set them for local development and testing. When deploying to Smithery, users will need to provide their own API keys in the requests.
License
MIT