LinkedIn Post Generator
A Model Context Protocol (MCP) server that automates generating professional LinkedIn post drafts from YouTube videos. This tool streamlines content repurposing by extracting transcripts from YouTube videos, summarizing the content, and generating engaging LinkedIn posts tailored to your preferences.
Table of Contents
- Features
- Installation
- Configuration
- Usage
- Deployment
- Contributing
- License
Features
- YouTube Transcript Extraction: Automatically extract transcripts from any YouTube video
- Content Summarization: Generate concise summaries with customizable tone and target audience
- LinkedIn Post Generation: Create professional LinkedIn posts with customizable style and tone
- All-in-One Workflow: Go from YouTube URL to LinkedIn post in a single operation
- Customization Options: Adjust tone, audience, word count, and more to match your personal brand
- MCP Integration: Works seamlessly with AI assistants that support the Model Context Protocol
Installation
Local Development
1. Clone the repository:
   git clone https://github.com/NvkAnirudh/LinkedIn-Post-Generator.git
   cd LinkedIn-Post-Generator
2. Install dependencies:
   npm install
3. Create a .env file based on the example:
   cp .env.example .env
4. Add your API keys to the .env file:
   OPENAI_API_KEY=your_openai_api_key
   YOUTUBE_API_KEY=your_youtube_api_key
5. Run the server:
   npm run dev
6. Test with MCP Inspector:
   npm run inspect
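To sanity-check the server outside the Inspector UI, note that any MCP client speaks JSON-RPC over stdio. The sketch below shows what a standard MCP tools/list exchange might look like after the usual initialize handshake; the method name comes from the Model Context Protocol spec, and the tool names shown are the ones documented under Usage below (a real response also includes descriptions and input schemas).

```
// tools/list request sent to the server's stdin (after the MCP initialize handshake)
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

// Expected response, abridged to the tool names this server documents
{"jsonrpc": "2.0", "id": 1, "result": {"tools": [
  {"name": "set_api_keys"},
  {"name": "check_api_keys"},
  {"name": "extract_transcript"},
  {"name": "summarize_transcript"},
  {"name": "generate_linkedin_post"},
  {"name": "youtube_to_linkedin_post"}
]}}
```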
Using with Claude Desktop
This MCP server is designed to work with Claude Desktop and other AI assistants that support the Model Context Protocol. To use it with Claude Desktop:
1. Configure Claude Desktop by editing the configuration file at ~/Library/Application Support/Claude/claude_desktop_config.json (Mac) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

   {
     "mcpServers": {
       "linkedin-post-generator": {
         "command": "npx",
         "args": [
           "-y",
           "@smithery/cli@latest",
           "run",
           "@NvkAnirudh/linkedin-post-generator",
           "--key",
           "YOUR_SMITHERY_API_KEY",
           "--config",
           "{\"OPENAI_API_KEY\":\"YOUR_OPENAI_API_KEY\",\"YOUTUBE_API_KEY\":\"YOUR_YOUTUBE_API_KEY\"}",
           "--transport",
           "stdio"
         ]
       }
     }
   }

   Replace:
   - YOUR_SMITHERY_API_KEY with your Smithery API key
   - YOUR_OPENAI_API_KEY with your OpenAI API key
   - YOUR_YOUTUBE_API_KEY with your YouTube API key (optional)

2. Restart Claude Desktop
3. In Claude Desktop, you can now access the LinkedIn Post Generator tools without needing to set API keys again
Configuration
The application requires API keys to function properly:
- OpenAI API Key (required): Used for content summarization and post generation
- YouTube API Key (optional): Enhances YouTube metadata retrieval
You can provide these keys in three ways:
1. Via Claude Desktop Configuration (Recommended)
When using with Claude Desktop and Smithery, the best approach is to include your API keys in the Claude Desktop configuration file as shown in the Using with Claude Desktop section. This way, the keys are automatically passed to the MCP server, and you don't need to set them again.
2. As Environment Variables
When running locally, you can set API keys as environment variables in a .env file:
   OPENAI_API_KEY=your_openai_api_key
   YOUTUBE_API_KEY=your_youtube_api_key
3. Using the Set API Keys Tool
If you haven't provided API keys through the configuration or environment variables, you can set them directly through the MCP interface using the set_api_keys tool.
Usage
Available Tools
Set API Keys
- Tool: set_api_keys
- Purpose: Configure your API keys
- Parameters:
  - openaiApiKey: Your OpenAI API key (required)
  - youtubeApiKey: Your YouTube API key (optional)
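For illustration, calling this tool over the MCP wire uses the protocol's standard tools/call method. A request might look like the sketch below; the keys are placeholders and the argument names are the ones listed above.

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "set_api_keys",
    "arguments": {
      "openaiApiKey": "your_openai_api_key",
      "youtubeApiKey": "your_youtube_api_key"
    }
  }
}
```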
Check API Keys
- Tool: check_api_keys
- Purpose: Verify your API key configuration status
Extract Transcript
- Tool: extract_transcript
- Purpose: Get the transcript from a YouTube video
- Parameters:
  - youtubeUrl: URL of the YouTube video
Summarize Transcript
- Tool: summarize_transcript
- Purpose: Create a concise summary of the video content
- Parameters:
  - transcript: The video transcript text
  - tone: Educational, inspirational, professional, or conversational
  - audience: General, technical, business, or academic
  - wordCount: Approximate word count for the summary (100-300)
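As an example, an arguments object for this tool could look like the sketch below, using values from the options listed above (the transcript text is an illustrative placeholder):

```json
{
  "transcript": "In this talk we walk through three ways to repurpose long-form video into written content...",
  "tone": "professional",
  "audience": "business",
  "wordCount": 150
}
```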
Generate LinkedIn Post
- Tool: generate_linkedin_post
- Purpose: Create a LinkedIn post from a summary
- Parameters:
  - summary: Summary of the video content
  - videoTitle: Title of the YouTube video
  - speakerName: Name of the speaker (optional)
  - hashtags: Relevant hashtags (optional)
  - tone: First-person, third-person, or thought-leader
  - includeCallToAction: Whether to include a call to action
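A hypothetical arguments object for this tool is sketched below. The summary, title, speaker, and hashtags are placeholders, and the exact accepted shapes (for example, whether hashtags is an array or a string, and the casing of the tone values) should be confirmed against the tool's input schema via tools/list.

```json
{
  "summary": "Three practical ways to repurpose long-form video into written content, with examples.",
  "videoTitle": "Repurposing Video Content at Scale",
  "speakerName": "Jane Doe",
  "hashtags": ["contentmarketing", "videomarketing"],
  "tone": "first-person",
  "includeCallToAction": true
}
```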
All-in-One: YouTube to LinkedIn Post
- Tool: youtube_to_linkedin_post
- Purpose: Complete workflow from YouTube URL to LinkedIn post
- Parameters:
  - youtubeUrl: YouTube video URL
  - tone: Desired tone for the post
  - Plus additional customization options
Workflow Example
1. Set your API keys using the set_api_keys tool
2. Use the youtube_to_linkedin_post tool with a YouTube URL (see the client sketch below)
3. Receive a complete LinkedIn post draft ready to publish
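To make the workflow concrete, here is a minimal client-side sketch in TypeScript using the official MCP SDK. It assumes the server can be launched locally with npm run dev (adjust the command to however you start it) and that your API keys are already available via .env; the video URL is a placeholder.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Assumption: `npm run dev` starts this MCP server on stdio; adjust to your setup.
  const transport = new StdioClientTransport({ command: "npm", args: ["run", "dev"] });

  const client = new Client(
    { name: "linkedin-post-example-client", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // One call covers the whole workflow: transcript -> summary -> LinkedIn post draft.
  const result = await client.callTool({
    name: "youtube_to_linkedin_post",
    arguments: {
      youtubeUrl: "https://www.youtube.com/watch?v=VIDEO_ID", // placeholder
      tone: "professional",
    },
  });

  // Tool results arrive as MCP content blocks; print the text blocks (the post draft).
  const blocks = (result as { content?: Array<{ type: string; text?: string }> }).content ?? [];
  for (const block of blocks) {
    if (block.type === "text") console.log(block.text);
  }

  await client.close();
}

main().catch(console.error);
```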
Deployment
This server is deployed on Smithery, a platform for hosting and sharing MCP servers. The deployment configuration is defined in the smithery.yaml file.
To deploy your own instance:
1. Create an account on Smithery
2. Install the Smithery CLI:
   npm install -g @smithery/cli
3. Deploy the server:
   smithery deploy
Contributing
Contributions are welcome and appreciated! Here's how you can contribute to the LinkedIn Post Generator:
Reporting Issues
- Use the GitHub issue tracker to report bugs or suggest features
- Please provide detailed information about the issue, including steps to reproduce, expected behavior, and actual behavior
- Include your environment details (OS, Node.js version, etc.) when reporting bugs
Pull Requests
1. Fork the repository
2. Create a new branch (git checkout -b feature/your-feature-name)
3. Make your changes
4. Run tests to ensure your changes don't break existing functionality
5. Commit your changes (git commit -m 'Add some feature')
6. Push to the branch (git push origin feature/your-feature-name)
7. Open a Pull Request
Development Guidelines
- Follow the existing code style and conventions
- Write clear, commented code
- Include tests for new features
- Update documentation to reflect your changes
Feature Suggestions
If you have ideas for new features or improvements:
- Check existing issues to see if your suggestion has already been proposed
- If not, open a new issue with the label 'enhancement'
- Clearly describe the feature and its potential benefits
Documentation
Improvements to documentation are always welcome:
- Fix typos or clarify existing documentation
- Add examples or use cases
- Improve the structure or organization of the documentation
By contributing to this project, you agree that your contributions will be licensed under the project's MIT License.
License
This project is licensed under the MIT License.