LinkedIn Profile Scraper MCP Server
This MCP server uses the Fresh LinkedIn Profile Data API to fetch LinkedIn profile information. It is implemented as a Model Context Protocol (MCP) server and exposes a single tool, get_profile, which accepts a LinkedIn profile URL and returns the profile data in JSON format.
Features
- Fetch Profile Data: Retrieves LinkedIn profile information including skills, with most additional details disabled (illustrated below).
- Asynchronous HTTP Requests: Uses httpx for non-blocking API calls.
- Environment-based Configuration: Reads the RAPIDAPI_KEY from your environment variables using dotenv.
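As a rough illustration of the "most additional details disabled" behaviour: the Fresh LinkedIn Profile Data API takes boolean-style query flags, so a skills-only request might send parameters like the ones below. The flag names are assumptions, not taken from this repository; check linkedin.py for the exact parameters it uses.

# Hypothetical query parameters for a skills-only profile request (flag names are assumptions)
params = {
    "linkedin_url": "https://www.linkedin.com/in/example/",
    "include_skills": "true",
    "include_certifications": "false",
    "include_publications": "false",
}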
Prerequisites
- Python 3.7+: Ensure you are using Python version 3.7 or higher.
- MCP Framework: Make sure the MCP framework is installed.
- Required Libraries: Install httpx, python-dotenv, and other dependencies.
- RAPIDAPI_KEY: Obtain an API key from RapidAPI and add it to a .env file in your project directory (or set it in your environment).
Installation
- Clone the Repository:
  git clone https://github.com/codingaslu/Linkedin_Mcp_Server
  cd Linkedin_Mcp_Server
- Install Dependencies:
  uv add mcp[cli] httpx requests
- Set Up Environment Variables: Create a .env file in the project directory with the following content (a quick way to confirm the key loads is sketched after these steps):
  RAPIDAPI_KEY=your_rapidapi_key_here
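To confirm the key is picked up before starting the server, a small standalone check (a hypothetical helper, not part of the repository) could look like this:

# check_env.py – hypothetical helper, not included in the repository
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory
print("RAPIDAPI_KEY found:", bool(os.getenv("RAPIDAPI_KEY")))

Run it from the project directory, for example with uv run check_env.py.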
Running the Server
To run the MCP server, execute:
uv run linkedin.py
The server will start and listen for incoming requests via standard I/O.
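If the mcp[cli] extra installed above is available, you can also try the tool interactively in the MCP Inspector before wiring up a client (assuming the SDK's dev command works in your setup):

uv run mcp dev linkedin.py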
MCP Client Configuration
To connect your MCP client to this server, add the following configuration to your config.json. Adjust the paths as necessary for your environment; a programmatic alternative is shown after the example:
{
  "mcpServers": {
    "linkedin_profile_scraper": {
      "command": "C:/Users/aiany/.local/bin/uv",
      "args": [
        "--directory",
        "C:/Users/aiany/OneDrive/Desktop/linkedin-mcp/project",
        "run",
        "linkedin.py"
      ]
    }
  }
}
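As an alternative to a config file, the server can also be exercised programmatically. The sketch below uses the official MCP Python SDK's stdio client to spawn the server and call get_profile; the tool's argument name (linkedin_url here) is an assumption, so match it to the actual signature in linkedin.py.

# call_profile.py – hypothetical client-side check, run from the project directory
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Spawn the server over stdio, the same way the config above does
    server = StdioServerParameters(command="uv", args=["run", "linkedin.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "linkedin_url" is an assumed argument name; check linkedin.py
            result = await session.call_tool(
                "get_profile",
                {"linkedin_url": "https://www.linkedin.com/in/example/"},
            )
            print(result)

asyncio.run(main())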
Code Overview
- Environment Setup: The server uses dotenv to load the RAPIDAPI_KEY required to authenticate with the Fresh LinkedIn Profile Data API.
- API Call: The asynchronous function get_linkedin_data makes a GET request to the API with the specified query parameters.
- MCP Tool: The get_profile tool wraps the API call and returns formatted JSON data, or an error message if the call fails.
- Server Execution: The MCP server is run with the stdio transport; a minimal sketch of this structure follows.
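The following is a minimal sketch of that structure, assuming the FastMCP helper from the MCP Python SDK and the usual RapidAPI host/endpoint layout for the Fresh LinkedIn Profile Data API. The endpoint path, query flags, and error strings are assumptions; treat linkedin.py in the repository as the source of truth.

# linkedin_sketch.py – illustrative only; see linkedin.py for the real implementation
import json
import os
from typing import Optional

import httpx
from dotenv import load_dotenv
from mcp.server.fastmcp import FastMCP

load_dotenv()  # pull RAPIDAPI_KEY from a local .env file, if present

RAPIDAPI_KEY = os.getenv("RAPIDAPI_KEY")
if not RAPIDAPI_KEY:
    raise ValueError("RAPIDAPI_KEY is not set")  # matches the behaviour described in Troubleshooting

# Assumed RapidAPI host and endpoint for the Fresh LinkedIn Profile Data API
API_HOST = "fresh-linkedin-profile-data.p.rapidapi.com"
API_URL = f"https://{API_HOST}/get-linkedin-profile"

mcp = FastMCP("linkedin_profile_scraper")

async def get_linkedin_data(linkedin_url: str) -> Optional[dict]:
    """GET the profile from the API; return None if the request fails."""
    headers = {"x-rapidapi-key": RAPIDAPI_KEY, "x-rapidapi-host": API_HOST}
    params = {"linkedin_url": linkedin_url, "include_skills": "true"}
    async with httpx.AsyncClient(timeout=30) as client:
        response = await client.get(API_URL, headers=headers, params=params)
        if response.status_code != 200:
            return None
        return response.json()

@mcp.tool()
async def get_profile(linkedin_url: str) -> str:
    """Fetch a LinkedIn profile and return it as formatted JSON."""
    data = await get_linkedin_data(linkedin_url)
    if data is None:
        return "Unable to fetch profile data for the given URL."
    return json.dumps(data, indent=2)

if __name__ == "__main__":
    mcp.run(transport="stdio")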
Troubleshooting
- Missing RAPIDAPI_KEY: If the key is not set, the server will raise a ValueError. Make sure the key is added to your .env file or set in your environment.
- API Errors: If the API request fails, the tool will return a message indicating that the profile data could not be fetched.
License
This project is licensed under the MIT License. See the LICENSE file for more details.