UseScraper MCP Server
This is a TypeScript-based MCP server that provides web scraping capabilities using the UseScraper API. It exposes a single tool 'scrape' that can extract content from web pages in various formats.
Features

Tools

- scrape - Extract content from a webpage
  - Parameters:
    - url (required): The URL of the webpage to scrape
    - format (optional): The format to save the content (text, html, markdown). Default: markdown
    - advanced_proxy (optional): Use advanced proxy to circumvent bot detection. Default: false
    - extract_object (optional): Object specifying data to extract
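As a concrete illustration, a full set of arguments for the 'scrape' tool might look like the sketch below. The keys inside extract_object ("title", "price") are invented for illustration only; consult the UseScraper documentation for the actual extraction schema.

```typescript
// Illustrative arguments object for the 'scrape' tool.
// The extract_object keys ("title", "price") are hypothetical examples,
// not a documented schema.
const scrapeArgs = {
  url: "https://example.com/products",
  format: "markdown" as const,
  advanced_proxy: false,
  extract_object: {
    title: "the product title",
    price: "the listed price",
  },
};

// Print the arguments as they would appear in a tool call.
console.log(JSON.stringify(scrapeArgs, null, 2));
```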
Installation
Installing via Smithery
To install UseScraper for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install usescraper-server --client claude
Manual Installation
- Clone the repository:
  git clone https://github.com/your-repo/usescraper-server.git
  cd usescraper-server
- Install dependencies:
  npm install
- Build the server:
  npm run build
Configuration
To use with Claude Desktop, add the server config:
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "usescraper-server": {
      "command": "node",
      "args": ["/path/to/usescraper-server/build/index.js"],
      "env": {
        "USESCRAPER_API_KEY": "your-api-key-here"
      }
    }
  }
}
Replace /path/to/usescraper-server with the actual path to the server and your-api-key-here with your UseScraper API key.
Usage
Once configured, you can use the 'scrape' tool through the MCP interface. Example usage:
{
  "name": "scrape",
  "arguments": {
    "url": "https://example.com",
    "format": "markdown"
  }
}
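To make the parameter rules above concrete, here is a small TypeScript sketch of client-side argument validation mirroring the documented schema. The server performs its own validation, so this function and its error messages are purely illustrative.

```typescript
// Shape of the documented 'scrape' tool arguments.
type ScrapeArgs = {
  url: string;
  format?: "text" | "html" | "markdown";
  advanced_proxy?: boolean;
  extract_object?: Record<string, unknown>;
};

// Illustrative validator: returns a list of problems, empty if the
// arguments satisfy the documented rules (url required; format, if
// given, must be one of text/html/markdown).
function validateScrapeArgs(args: ScrapeArgs): string[] {
  const errors: string[] = [];
  if (!args.url) errors.push("url is required");
  if (args.format && !["text", "html", "markdown"].includes(args.format)) {
    errors.push("format must be text, html, or markdown");
  }
  return errors;
}

// A minimal valid call needs only a url; format defaults to markdown.
console.log(validateScrapeArgs({ url: "https://example.com" }).length); // 0
```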
Development
For development with auto-rebuild:
npm run watch
Debugging
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
npm run inspector
The Inspector will provide a URL to access debugging tools in your browser.