usescraper-mcp-server
MCP server for the UseScraper API: a server that scrapes a URL to text, HTML, or markdown.
UseScraper MCP Server
This is a TypeScript-based MCP server that provides web scraping capabilities using the UseScraper API. It exposes a single tool 'scrape' that can extract content from web pages in various formats.
Features
Tools
- scrape - Extract content from a webpage
  - Parameters:
    - url (required): The URL of the webpage to scrape
    - format (optional): The format to save the content (text, html, markdown). Default: markdown
    - advanced_proxy (optional): Use advanced proxy to circumvent bot detection. Default: false
    - extract_object (optional): Object specifying the data to extract
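To make the parameter defaults concrete, here is a minimal sketch of how such arguments could be normalized before calling the API. The interface and function names are hypothetical illustrations, not the server's actual code:

```typescript
// Illustrative only: applies the documented defaults for the 'scrape' tool.
// 'ScrapeArgs' and 'withDefaults' are hypothetical names, not the server's real code.
interface ScrapeArgs {
  url: string;
  format?: "text" | "html" | "markdown";
  advanced_proxy?: boolean;
  extract_object?: Record<string, unknown>;
}

function withDefaults(args: ScrapeArgs): ScrapeArgs {
  return {
    url: args.url,
    format: args.format ?? "markdown",            // documented default
    advanced_proxy: args.advanced_proxy ?? false, // documented default
    extract_object: args.extract_object,
  };
}

// Only 'url' is required; the optional fields fall back to their defaults.
console.log(JSON.stringify(withDefaults({ url: "https://example.com" })));
```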
Installation
Installing via Smithery
To install UseScraper for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install usescraper-server --client claude
Manual Installation
- Clone the repository:
  git clone https://github.com/your-repo/usescraper-server.git
  cd usescraper-server
- Install dependencies:
  npm install
- Build the server:
  npm run build
Configuration
To use with Claude Desktop, add the server config:
On MacOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "usescraper-server": {
      "command": "node",
      "args": ["/path/to/usescraper-server/build/index.js"],
      "env": {
        "USESCRAPER_API_KEY": "your-api-key-here"
      }
    }
  }
}
Replace /path/to/usescraper-server with the actual path to the server and your-api-key-here with your UseScraper API key.
Usage
Once configured, you can use the 'scrape' tool through the MCP interface. Example usage:
{
  "name": "scrape",
  "arguments": {
    "url": "https://example.com",
    "format": "markdown"
  }
}
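A call that exercises the optional parameters might look like the following. Note that the shape of extract_object here is a hypothetical illustration; consult the UseScraper API documentation for the actual extraction schema:

```json
{
  "name": "scrape",
  "arguments": {
    "url": "https://example.com/pricing",
    "format": "text",
    "advanced_proxy": true,
    "extract_object": { "price": "the monthly price shown on the page" }
  }
}
```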
Development
For development with auto-rebuild:
npm run watch
Debugging
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
npm run inspector
The Inspector will provide a URL to access debugging tools in your browser.