
WebSearch - Advanced Web Search and Content Extraction Tool


A powerful web search and content extraction tool built with Python, leveraging the Firecrawl API for advanced web scraping, searching, and content analysis capabilities.

🚀 Features

  • Advanced Web Search: Perform intelligent web searches with customizable parameters
  • Content Extraction: Extract specific information from web pages using natural language prompts
  • Web Crawling: Crawl websites with configurable depth and limits
  • Web Scraping: Scrape web pages with support for various output formats
  • MCP Integration: Built as a Model Context Protocol (MCP) server for seamless integration
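
Because the project is built as an MCP server, each of the features above is exposed as a tool that Claude can call over the protocol. A minimal skeleton of that arrangement, using the official MCP Python SDK's FastMCP class over stdio, might look like the following (an illustration, not the project's actual main.py):

# Sketch (assumption): skeleton of an MCP server exposing the WebSearch tools.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("websearch")  # the server name Claude for Desktop connects to

# Tools such as search, extract, crawl and scrape are registered on this
# instance with the @mcp.tool() decorator (sketched in the API Reference below).

if __name__ == "__main__":
    mcp.run(transport="stdio")  # Claude for Desktop launches and talks to the server over stdio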

📋 Prerequisites

  • Python 3.8 or higher
  • uv package manager
  • Firecrawl API key
  • OpenAI API key (optional, for enhanced features)
  • Tavily API key (optional, for additional search capabilities)

🛠️ Installation

  1. Install uv:
# On Windows (using pip)
pip install uv

# On Unix/MacOS
curl -LsSf https://astral.sh/uv/install.sh | sh

# Add uv to PATH (Unix/MacOS)
export PATH="$HOME/.local/bin:$PATH"

# Add uv to PATH (Windows - add to Environment Variables)
# Add: %USERPROFILE%\.local\bin
  2. Clone the repository:
git clone https://github.com/yourusername/websearch.git
cd websearch
  3. Create and activate a virtual environment with uv:
# Create virtual environment
uv venv

# Activate on Windows
.\.venv\Scripts\activate.ps1

# Activate on Unix/MacOS
source .venv/bin/activate
  4. Install dependencies with uv:
# Install the project's dependencies (resolved from pyproject.toml / uv.lock)
uv sync
  5. Set up environment variables:
# Create .env file
touch .env

# Add your API keys
FIRECRAWL_API_KEY=your_firecrawl_api_key
OPENAI_API_KEY=your_openai_api_key
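
At startup the server reads these variables from the environment. A minimal sketch of how main.py might load them with python-dotenv (an assumption about the implementation, not the actual code; the variable names match the ones above):

# Sketch (assumption): loading the API keys from .env with python-dotenv.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the project root / current working directory

FIRECRAWL_API_KEY = os.getenv("FIRECRAWL_API_KEY")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")  # optional, for enhanced features

if not FIRECRAWL_API_KEY:
    raise RuntimeError("FIRECRAWL_API_KEY is not set - check your .env file")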

🎯 Usage

Setting Up With Claude for Desktop

Instead of running the server directly, you can configure Claude for Desktop to access the WebSearch tools:

  1. Locate or create your Claude for Desktop configuration file:

    • Windows: %APPDATA%\Claude\claude_desktop_config.json
    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  2. Add the WebSearch server configuration to the mcpServers section:

{
  "mcpServers": {
    "websearch": {
      "command": "uv",
      "args": [
        "--directory",
        "D:\\ABSOLUTE\\PATH\\TO\\WebSearch",
        "run",
        "main.py"
      ]
    }
  }
}
  3. Make sure to replace the directory path with the absolute path to your WebSearch project folder.

  4. Save the configuration file and restart Claude for Desktop.

  5. Once configured, the WebSearch tools will appear in the tools menu (hammer icon) in Claude for Desktop.

Available Tools

  1. Search

  2. Extract Information

  3. Crawl Websites

  4. Scrape Content

📚 API Reference

Search

  • query (str): The search query
  • Returns: Search results in JSON format

Extract

  • urls (List[str]): List of URLs to extract information from
  • prompt (str): Instructions for extraction
  • enableWebSearch (bool): Enable supplementary web search
  • showSources (bool): Include source references
  • Returns: Extracted information in specified format

Crawl

  • url (str): Starting URL
  • maxDepth (int): Maximum crawl depth
  • limit (int): Maximum pages to crawl
  • Returns: Crawled content in markdown/HTML format

Scrape

  • url (str): Target URL
  • Returns: Scraped content with optional screenshots
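
These four tools map closely onto the firecrawl-py client. The sketch below shows one way they might be wired into the MCP server (an assumption, not the project's actual code; methods such as search, extract, crawl_url and scrape_url exist in firecrawl-py, but their exact parameters have changed between SDK versions):

# Sketch (assumption): MCP tool wrappers around firecrawl-py. Adjust the
# Firecrawl calls to the SDK version pinned in this project.
import json
import os
from typing import List

from firecrawl import FirecrawlApp
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("websearch")
app = FirecrawlApp(api_key=os.getenv("FIRECRAWL_API_KEY"))

def _json(data) -> str:
    """Serialize SDK responses (dicts or response objects) to a JSON string."""
    return json.dumps(data, default=str)

@mcp.tool()
def search(query: str) -> str:
    """Search the web and return the results as JSON."""
    return _json(app.search(query))

@mcp.tool()
def extract(urls: List[str], prompt: str, enableWebSearch: bool = False,
            showSources: bool = False) -> str:
    """Extract information from the given URLs according to the prompt."""
    return _json(app.extract(urls, params={"prompt": prompt,
                                           "enableWebSearch": enableWebSearch,
                                           "showSources": showSources}))

@mcp.tool()
def crawl(url: str, maxDepth: int = 2, limit: int = 10) -> str:
    """Crawl a site starting at url, up to maxDepth levels and limit pages."""
    return _json(app.crawl_url(url, params={"maxDepth": maxDepth, "limit": limit}))

@mcp.tool()
def scrape(url: str) -> str:
    """Scrape a single page and return its content."""
    return _json(app.scrape_url(url))

if __name__ == "__main__":
    mcp.run(transport="stdio")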

🔧 Configuration

Environment Variables

The tool requires certain API keys to function. We provide a .env.example file that you can use as a template:

  1. Copy the example file:
# On Unix/MacOS
cp .env.example .env

# On Windows
copy .env.example .env
  2. Edit the .env file with your API keys:
# OpenAI API key - Required for AI-powered features
OPENAI_API_KEY=your_openai_api_key_here

# Firecrawl API key - Required for web scraping and searching
FIRECRAWL_API_KEY=your_firecrawl_api_key_here

Getting the API Keys

  1. OpenAI API Key:

    • Visit OpenAI's platform
    • Sign up or log in
    • Navigate to API keys section
    • Create a new secret key
  2. Firecrawl API Key:

    • Visit Firecrawl's website
    • Create an account
    • Navigate to your dashboard
    • Generate a new API key

With both keys configured, a test search should return a JSON response with search results.
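
One way to confirm this before wiring the server into Claude is a short standalone check against the Firecrawl API (illustrative; the return shape of search depends on the firecrawl-py version installed):

# Sketch (assumption): quick end-to-end check that the Firecrawl key works.
import os

from dotenv import load_dotenv
from firecrawl import FirecrawlApp

load_dotenv()
app = FirecrawlApp(api_key=os.getenv("FIRECRAWL_API_KEY"))
print(app.search("model context protocol"))  # should print search results, not an auth error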

Troubleshooting

If you encounter errors:

  1. Ensure all required API keys are set in your .env file
  2. Verify the API keys are valid and have not expired
  3. Check that the .env file is in the root directory of the project
  4. Make sure the environment variables are being loaded correctly
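
For items 1 and 4, a small diagnostic snippet can show which keys Python actually sees after loading the .env file (assumes python-dotenv, as used elsewhere in this setup):

# Diagnostic sketch: confirm which API keys are visible after loading .env.
import os

from dotenv import load_dotenv

load_dotenv()
for key in ("FIRECRAWL_API_KEY", "OPENAI_API_KEY"):
    print(f"{key}: {'set' if os.getenv(key) else 'MISSING'}")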

🤝 Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Firecrawl for their powerful web scraping API
  • OpenAI for AI capabilities
  • The MCP community for the protocol specification

📬 Contact

José Martín Rodriguez Mortaloni - @m4s1t425 - jmrodriguezm13@gmail.com


Made with ❤️ using Python and Firecrawl
