
AI Assistant Chat with Nmap Tool Integration

This project provides a web-based chat interface using Gradio where users can interact with an AI assistant powered by the OpenAI API. The assistant is equipped with tools to interact with the local filesystem and perform network scans using a containerized Nmap server.

Overview

The application is built on the OpenAI Agents SDK. User requests are processed by an AI agent that can reason about the request and decide whether to use the available tools. It features:

  • A Gradio frontend for easy interaction.
  • An AI agent backend leveraging an OpenAI model (requires API key).
  • A Model Context Protocol (MCP) server for filesystem access (using @modelcontextprotocol/server-filesystem).
  • A containerized MCP server providing Nmap scanning capabilities (ping, port scans, service discovery, SMB share enumeration).

The Nmap server runs inside a Docker container for easy dependency management and isolation.
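
A minimal sketch of how this wiring might look with the Agents SDK, assuming the openai-agents package and the nmap-mcp-server image built in the setup steps below; the run_agent name, the instructions text, and the example prompt are illustrative, and the real app.py may differ:

    import asyncio

    from agents import Agent, Runner
    from agents.mcp import MCPServerStdio

    async def run_agent(user_message: str) -> str:
        # Filesystem access over stdio via npx, scoped to the current directory.
        filesystem = MCPServerStdio(params={
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
        })
        # Nmap toolkit served from the Docker image built during setup.
        nmap = MCPServerStdio(params={
            "command": "docker",
            "args": ["run", "-i", "--rm", "nmap-mcp-server"],
        })
        async with filesystem, nmap:
            agent = Agent(
                name="Assistant",
                instructions="Answer the user; use the filesystem and Nmap tools when they help.",
                mcp_servers=[filesystem, nmap],
            )
            result = await Runner.run(agent, user_message)
            return result.final_output

    if __name__ == "__main__":
        print(asyncio.run(run_agent("Ping-scan 192.168.1.0/24 and summarize the hosts.")))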

Features

  • Conversational AI assistant.
  • Filesystem access tool (scoped to the application directory).
  • Network scanning tools via Nmap (sketched after this list):
    • ping_host
    • scan_network (top 100 ports)
    • all_scan_network (comprehensive -A scan)
    • all_ports_scan_network (all 65535 ports)
    • smb_share_enum_scan (SMB share enumeration)
  • Web-based UI using Gradio.
  • Containerized Nmap tool server using Docker.
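
For reference, a rough sketch of how two of these tools might be defined in nmap-server.py, assuming the official mcp Python SDK (FastMCP) and a plain subprocess call to nmap; the actual implementation may differ:

    import subprocess

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("nmap-toolkit")

    @mcp.tool()
    def ping_host(target: str) -> str:
        """Ping-scan a host or CIDR range (nmap -sn) and return the raw output."""
        result = subprocess.run(["nmap", "-sn", target],
                                capture_output=True, text=True, timeout=120)
        return result.stdout

    @mcp.tool()
    def scan_network(target: str) -> str:
        """Scan the top 100 TCP ports of a target (nmap --top-ports 100)."""
        result = subprocess.run(["nmap", "--top-ports", "100", target],
                                capture_output=True, text=True, timeout=600)
        return result.stdout

    if __name__ == "__main__":
        # Serve the tools over stdio so a docker-run client can attach.
        mcp.run(transport="stdio")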

Architecture

  1. Gradio UI (app.py): Handles user input and displays conversation history (a minimal sketch follows this list).
  2. Main Application (app.py):
    • Initializes the Gradio interface.
    • Manages conversation state.
    • Sets up and manages the MCP servers.
    • Instantiates and runs the OpenAI Agent.
  3. OpenAI Agent (agents library): Processes user messages, calls tools when needed, and generates responses.
  4. MCP Servers:
    • Filesystem Server: Runs via npx to provide local file access.
    • Nmap Toolkit Server (nmap-server.py in Docker): Runs inside a Docker container, exposing Nmap scan functions as tools via MCP. app.py uses docker run to start this server for each request.
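
On the UI side, the chat loop can be as small as the following sketch, assuming a recent Gradio and reusing the run_agent coroutine from the Overview sketch above; the respond name is illustrative, and the real app.py may handle conversation state differently:

    import gradio as gr

    async def respond(message: str, history: list) -> str:
        # `history` holds the prior turns as role/content dicts when
        # type="messages"; here only the new message is forwarded to
        # `run_agent`, the coroutine from the Overview sketch above.
        return await run_agent(message)

    demo = gr.ChatInterface(fn=respond, type="messages",
                            title="AI Assistant with Nmap Tools")

    if __name__ == "__main__":
        demo.launch()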

Prerequisites

  • Python: 3.9+
  • Docker: Latest version installed and running.
  • Node.js/npm: Required for npx to run the filesystem MCP server.
  • OpenAI API Key: Set as an environment variable OPENAI_API_KEY.

Installation & Setup

  1. Clone the repository:

    git clone <your-repository-url>
    cd <your-repository-directory>
    
  2. Set OpenAI API Key: Export your API key as an environment variable. Replace your_api_key_here with your actual key.

    • Linux/macOS:
      export OPENAI_API_KEY='your_api_key_here'
      
    • Windows (Command Prompt):
      set OPENAI_API_KEY=your_api_key_here
      
    • Windows (PowerShell):
      $env:OPENAI_API_KEY='your_api_key_here'
      
  3. Build the Nmap Docker Image: Navigate to the directory containing nmap-server.py and Dockerfile, then run:

    docker build -t nmap-mcp-server .
    

    (Ensure the Dockerfile content is correct, especially the MCP package name if it is not modelcontextprotocol; an illustrative sketch follows this list.)

  4. Install Python Dependencies: It's recommended to use a virtual environment.

    python -m venv venv
    # Activate the virtual environment
    # Linux/macOS:
    source venv/bin/activate
    # Windows:
    .\venv\Scripts\activate
    # Install requirements
    pip install -r requirements.txt
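
As noted in step 3, an illustrative Dockerfile sketch; the base image, the MCP package name (mcp here), and the file layout are assumptions to verify against the repository:

    FROM python:3.11-slim

    # Nmap itself must be present inside the container.
    RUN apt-get update && apt-get install -y --no-install-recommends nmap \
        && rm -rf /var/lib/apt/lists/*

    WORKDIR /app
    COPY nmap-server.py .

    # Install the MCP Python package the server imports (adjust the name if needed).
    RUN pip install --no-cache-dir mcp

    # The server speaks MCP over stdio, so `docker run -i` can attach to it.
    CMD ["python", "nmap-server.py"]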
    

Running the Application

Ensure your OpenAI API key is set, Docker is running, and you are in the project's root directory with the virtual environment activated.

    python app.py
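
Gradio prints a local URL when it starts (http://127.0.0.1:7860 by default); open it in a browser to chat with the assistant. From there you can, for example, ask it to list the files in the application directory or to ping-scan a host on your own network, and the agent will pick the matching tool.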
