Bifrost - VSCode Dev Tools MCP Server

2025-04-11

A VS Code extension with an MCP server that exposes semantic tools, such as find usages and rename, to LLMs.

GitHub: 1 watcher · 11 forks · 66 stars
This VS Code extension provides a Model Context Protocol (MCP) server that exposes VSCode's powerful development tools and language features to AI tools. It enables advanced code navigation, analysis, and manipulation capabilities when using AI coding assistants that support the MCP protocol.


Features

  • Language Server Integration: Access VSCode's language server capabilities for any supported language
  • Code Navigation: Find references, definitions, implementations, and more
  • Symbol Search: Search for symbols across your workspace
  • Code Analysis: Get semantic tokens, document symbols, and type information
  • Smart Selection: Use semantic selection ranges for intelligent code selection
  • Code Actions: Access refactoring suggestions and quick fixes
  • HTTP/SSE Server: Exposes language features over an MCP-compatible HTTP server
  • AI Assistant Integration: Ready to work with AI assistants that support the MCP protocol

Usage

Cline Installation

  • Step 1: Install Supergateway
  • Step 2: Add the config below to Cline
  • Step 3: The server may show up red in Cline, but it appears to work fine
{
  "mcpServers": {
    "Bifrost": {
      "command": "cmd",
      "args": [
        "/c",
        "npx",
        "-y",
        "supergateway",
        "--sse",
        "http://localhost:8008/sse"
      ],
      "disabled": false,
      "autoApprove": [],
      "timeout": 600
    }
  }
}
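The configuration above is written for Windows (it wraps npx in cmd /c). On macOS or Linux, a roughly equivalent setup (an untested sketch, not from the original docs) would invoke npx directly:

{
  "mcpServers": {
    "Bifrost": {
      "command": "npx",
      "args": [
        "-y",
        "supergateway",
        "--sse",
        "http://localhost:8008/sse"
      ],
      "disabled": false,
      "autoApprove": [],
      "timeout": 600
    }
  }
}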

Roo Code Installation

  • Step 1: Add the SSE config to your global or project-based MCP configuration
{
  "mcpServers": {
    "Bifrost": {
      "url": "http://localhost:8008/sse"
    }
  }
}


Follow this video to install and use the extension with Cursor. I have also provided sample rules that can be used in .cursorrules files for better results.

Example Cursor Rules
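The sample rules themselves are not reproduced here; as a rough, hypothetical illustration (not the author's actual rules), a .cursorrules file might include guidance along these lines:

Before editing code, call the Bifrost MCP tools (for example find_usages, go_to_definition, and get_document_symbols) to gather real symbol information instead of guessing.
When refactoring, prefer get_rename_locations and get_code_actions over manual text edits.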

For newer versions of Cursor, use this configuration:

{
  "mcpServers": {
    "Bifrost": {
      "url": "http://localhost:8008/sse"
    }
  }
}

Multiple Project Support

When working with multiple projects, each project can have its own dedicated MCP server endpoint and port. This is useful when you have multiple VS Code windows open or are working with multiple projects that need language server capabilities.

Project Configuration

Create a bifrost.config.json file in your project root:

{
    "projectName": "MyProject",
    "description": "Description of your project",
    "path": "/my-project",
    "port": 5642
}

The server will use this configuration to:

  • Create project-specific endpoints (e.g., http://localhost:5642/my-project/sse; a client config example follows this list)
  • Provide project information to AI assistants
  • Use a dedicated port for each project
  • Isolate project services from other running instances
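For example, with the MyProject configuration above, an MCP client that accepts SSE URLs (like the Roo Code setup shown earlier) would point at the project-specific endpoint; a minimal sketch:

{
  "mcpServers": {
    "MyProject": {
      "url": "http://localhost:5642/my-project/sse"
    }
  }
}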

Example Configurations

  1. Backend API Project:
{
    "projectName": "BackendAPI",
    "description": "Node.js REST API with TypeScript",
    "path": "/backend-api",
    "port": 5643
}
  2. Frontend Web App:
{
    "projectName": "FrontendApp",
    "description": "React frontend application",
    "path": "/frontend-app",
    "port": 5644
}

Port Configuration

Each project should specify its own unique port to avoid conflicts when multiple VS Code instances are running:

  • The port field in bifrost.config.json determines which port the server will use
  • If no port is specified, it defaults to 8008 for backwards compatibility
  • Choose different ports for different projects to ensure they can run simultaneously
  • The server will fail to start if the configured port is already in use, requiring you to either:
    • Free up the port
    • Change the port in the config
    • Close the other VS Code instance using that port

Connecting to Project-Specific Endpoints

Update your AI assistant configuration to use the project-specific endpoint and port:

{
  "mcpServers": {
    "BackendAPI": {
      "url": "http://localhost:5643/backend-api/sse"
    },
    "FrontendApp": {
      "url": "http://localhost:5644/frontend-app/sse"
    }
  }
}

Backwards Compatibility

If no bifrost.config.json is present, the server will use the default configuration:

  • Port: 8008
  • SSE endpoint: http://localhost:8008/sse
  • Message endpoint: http://localhost:8008/message

This maintains compatibility with existing configurations and tools.

Available Tools

The extension provides access to many VSCode language features including:

  • find_usages: Locate all symbol references.
  • go_to_definition: Jump to symbol definitions instantly.
  • find_implementations: Discover implementations of interfaces/abstract methods.
  • get_hover_info: Get rich symbol docs on hover.
  • get_document_symbols: Outline all symbols in a file.
  • get_completions: Context-aware auto-completions.
  • get_signature_help: Function parameter hints and overloads.
  • get_rename_locations: Find all locations that would need to change to rename a symbol safely.
  • get_code_actions: Quick fixes, refactors, and improvements.
  • get_semantic_tokens: Enhanced highlighting data.
  • get_call_hierarchy: See incoming/outgoing call relationships.
  • get_type_hierarchy: Visualize class and interface inheritance.
  • get_code_lens: Inline insights (references, tests, etc.).
  • get_selection_range: Smart selection expansion for code blocks.
  • get_type_definition: Jump to underlying type definitions.
  • get_declaration: Navigate to symbol declarations.
  • get_document_highlights: Highlight all occurrences of a symbol.
  • get_workspace_symbols: Search symbols across your entire workspace.

Requirements

  • Visual Studio Code version 1.93.0 or higher
  • Appropriate language extensions for the languages you want to work with (e.g., C# extension for C# files)

Installation

  1. Install this extension from the VS Code marketplace
  2. Install any language-specific extensions you need for your development
  3. Open your project in VS Code

Usage

The extension will automatically start an MCP server when activated. To configure an AI assistant to use this server:

  1. The server runs on port 8008 by default
  2. Configure your MCP-compatible AI assistant to connect to the following endpoints (example configuration below):
    • SSE endpoint: http://localhost:8008/sse
    • Message endpoint: http://localhost:8008/message
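For clients that accept an SSE URL directly (as in the Roo Code configuration above), the default setup looks like this:

{
  "mcpServers": {
    "Bifrost": {
      "url": "http://localhost:8008/sse"
    }
  }
}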

Available Commands

  • Bifrost MCP: Start Server - Manually start the MCP server on port 8008
  • Bifrost MCP: Start Server on port - Manually start the MCP server on a specified port
  • Bifrost MCP: Stop Server - Stop the running MCP server
  • Bifrost MCP: Open Debug Panel - Open the debug panel to test available tools



Example Tool Usage

Find References

{
  "name": "find_usages",
  "arguments": {
    "textDocument": {
      "uri": "file:///path/to/your/file"
    },
    "position": {
      "line": 10,
      "character": 15
    },
    "context": {
      "includeDeclaration": true
    }
  }
}

Workspace Symbol Search

{
  "name": "get_workspace_symbols",
  "arguments": {
    "query": "MyClass"
  }
}
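Go to Definition

This example assumes go_to_definition takes the same textDocument/position arguments as find_usages; treat it as an illustrative sketch rather than the documented schema.

{
  "name": "go_to_definition",
  "arguments": {
    "textDocument": {
      "uri": "file:///path/to/your/file"
    },
    "position": {
      "line": 10,
      "character": 15
    }
  }
}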

Debugging

Use the Bifrost MCP: Open Debug Panel command to test the available tools.

Troubleshooting

If you encounter issues:

  1. Ensure you have the appropriate language extensions installed for your project
  2. Check that your project has loaded correctly in VSCode
  3. Verify that port 8008 is available on your system
  4. Check the VSCode output panel for any error messages

Contributing

Here are VS Code's commands; if you want to add additional functionality, go ahead. I think we still need rename and a few others. Please feel free to submit issues or pull requests to the GitHub repository.

License

This extension is licensed under the AGPL-3.0 License.

