
Hello Foundry Agent Service with Semantic Kernel as a Function MCP Server

This project calls Azure AI Foundry services through Semantic Kernel code, wraps that code in an Azure Function, and exposes it as a remote MCP server.
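The shape of such a tool can be sketched as a plain Python handler. In the actual function app the handler would be registered with the experimental MCP tool trigger binding and its body would invoke a Semantic Kernel agent backed by AI Foundry; the names and trigger attributes shown in the comments are illustrative assumptions, not the repository's code:

```python
import json

# Hypothetical sketch: in the real function app this handler would be
# registered with the experimental MCP tool trigger, roughly:
#   @app.generic_trigger(arg_name="context", type="mcpToolTrigger",
#                        toolName="hello", description="Say hello",
#                        toolProperties="[]")
# and its body would call a Semantic Kernel agent. Here only the
# trigger-context handling is modeled, with no Azure dependencies.

def hello_mcp(context: str) -> str:
    """Parse the MCP tool-trigger context (a JSON string) and return a reply."""
    args = json.loads(context).get("arguments", {})
    name = args.get("name", "world")
    return f"Hello, {name}!"
```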

This sample template was inspired by:

Architecture

Architecture Diagram

Prerequisites

Setup

  1. Clone this repository and open it in VS Code

  2. Create a Python virtual environment:

    python -m venv .venv
    
  3. Create a deployment in your Azure AI services resource:

    Setup via AZD/Bicep

    Simply run:

    azd provision
    

    Note: all environment variables needed for AI Foundry are written to the .azure/<environment name>/.env file.

    Setup via Portal

    • In the portal, navigate to your Azure OpenAI resource
    • Go to "Model deployments" and click "Create new deployment"
    • Select model name: gpt-4o-mini
    • Select model version: 2024-07-18
    • Give your deployment a name (e.g., chat)
    • Complete the deployment creation
  4. Configure your local.settings.json file with your Azure AI services settings, taking care to set AZURE_AI_INFERENCE_ENDPOINT to the value from step 3:

    {
      "IsEncrypted": false,
      "Values": {
          "AzureWebJobsStorage": "UseDevelopmentStorage=true",
          "FUNCTIONS_WORKER_RUNTIME": "python",
          "AZURE_OPENAI_DEPLOYMENT_NAME": "chat",
          "AZURE_AI_INFERENCE_ENDPOINT": "https://<your AI Services resource>.cognitiveservices.azure.com/",
          "AZURE_OPENAI_API_VERSION": "2024-12-01-preview"
      }
    }
    

    Replace the placeholder values with:

    • AZURE_OPENAI_DEPLOYMENT_NAME: The name you gave to your gpt-4o-mini deployment (defaults to "chat")
    • AZURE_AI_INFERENCE_ENDPOINT: Your Azure AI Inference endpoint URL (required)
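As a quick sanity check before starting the host, you can verify that local.settings.json contains the required values and no leftover placeholders. This helper is an illustrative assumption, not part of the repository:

```python
import json

# Required Values keys from local.settings.json (per the setup step above).
REQUIRED = ("AZURE_OPENAI_DEPLOYMENT_NAME", "AZURE_AI_INFERENCE_ENDPOINT")

def missing_settings(settings_json: str) -> list:
    """Return required keys that are absent, empty, or still placeholders."""
    values = json.loads(settings_json).get("Values", {})
    return [k for k in REQUIRED
            if not values.get(k) or "<" in values.get(k, "")]
```

For example, `missing_settings(open("local.settings.json").read())` returns an empty list when the file is ready to use.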

Running Locally

  1. Install dependencies:

    pip install -r requirements.txt
    
  2. Start the function app:

    func start
    

    Alternatively, in VS Code, press F5 or use the command palette to run "Tasks: Run Task" and select "func: host start".

  3. The function should start, and you'll see a URL for the local HTTP endpoint (typically http://0.0.0.0:7071).

NOTE: if VNetEnabled = True, you must either use a jump box on that VNet or add your local machine's public IP address under Network -> Firewall settings for your AI Services resource. Otherwise you will receive errors such as Exception: HttpResponseError: (403) Public access is disabled. Please configure private endpoint. or Exception: HttpResponseError: (403) Access denied due to Virtual Network/Firewall rules.

Connect to the local MCP server from a client/host

Testing with MCP Inspector

  1. In a new terminal window, install and run MCP Inspector:

    npx @modelcontextprotocol/inspector
    
  2. Ctrl+click to open the MCP Inspector web app at the URL displayed by the app (e.g., http://0.0.0.0:5173/#resources)

  3. In the MCP Inspector interface:

    • Set the transport type to SSE
    • Set the URL to your running Function app's SSE endpoint:
      http://localhost:7071/runtime/webhooks/mcp/sse
      
    • Click Connect

    Note: This step will not work in CodeSpaces.

  4. Once connected, you can:

    • List Tools
    • Click on a tool
    • Run Tool

Testing with VS Code - GitHub Copilot Agent mode

  1. Add the MCP server from the command palette, using the URL of your running Function app's SSE endpoint:

    http://0.0.0.0:7071/runtime/webhooks/mcp/sse
    
  2. List MCP Servers from command palette and start the server

  3. In Copilot chat agent mode, enter a prompt to trigger the tool. For example, select some code and enter this prompt:

    Say hello using mcp tool
    
  4. When prompted to run the tool, consent by clicking Continue

  5. When you're done, press Ctrl+C in the terminal window to stop the Functions host process.

Deploy to Azure for Remote MCP

Run this azd command to provision the function app with any required Azure resources and deploy your code:

azd up

Note: API Management can be used to add security and policies in front of your MCP server, and App Service built-in authentication can be used to set up your favorite OAuth provider, including Entra.

Connect to your remote MCP server function app from a client

Your client will need a key to invoke the hosted SSE endpoint, which has the form https://<funcappname>.azurewebsites.net/runtime/webhooks/mcp/sse. By default, the hosted function requires a system key, which can be obtained from the portal or the CLI (az functionapp keys list --resource-group <resource_group> --name <function_app_name>). Obtain the system key named mcp_extension.
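For clients that pass the key as a query parameter, the keyed endpoint URL can be assembled as follows; the function-app name and key are placeholders you supply:

```python
from urllib.parse import urlencode

def keyed_sse_url(func_app_name: str, system_key: str) -> str:
    """Build the hosted SSE endpoint URL with the mcp_extension system key."""
    base = f"https://{func_app_name}.azurewebsites.net/runtime/webhooks/mcp/sse"
    return f"{base}?{urlencode({'code': system_key})}"
```

Using urlencode ensures the key is safely escaped if it contains URL-reserved characters.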

Connect to remote MCP server in MCP Inspector

For MCP Inspector, you can include the key in the URL:

https://<funcappname>.azurewebsites.net/runtime/webhooks/mcp/sse?code=<your-mcp-extension-system-key>

Connect to remote MCP server in VS Code - GitHub Copilot

For GitHub Copilot within VS Code, you should set the key as the x-functions-key header in mcp.json, and you would use https://<funcappname>.azurewebsites.net/runtime/webhooks/mcp/sse for the URL. The following example is from the mcp.json file included in this repository and uses an input to prompt you to provide the key when you start the server from VS Code.
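Based on the description above, the mcp.json entry typically looks roughly like the following; the input ids are assumptions, so check the actual file in the repository:

```json
{
  "inputs": [
    {
      "type": "promptString",
      "id": "functionapp-name",
      "description": "Azure Functions app name"
    },
    {
      "type": "promptString",
      "id": "functions-mcp-extension-system-key",
      "description": "Azure Functions MCP extension system key",
      "password": true
    }
  ],
  "servers": {
    "remote-mcp-function": {
      "type": "sse",
      "url": "https://${input:functionapp-name}.azurewebsites.net/runtime/webhooks/mcp/sse",
      "headers": {
        "x-functions-key": "${input:functions-mcp-extension-system-key}"
      }
    }
  }
}
```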

  1. Click Start on the server remote-mcp-function, inside the mcp.json file:

  2. Enter the name of the function app that you created in the Azure Portal, when prompted by VS Code.

  3. Enter the Azure Functions MCP Extension System Key into the prompt. You can copy this from the Azure portal for your function app by going to the Functions menu item, then App Keys, and copying the mcp_extension key from the System Keys.

  4. In Copilot chat agent mode, enter a prompt to trigger the tool. For example, select some code and enter this prompt:

    Say Hello using MCP tool
    

Redeploy your code

You can run the azd up command as many times as you need to both provision your Azure resources and deploy code updates to your function app.

[!NOTE] Deployed code files are always overwritten by the latest deployment package.

Clean up resources

When you're done working with your function app and related resources, you can use this command to delete the function app and its related resources from Azure and avoid incurring any further costs:

azd down

Authentication

This function uses Microsoft Entra ID (formerly Azure Active Directory) for authentication via DefaultAzureCredential. Make sure you're logged in with the Azure CLI or have appropriate credentials configured.

Troubleshooting

  • If you encounter errors about missing environment variables, ensure your local.settings.json file has the correct values.
  • If authentication fails, run az login to log in with your Azure credentials.
  • If the MCP Inspector cannot connect, verify that your function app is running and the endpoint URL is correct.
  • Irregular behavior, 404 "resource not found" errors, and other failures can occur if AZURE_OPENAI_API_VERSION is set to a version too old for these SDKs. It is recommended to set "AZURE_OPENAI_API_VERSION": "2024-12-01-preview" (or later) both in local.settings.json locally and in the Environment Variables of your deployed function app (azd up's Bicep files do this for you by default).

