
# hello-sk-function-mcp

Hello Foundry Agent Service with Semantic Kernel as a Function MCP Server.
This project uses Microsoft Foundry AI services with Semantic Kernel code, wrapped with an Azure Function and exposed as a remote MCP server.
This sample template was inspired by:

- https://learn.microsoft.com/en-us/azure/ai-foundry/how-to/develop/semantic-kernel
- https://aka.ms/mcp-remote
## Architecture

## Prerequisites

- Azure Functions Core Tools
- Python 3.10+
- VS Code with the Azure Functions extension
- Azure AI Services resource with deployed models
- Node.js and npm (for running MCP Inspector)
## Setup

1. Clone this repository and open it in VS Code.

2. Create a Python virtual environment:

   ```bash
   python -m venv .venv
   ```

3. Create a deployment in your Azure AI Services resource.

   **Setup via AZD/Bicep**

   Simply run:

   ```bash
   azd provision
   ```

   > **Note:** All environment variables needed for AI Foundry are output into the `.env` file under `/.azure/`.

   **Setup via Portal**

   - In the portal, navigate to your Azure OpenAI resource
   - Go to "Model deployments" and click "Create new deployment"
   - Select model name: `gpt-4o-mini`
   - Select model version: `2024-07-18`
   - Give your deployment a name (e.g., `chat`)
   - Complete the deployment creation

4. Configure your `local.settings.json` file with your Azure AI Services values, being careful to set `AZURE_AI_INFERENCE_ENDPOINT` using the value from step 3:

   ```json
   {
     "IsEncrypted": false,
     "Values": {
       "AzureWebJobsStorage": "UseDevelopmentStorage=true",
       "FUNCTIONS_WORKER_RUNTIME": "python",
       "AZURE_OPENAI_DEPLOYMENT_NAME": "chat",
       "AZURE_AI_INFERENCE_ENDPOINT": "https://<your AI Services resource>.cognitiveservices.azure.com/",
       "AZURE_OPENAI_API_VERSION": "2024-12-01-preview"
     }
   }
   ```
Replace the placeholder values with:

- `AZURE_OPENAI_DEPLOYMENT_NAME`: The name you gave your gpt-4o-mini deployment (defaults to `"chat"`)
- `AZURE_AI_INFERENCE_ENDPOINT`: Your Azure AI Inference endpoint URL (required)
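Before starting the host, it can save a debugging round-trip to verify the settings file has no missing values or unreplaced `<placeholders>`. The check below is a convenience sketch using only the standard library; it is not part of the project, and the key list simply mirrors the sample configuration above:

```python
import json

# Settings keys the sample configuration above relies on.
REQUIRED_KEYS = [
    "AzureWebJobsStorage",
    "FUNCTIONS_WORKER_RUNTIME",
    "AZURE_OPENAI_DEPLOYMENT_NAME",
    "AZURE_AI_INFERENCE_ENDPOINT",
    "AZURE_OPENAI_API_VERSION",
]

def missing_settings(settings_json: str) -> list:
    """Return required keys that are absent, empty, or still contain <placeholders>."""
    values = json.loads(settings_json).get("Values", {})
    missing = []
    for key in REQUIRED_KEYS:
        value = values.get(key, "")
        # Treat unreplaced "<...>" placeholders the same as missing values.
        if not value or ("<" in value and ">" in value):
            missing.append(key)
    return missing

# Example: the endpoint below still contains a placeholder, so it is flagged.
sample = json.dumps({"Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AZURE_OPENAI_DEPLOYMENT_NAME": "chat",
    "AZURE_AI_INFERENCE_ENDPOINT": "https://<your resource>.cognitiveservices.azure.com/",
    "AZURE_OPENAI_API_VERSION": "2024-12-01-preview",
}})
print(missing_settings(sample))  # ['AZURE_AI_INFERENCE_ENDPOINT']
```

Point it at the contents of your `local.settings.json` and fix anything it reports before running `func start`.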
## Running Locally

1. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

2. Start the function app:

   ```bash
   func start
   ```

   Alternatively, in VS Code, press `F5` or use the command palette to run "Tasks: Run Task" and select "func: host start".

3. The function should start, and you'll see a URL for the local HTTP endpoint (typically http://0.0.0.0:7071).

> **NOTE:** If `VNetEnabled = True`, you must remember to either use a jump box on that VNet, or add your local machine's public IP address to the Networking -> Firewall settings for your AI Services resource; otherwise you will receive errors like:
>
> ```
> Exception: HttpResponseError: (403) Public access is disabled. Please configure private endpoint.
> ```
>
> or
>
> ```
> Exception: HttpResponseError: (403) Access denied due to Virtual Network/Firewall rules.
> ```
## Connect to the local MCP server from a client/host

### Testing with MCP Inspector

1. In a new terminal window, install and run MCP Inspector:

   ```bash
   npx @modelcontextprotocol/inspector
   ```

2. CTRL+click to load the MCP Inspector web app from the URL displayed by the app (e.g., http://0.0.0.0:5173/#resources).

3. In the MCP Inspector interface:

   - Set the transport type to **SSE**
   - Set the URL to your running Function app's SSE endpoint: `http://localhost:7071/runtime/webhooks/mcp/sse`
   - Click **Connect**

   Note: This step will not work in Codespaces.

4. Once connected, you can:

   - **List Tools**
   - Click on a tool
   - **Run Tool**
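Under the hood, the SSE transport that the Inspector connects over is plain text: the server streams `event:`/`data:` lines, with a blank line terminating each event. The minimal parser below illustrates that wire format (it is a sketch for understanding only; real MCP clients handle this for you, and the session URI shown is illustrative):

```python
def parse_sse(stream_text: str) -> list:
    """Split a raw Server-Sent Events stream into (event, data) pairs."""
    events = []
    event_name, data_lines = "message", []
    for line in stream_text.splitlines():
        if line.startswith("event:"):
            event_name = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "":  # a blank line terminates one event
            if data_lines:
                events.append((event_name, "\n".join(data_lines)))
            event_name, data_lines = "message", []
    return events

# On connect, an SSE-based MCP server first advertises the endpoint that the
# client should POST its messages to.
raw = (
    "event: endpoint\n"
    "data: /runtime/webhooks/mcp/message?sessionId=abc123\n"
    "\n"
)
print(parse_sse(raw))  # [('endpoint', '/runtime/webhooks/mcp/message?sessionId=abc123')]
```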
### Testing with VS Code - GitHub Copilot agent mode

1. Add an MCP server from the command palette and set the URL to your running Function app's SSE endpoint:

   ```
   http://0.0.0.0:7071/runtime/webhooks/mcp/sse
   ```

2. List MCP servers from the command palette and start the server.

3. In Copilot chat agent mode, enter a prompt to trigger the tool, e.g., select some code and enter this prompt:

   ```
   Say hello using mcp tool
   ```

4. When prompted to run the tool, consent by clicking **Continue**.

5. When you're done, press Ctrl+C in the terminal window to stop the Functions host process.
## Deploy to Azure for Remote MCP

Run this azd command to provision the function app, along with any required Azure resources, and deploy your code:

```bash
azd up
```

> **Note:** API Management can be used for improved security and policies over your MCP server, and App Service built-in authentication can be used to set up your favorite OAuth provider, including Entra.
## Connect to your remote MCP server function app from a client

Your client will need a key in order to invoke the hosted SSE endpoint, which will be of the form `https://<funcappname>.azurewebsites.net/runtime/webhooks/mcp/sse`. The hosted function requires a system key by default, which can be obtained from the portal or the CLI (`az functionapp keys list --resource-group <resource_group> --name <function_app_name>`). Obtain the system key named `mcp_extension`.
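If you script this step, the key can be pulled out of the CLI's JSON output. The sketch below assumes `az functionapp keys list` emits an object with a `systemKeys` map containing `mcp_extension`; the key value shown is fake and the function names are our own:

```python
import json
import subprocess

def extract_mcp_key(keys_json: str) -> str:
    """Pull the mcp_extension system key out of `az functionapp keys list` output."""
    return json.loads(keys_json)["systemKeys"]["mcp_extension"]

def get_mcp_extension_key(resource_group: str, app_name: str) -> str:
    """Run the CLI and extract the key (requires a prior `az login`)."""
    out = subprocess.run(
        ["az", "functionapp", "keys", "list",
         "--resource-group", resource_group, "--name", app_name],
        capture_output=True, text=True, check=True,
    ).stdout
    return extract_mcp_key(out)

# Example of the assumed output shape (key value is fake):
sample_keys = '{"functionKeys": {}, "masterKey": "m", "systemKeys": {"mcp_extension": "abc123=="}}'
print(extract_mcp_key(sample_keys))  # abc123==
```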
### Connect to the remote MCP server in MCP Inspector

For MCP Inspector, you can include the key in the URL:

```
https://<funcappname>.azurewebsites.net/runtime/webhooks/mcp/sse?code=<your-mcp-extension-system-key>
```
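If you compose that URL programmatically, make sure the key is query-escaped, since system keys frequently contain `=` and `/`. A small stdlib helper, with a hypothetical app name and key:

```python
from urllib.parse import urlencode

def sse_url_with_key(func_app_name: str, system_key: str) -> str:
    """Build the hosted SSE endpoint URL with the key as the `code` query parameter."""
    base = f"https://{func_app_name}.azurewebsites.net/runtime/webhooks/mcp/sse"
    return f"{base}?{urlencode({'code': system_key})}"

print(sse_url_with_key("myfuncapp", "abc/123=="))
# https://myfuncapp.azurewebsites.net/runtime/webhooks/mcp/sse?code=abc%2F123%3D%3D
```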
### Connect to the remote MCP server in VS Code - GitHub Copilot

For GitHub Copilot within VS Code, set the key as the `x-functions-key` header in `mcp.json`, and use `https://<funcappname>.azurewebsites.net/runtime/webhooks/mcp/sse` for the URL. The mcp.json file included in this repository uses an input to prompt you for the key when you start the server from VS Code.

1. Click **Start** on the `remote-mcp-function` server inside the mcp.json file.

2. Enter the name of the function app that you created in the Azure Portal, when prompted by VS Code.

3. Enter the **Azure Functions MCP Extension System Key** into the prompt. You can copy this from the Azure portal for your function app by going to the Functions menu item, then App Keys, and copying the `mcp_extension` key from the System Keys.

4. In Copilot chat agent mode, enter a prompt to trigger the tool, e.g., select some code and enter this prompt:

   ```
   Say Hello using MCP tool
   ```
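For reference, such an `mcp.json` generally follows this shape: an `inputs` entry that prompts for the key, and a server entry that sends it as the `x-functions-key` header. The field values below are an illustrative sketch, not copied from this repository's file:

```json
{
  "inputs": [
    {
      "type": "promptString",
      "id": "functions-mcp-extension-system-key",
      "description": "Azure Functions MCP Extension System Key",
      "password": true
    },
    {
      "type": "promptString",
      "id": "functionapp-name",
      "description": "Azure Functions App Name"
    }
  ],
  "servers": {
    "remote-mcp-function": {
      "type": "sse",
      "url": "https://${input:functionapp-name}.azurewebsites.net/runtime/webhooks/mcp/sse",
      "headers": {
        "x-functions-key": "${input:functions-mcp-extension-system-key}"
      }
    }
  }
}
```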
## Redeploy your code

You can run the `azd up` command as many times as you need to both provision your Azure resources and deploy code updates to your function app.

> [!NOTE]
> Deployed code files are always overwritten by the latest deployment package.

## Clean up resources

When you're done working with your function app and related resources, you can use this command to delete them from Azure and avoid incurring any further costs:

```bash
azd down
```
## Authentication

This function uses Microsoft Entra ID (formerly Azure Active Directory) for authentication via `DefaultAzureCredential`. Make sure you're logged in with the Azure CLI or have appropriate credentials configured.
## Troubleshooting

- If you encounter errors about missing environment variables, ensure your `local.settings.json` file has the correct values.
- If authentication fails, run `az login` to log in with your Azure credentials.
- If the MCP Inspector cannot connect, verify that your function app is running and the endpoint URL is correct.
- Irregular behaviors, 404 "resource not found" errors, and more will happen if `AZURE_OPENAI_API_VERSION` is set to too old a version for these SDKs. It is recommended to set `"AZURE_OPENAI_API_VERSION": "2024-12-01-preview"` (or later) in local.settings.json locally, and in the Environment Variables of your deployed Azure Function (this is done for you by default by `azd up`'s Bicep files).
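Since these API version strings are date-based, a quick local sanity check can flag a too-old value before you hit confusing 404s. This is a convenience sketch, with the minimum set to the version recommended above:

```python
from datetime import date

MIN_API_VERSION = "2024-12-01-preview"

def api_version_date(version: str) -> date:
    """Parse the leading YYYY-MM-DD of an Azure OpenAI API version string."""
    year, month, day = (int(part) for part in version.split("-")[:3])
    return date(year, month, day)

def api_version_ok(version: str) -> bool:
    """True if the version is at least the recommended minimum."""
    return api_version_date(version) >= api_version_date(MIN_API_VERSION)

print(api_version_ok("2024-06-01"))          # False
print(api_version_ok("2024-12-01-preview"))  # True
```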