
This README is an agentic workflow

AI Tools for Developers

Agentic AI workflows enabled by Docker containers.

Just Docker. Just Markdown. BYOLLM.

MCP

Any prompts you write and their tools can now be used as MCP servers

Use serve mode with the --mcp flag, then register prompts by git ref or local path with --register <ref>

# ...
serve
--mcp
--register github:docker/labs-ai-tools-for-devs?path=prompts/examples/generate_dockerfile.md
--register /Users/ai-overlordz/some/local/prompt.md
# ...
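
Putting it together, a full serve-mode invocation might look like the following. This is a sketch only, assuming the same image, Docker socket mount, and prompts volume used by the run command later in this README; check the docs for the exact flags.

docker run --rm -i \
  --pull=always \
  -v /var/run/docker.sock:/var/run/docker.sock \
  --mount type=volume,source=docker-prompts,target=/prompts \
  vonwig/prompts:latest \
    serve \
    --mcp \
    --register github:docker/labs-ai-tools-for-devs?path=prompts/examples/generate_dockerfile.md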

overall architecture diagram preview

Source for many experiments in our LinkedIn newsletter

VSCode Extension

Docs

What is this?

This is a simple Docker image which enables infinite possibilities for novel workflows by combining Dockerized Tools, Markdown, and the LLM of your choice.

Markdown is the language

Humans already speak it. So do LLMs. This software allows you to write complex workflows in a markdown file, and then run them with your own LLM in your editor or terminal... or any environment, thanks to Docker.
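
As a rough illustration only: a workflow file can be as small as a pair of role sections, mirroring the prompt embedded at the end of this README (front matter for tools or models is omitted, and the heading syntax is assumed to match that prompt).

# prompt system
You are an expert at writing Dockerfiles.

# prompt user
Write a Dockerfile for this project.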

Dockerized Tools

dockerized tools

OpenAI API-compatible LLMs already support tool calling. We believe these tools could just be Docker images (a sketch of what that looks like follows the list below). Based on our research, some of the benefits of using Docker are enabling the LLM to:

  • take more complex actions
  • get more context with fewer tokens
  • work across a wider range of environments
  • operate in a sandboxed environment
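
As a sketch of the idea only, not this project's actual tool catalog or schema: when the model calls a tool, the runner can translate that call into a sandboxed container run. The image and arguments below are purely illustrative.

# The model asks for a tool named "list-files" with argument "/project";
# conceptually, the runner executes it as a throwaway container:
docker run --rm \
  -v "$PWD":/project:ro \
  --workdir /project \
  alpine:latest \
  ls -la /project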

Conversation Loop

The conversation loop is the core of each workflow. Tool results, agent responses, and of course the markdown prompts are all passed through the loop. If an agent sees an error, it will retry the tool with different parameters, or even try different tools, until it gets the right result.

Multi-Model Agents

Each prompt can be configured to be run with different LLM models, or even different model families. This allows you to use the best tool for the job. When you combine these tools, you can create multi-agent workflows where each agent runs with the model best suited for that task.

With Docker, it is possible to have frontier models plan, while lightweight local models execute.
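
As an illustrative sketch only: model selection typically sits next to the prompt itself, for example as front matter at the top of the markdown file. The keys shown here (model, url) are assumptions about the schema; consult the docs for what is actually supported.

---
model: gpt-4                     # e.g. a frontier model for a planning prompt
url: http://localhost:11434/v1   # or point at a local OpenAI-compatible endpoint for execution
---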

Project-First Design

To get help from an assistant in your software development loop, the only context necessary is the project you are working on.

Extracting project context

extractor architecture

An extractor is a Docker image that runs against a project and extracts information into a JSON context.
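
Conceptually, an extractor can be exercised on its own: mount the project, run the image, and capture the JSON it emits. The image name below is hypothetical; the real extractors and their interfaces are documented elsewhere.

# Hypothetical extractor image: reads the mounted project and prints a JSON context
docker run --rm \
  -v "$PWD":/project:ro \
  example/project-extractor:latest /project > context.json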

Prompts as a trackable artifact

prompts as a trackable artifact

Prompts are stored in a git repo and can be versioned, tracked, and shared for anyone to run in their own environment.
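
The git references used with --register and --prompts elsewhere in this README follow this pattern:

github:<owner>/<repo>?ref=<branch>&path=<path/to/prompt>
# ref is optional; e.g. github:docker/labs-githooks?ref=main&path=prompts/git_hooks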

Get Started

We highly recommend using the VSCode extension to get started. It will help you create prompts, and run them with your own LLM.

Running your first loop

VSCode

Install Extension

Get the latest release and install with

code --install-extension 'labs-ai-tools-vscode-<version>.vsix'

Running:

  1. Open an existing markdown file, or create a new markdown file in VSCode.

You can even run this markdown file directly!

  2. Run command >Docker AI: Set OpenAI API Key to set an OpenAI API key, or use a dummy value for local models.

  3. Run command >Docker AI: Select target project to select a project to run the prompt against.

  4. Run command >Docker AI: Run Prompt to start the conversation loop.

CLI

These instructions assume you have a terminal open and Docker Desktop running.

  1. Set OpenAI key
echo $OPENAI_API_KEY > $HOME/.openai-api-key

Note: we assume this file exists, so you must set a dummy value for local models.

  2. Run the container in your project directory
docker run \
  --rm \
  --pull=always \
  -it \
  -v /var/run/docker.sock:/var/run/docker.sock \
  --mount type=volume,source=docker-prompts,target=/prompts \
  --mount type=bind,source=$HOME/.openai-api-key,target=/root/.openai-api-key \
  vonwig/prompts:latest \
    run \
    --host-dir $PWD \
    --user $USER \
    --platform "$(uname -o)" \
    --prompts "github:docker/labs-githooks?ref=main&path=prompts/git_hooks"

See the docs for more details on how to run the conversation loop.

Building

#docker:command=build

docker build -t vonwig/prompts:local -f Dockerfile .
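
To try the local build, you can reuse the run command from above with the local tag; a sketch (drop --pull=always so Docker does not replace your local image with the published one):

docker run --rm -it \
  -v /var/run/docker.sock:/var/run/docker.sock \
  --mount type=volume,source=docker-prompts,target=/prompts \
  --mount type=bind,source=$HOME/.openai-api-key,target=/root/.openai-api-key \
  vonwig/prompts:local \
    run \
    --host-dir $PWD \
    --user $USER \
    --platform "$(uname -o)" \
    --prompts "github:docker/labs-githooks?ref=main&path=prompts/git_hooks"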

Now, for the agentic workflow...

prompt system

You are an expert at reading readmes.

Use curl to get the readme for https://github.com/docker/labs-ai-tools-for-devs before answering the following questions.

prompt user

What is this project?
