The Rise of Docker AI Agent and Model Context Protocol
I recently had the opportunity to collaborate with Raveendiran RR, a Docker Community Speaker and Generative AI enthusiast, to present on this exciting topic at Cloud-Native LLMOps Day in Bengaluru. Together, we explored the transformative potential of Model Context Protocol in modern AI development workflows, sharing insights with the vibrant tech community. This blog post expands on the key concepts we discussed during our presentation.
In today's rapidly evolving AI landscape, developers face numerous challenges when integrating AI capabilities into their applications. One of the most promising solutions to address these challenges is the Model Context Protocol (MCP), which is gaining significant traction, especially when paired with container technology. In this blog post, we'll explore what MCP is, why it matters, and how Docker's AI Agent (Gordon) leverages this protocol to provide a seamless AI development experience.
What is Model Context Protocol (MCP)?
The Model Context Protocol (MCP) is a standardized communication protocol that facilitates seamless integration between Large Language Models (LLMs) and external tools. As AI applications become increasingly sophisticated, they need to interact with various data sources, APIs, and software components. MCP provides a consistent way for these systems to communicate, eliminating the fragmentation that previously existed across different AI platforms.
At its core, MCP addresses a fundamental challenge: each LLM provider (like OpenAI, Google's Gemini, Anthropic's Claude, etc.) has developed its own approach to function and tool calling. This fragmentation creates unnecessary complexity for developers who want to leverage AI capabilities across different models. MCP serves as a unifying standard, allowing developers to write integrations once and deploy them across multiple AI platforms.
The Evolution of Generative AI
The journey toward MCP reflects the broader evolution of generative AI: from standalone chat models, to function and tool calling, to autonomous agents that need a standardized way to reach external systems.
Understanding AI Agents
Before diving deeper into MCP, it's important to understand what AI agents are. An AI agent is an autonomous system with several key capabilities: it can perceive its context, reason about a goal and plan a course of action, invoke external tools to act on its environment, and evaluate the results so it can adjust its next step.
This agent-based approach represents a shift from passive AI assistants to active AI systems that can take initiative in solving problems.
Agent Design Patterns
AI agents can be organized in several patterns; commonly cited ones include reflection (the agent critiques and revises its own output), tool use, planning, and multi-agent collaboration, where specialized agents coordinate on a task.
Agents become significantly more powerful when equipped with tools that extend their capabilities, similar to how Batman or Superman leverage their tools and abilities to solve problems.
The Need for a Standard Protocol
As AI tools proliferated, each major LLM provider (OpenAI, Google, Anthropic, etc.) developed their own methods for function and tool calling. This fragmentation created a complex ecosystem where developers needed to adapt their code for each platform. MCP emerged as a solution, providing a standard protocol that defines how tools should be used across different AI platforms.
What Makes the Model Context Protocol Unique?
MCP offers several key advantages: a single integration works across LLM providers, tools are discovered dynamically rather than hard-coded, and the tool layer is decoupled from the model layer, so either side can evolve independently.
The protocol operates through a client-server architecture: a host application (such as an AI assistant or IDE) embeds an MCP client, which connects to one or more MCP servers; each server exposes a set of tools, resources, and prompts that the model can use.
MCP Message Types and Communication Flow
MCP uses three primary message types for communication: requests (which expect a response), responses (carrying a result or an error), and notifications (one-way messages that expect no reply), all encoded as JSON-RPC 2.0.
The communication workflow typically involves the client initializing a session and negotiating capabilities with the server, discovering the tools the server exposes, forwarding the model's tool invocations to the server, and returning the results to the model so it can continue reasoning.
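As a sketch of what this looks like on the wire: MCP messages are JSON-RPC 2.0, and a tool invocation uses the `tools/call` method. The tool name and arguments below are illustrative, modeled on the time server used later in this post:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get_current_time",
    "arguments": { "timezone": "Asia/Kolkata" }
  }
}
```

The server replies with a result message whose `id` matches the request, carrying the tool output as content the model can read:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "content": [{ "type": "text", "text": "2025-03-01T18:30:00+05:30" }]
  }
}
```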
MCP Client Requirements
For effective operation, MCP clients must handle several core functions: establishing and maintaining connections to servers, discovering and listing the tools each server exposes, routing tool calls and their arguments to the right server, and surfacing results (or errors) back to the model.
Current Challenges with MCP Servers
Despite its advantages, implementing MCP servers presents several challenges: conflicting language runtimes and dependencies on the host machine, multi-step manual setup, a lack of isolation between servers and the host, and inconsistent behaviour across operating systems.
How Docker Addresses These Challenges
This is where Docker's containerization technology provides significant value: each MCP server ships as a container image with its runtime and dependencies bundled, starts with a single command, runs isolated from the host, and behaves identically on any platform that runs Docker.
By containerizing MCP servers, Docker addresses the environmental conflicts, simplifies setup, improves isolation, and enhances cross-platform compatibility.
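As a minimal sketch, a containerized MCP server is just a Compose-style service declaration; the `mcp/time` image below is the same one that appears in the full configuration later in this post:

```yaml
services:
  time:
    image: mcp/time   # server, runtime, and dependencies all bundled in one isolated image
```

Because everything the server needs lives inside the image, no Python or Node toolchain has to exist on the host, and two servers with conflicting dependencies can run side by side.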
Introducing Docker AI Agent (Project Gordon)
Taking this integration a step further, Docker has developed an AI assistant called Gordon that is integrated directly into Docker Desktop and the Docker CLI. Gordon provides context-aware assistance throughout the container lifecycle, available from the docker ai command and from the Docker Desktop interface.
Key features include natural-language container management, Dockerfile analysis and optimization, and access to external tools via MCP servers declared in a gordon-mcp.yml file.
Gordon's workflow follows an agentic pipeline: it interprets your natural-language request, selects the built-in capabilities or MCP tools needed to fulfil it, executes those tool calls (you can see each one being invoked in the output), and summarizes the results.
Getting Started with MCP on Docker Desktop
To begin using MCP with Docker, enable the Docker AI Agent in Docker Desktop, create a gordon-mcp.yml file (a standard Docker Compose file) in your working directory declaring the MCP servers you want as services, and run docker ai from that directory.
A sample MCP configuration might look like:
```yaml
services:
  time:
    image: mcp/time
  postgres:
    image: mcp/postgres
    command: postgresql://postgres:dev@host.docker.internal:5433/postgres
  git:
    image: mcp/git
    volumes:
      - /Users/username:/Users/username
  gh:
    image: mcp/github
    environment:
      GITHUB_PERSONAL_ACCESS_TOKEN: ${GITHUB_PERSONAL_ACCESS_TOKEN}
  fetch:
    image: mcp/fetch
```

This configuration makes time services, PostgreSQL database access, Git repository management, GitHub API access, and web fetching capabilities available to your AI agent.
Getting Started
Greeting Gordon

```shell
docker ai "how are you doing?"
```

Listing all the containers

Prompt #1

```shell
docker ai "list all the containers running on my system in a tabular format"
```

Prompt #2

```shell
docker ai "list all the containers running on my system in a tabular format and highlight the ones that are consuming the maximum space"
```

Dockerfile Optimisation
Clone the repo

```shell
git clone https://github.com/ajeetraina/todo-list/
cd todo-list/build
```

Build the image with the name "huge"

```shell
docker build -t huge .
```

Note the size of the Docker image: 1.8 GB.
Let's ask Gordon to optimise this image.

Prompt

```shell
docker ai "please optimise this Docker image"
```

Gordon creates a new Dockerfile and keeps the old one as Dockerfile.bak. You can also ask:

```shell
docker ai "can you optimise my Dockerfile"
```
The RUN command for npm install now includes --mount=type=cache,target=/root/.npm. This uses Docker BuildKit's cache-mount feature to persist npm's package cache in the /root/.npm directory across builds, so unchanged dependencies are not re-downloaded.
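Putting the cache-mount change together with the instructions shown elsewhere in this walkthrough, the optimised Dockerfile looks roughly like this. Treat it as a reconstruction, not Gordon's verbatim output:

```dockerfile
# syntax=docker/dockerfile:1
FROM node:21-alpine
WORKDIR /app
COPY package*.json ./
# BuildKit cache mount: npm's download cache in /root/.npm survives across builds
RUN --mount=type=cache,target=/root/.npm npm install --production
COPY . .
EXPOSE 3000
CMD ["node", "src/index.js"]
```

The cache mount only speeds up rebuilds; the image-size win comes from the alpine base and from installing production dependencies only.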
```diff
diff Dockerfile Dockerfile
1c1
< FROM node:21-alpine
---
> FROM node:21
4d3
<
6,7c5
< RUN npm install --production
<
---
> RUN npm install
9d6
<
11d7
<
12a9
>
```

Let's rebuild it again with the name "small"
```shell
docker build -t small .
docker images
```

```
REPOSITORY   TAG       IMAGE ID       CREATED          SIZE
small        latest    052adc5729e8   7 minutes ago    377MB
huge         latest    6bcd991ba3e2   30 minutes ago   1.83GB
```

You can see that Gordon optimised the size.
Optimisation using Multi-stage Build
```shell
docker ai "can you optimise using Multi-stage build"
```

It creates the following Dockerfile:

```dockerfile
FROM node:21-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm install --production
COPY . .

FROM node:21-alpine
WORKDIR /app
COPY --from=builder /app /app
EXPOSE 3000
CMD ["node", "src/index.js"]
```

Let's build it with the name "extra-small"
```shell
docker build -t extra-small .
docker images
```

```
REPOSITORY    TAG       IMAGE ID       CREATED          SIZE
extra-small   latest    41868a6e197f   3 minutes ago    235MB
small         latest    052adc5729e8   18 minutes ago   377MB
huge          latest    6bcd991ba3e2   41 minutes ago   1.83GB
```

Gordon and MCP
Assuming that you have cloned a repo containing a gordon-mcp.yml file with the following content:
```yaml
services:
  time:
    image: mcp/time
  postgres:
    image: mcp/postgres
    command: postgresql://postgres:dev@host.docker.internal:5433/postgres
  git:
    image: mcp/git
    volumes:
      - /Users/ajeetsraina:/Users/ajeetsraina
  github:
    image: mcp/github
    environment:
      - GITHUB_PERSONAL_ACCESS_TOKEN=${GITHUB_PERSONAL_ACCESS_TOKEN}
  fetch:
    image: mcp/fetch
  fs:
    image: mcp/filesystem
    command:
      - /rootfs
    volumes:
      - .:/rootfs
```

List all the MCP Tools
```shell
docker ai mcp
```

```
Initializing time
Initializing fs
Initializing postgres
Initializing fetch
Initializing git
Initializing github
Initialized fs
Initialized postgres
Initialized github
...
```

Github MCP Server
Ensure that you add your GitHub personal access token (PAT) to ~/.zshrc like so:
```shell
export GITHUB_PERSONAL_ACCESS_TOKEN='XXX'
```

Next, source the shell:

```shell
source ~/.zshrc
```

Creating a new GitHub Repo

Prompt

```shell
docker ai "can you create a github repo called sensor-analytics, add a README file with random sensor values that includes temp, pressure and humidity in a tabular format"
```

Prompt
```shell
docker ai "can you fetch dockerlabs.collabnix.com and write the summary to a file tests.txt"
```

```
• Calling fetch ✔️
• Calling write_file ✔️
• Calling list_allowed_directories ✔️
• Calling write_file ✔️
```

The summary of DockerLabs has been successfully written to the file /rootfs/tests.txt. Let me know if you need further assistance!

Validating
```shell
cat tests.txt
```

DockerLabs is a comprehensive learning platform for Docker enthusiasts, offering resources for beginners, intermediate, and advanced users. It features over 500 interactive tutorials and guides, accessible via Docker Desktop or browser. Key highlights include community engagement through Slack and Discord, a GitHub repository for contributions, and a variety of blog posts and articles on Docker-related topics. The platform also provides hands-on labs covering Docker core concepts, advanced features, and industry use cases. Additionally, it offers workshops for beginners, tutorials on Dockerfile creation, and guidance on managing Docker containers and volumes.

Using Postgres
```shell
docker run -d --name postgres1 -e POSTGRES_PASSWORD=dev -p 5432:5432 postgres:latest
docker run -d --name postgres2 -e POSTGRES_PASSWORD=dev -p 5433:5432 postgres:13
docker run -d --name postgres3 -e POSTGRES_PASSWORD=dev -p 5434:5432 postgres:12
```
```sql
-- Create a table for Users
CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    email VARCHAR(100) UNIQUE NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Create a table for Orders
CREATE TABLE orders (
    id SERIAL PRIMARY KEY,
    user_id INT REFERENCES users(id) ON DELETE CASCADE,
    product_name VARCHAR(100) NOT NULL,
    price DECIMAL(10,2) NOT NULL,
    order_date TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Create a table for Products
CREATE TABLE products (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    description TEXT,
    price DECIMAL(10,2) NOT NULL,
    stock INT NOT NULL DEFAULT 0
);
```

Query the list of tables
```sql
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'public';
```

Using Ask Gordon
```shell
docker ai "list of all tables in the postgres database running in a postgres container named postgres2"
```

Conclusion
The Model Context Protocol represents a significant step forward in standardizing AI tool integration. When combined with Docker's containerization technology and AI agent capabilities, it creates a powerful, flexible foundation for developing AI-enhanced applications.
By addressing the challenges of environment conflicts, cross-platform compatibility, and setup complexity, Docker makes MCP more accessible to developers. The integration of Docker's AI Agent (Gordon) with MCP servers further streamlines the development experience, providing context-aware assistance throughout the container lifecycle.
As the AI landscape continues to evolve, standards like MCP will become increasingly important for ensuring interoperability and reducing fragmentation. Docker's support for this protocol positions it as a key player in the emerging AI development ecosystem.