Agents are the future, and if you haven’t started building them already, you probably will soon. Across industries and use cases, agents can offload repetitive work because they act on our behalf with judgment and context.
But while agentic development is moving fast, today it’s tedious, hard, and not fun: you need to iterate quickly across different prompts and models (both frontier models and local/open models), you need to find and connect MCP tools to internal data securely, and you need to declaratively package everything so that others can run your agent. And it all needs to be built once and run anywhere: on your laptop, in CI, or in production.
These problems are not new: they are what Docker was originally conceived for. It’s not an overstatement to say that once upon a time, Docker made microservices possible, and today we’re excited to share how we’re evolving Docker for the era of agents.
Launching today: Compose enters the agent era
Starting today, Docker makes it easy to build, ship, and run agents and agentic applications. Docker Compose launched a decade ago and solved the problem of how to build and describe multi-container applications. It’s used and loved by millions of developers every day, which is why we’re excited to announce that Compose now includes agent building blocks.
Now, with just a `compose.yaml`, you can define your open models, agents, and MCP-compatible tools, then spin up your full agentic stack with a simple `docker compose up`. From dev to production (more on this later), your agents are wired, connected, and ready to run.
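As a sketch of what that can look like (the service layout and model name here are illustrative, and the exact attribute names may vary by Compose version), a minimal `compose.yaml` might wire an agent to an open model like this:

```yaml
# Sketch of a minimal agentic stack. "agent" is your own code;
# "ai/qwen3" is an illustrative open-weight model from Docker Hub.
services:
  agent:
    build: .                            # containerized agent, any framework
    models:
      llm:
        endpoint_var: OPENAI_BASE_URL   # OpenAI-compatible URL injected here
        model_var: OPENAI_MODEL         # model identifier injected here

models:
  llm:
    model: ai/qwen3
```

Running `docker compose up` then starts the model alongside the agent container, with the endpoint and model name handed to the agent as environment variables, so existing OpenAI-compatible clients can point at it unchanged.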
And that’s not all. Compose also integrates seamlessly with today’s most popular agentic frameworks:
- LangGraph – Define your LangGraph workflow, wrap it as a service, plug it into `compose.yaml`, and run the full graph with `docker compose up`. Try the LangGraph tutorial.
- Embabel – Use Compose to connect models, embed tools, and get a complete Embabel environment running. Explore the quickstart guide.
- Vercel AI SDK – Compose makes it easy to stand up supporting agents and services locally. Check out the Vercel AI examples.
- Spring AI – Use Compose to spin up vector stores, model endpoints, and agents alongside your Spring AI backend. View the Spring AI samples.
- CrewAI – Compose lets you containerize CrewAI agents. Try the CrewAI Getting Started guide.
- Google’s ADK – Easily deploy your ADK-based agent stack with Docker Compose: agents, tools, and routing layers all defined in a single file. Try our example.
- Agno – Use Compose to run your Agno-based agents and tools effortlessly. Explore the Agno example.
But the power of the new Docker Compose goes beyond SDKs: it’s deeply integrated with Docker’s broader suite of AI features.
Docker’s MCP Catalog gives you instant access to a growing library of trusted, plug-and-play tools for your agents. No need to dig through repos, worry about compatibility, or wire things up manually. Just drop what you need into your Compose file and you’re up and running.
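As one plausible shape for that wiring (the gateway image and attributes follow Docker’s public compose-for-agents examples, but treat the server name as an assumption), MCP tools can be exposed through a gateway that runs as just another service:

```yaml
# Sketch: exposing MCP Catalog tools to agents via Docker's MCP gateway.
# The server list is illustrative; swap in the tools your agents need.
services:
  mcp-gateway:
    image: docker/mcp-gateway
    use_api_socket: true        # lets the gateway pull MCP servers on demand
    command:
      - --transport=sse
      - --servers=duckduckgo    # tool(s) from the MCP Catalog
```

Agents then reach every enabled tool through the gateway’s single endpoint instead of wiring up each MCP server by hand.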
Docker Model Runner lets you pull open-weight LLMs directly from Docker Hub, run them locally, and interact with them via built-in OpenAI-compatible endpoints, so your existing SDKs and libraries just work, no rewrites, no retooling. And they run with full GPU acceleration. But what if you don’t have enough local resources?
Introducing Docker Offload: Cloud power, local simplicity
When building agents, local resource limits shouldn’t slow you down. That’s why we’re introducing Docker Offload, a truly seamless way to run your models and containers on a cloud GPU.
Docker Offload frees you from infrastructure constraints by offloading compute-intensive workloads, like large language models and multi-agent orchestration, to high-performance cloud environments. No complex setup, no GPU shortages, no configuration headaches.
With native integration into Docker Desktop and Docker Engine, Docker Offload gives you a one-click path from Compose to cloud. Build, test, and scale your agentic applications just like you always have locally, while Docker handles the heavy lifting behind the scenes. It’s the same simple `docker compose up` experience, now supercharged with the power of the cloud.
And to get you started, we’re offering 300 minutes of free Offload usage. Try it out, build your agents, and scale effortlessly from your laptop to the cloud.
Compose is now production-ready with Google Cloud and Microsoft Azure
Last, but certainly not least, we’ve worked hard to make sure that the exact same Compose file you used during development works in production, with no rewrites and no reconfiguration.
We’re proud to announce new integrations with Google Cloud Run and Microsoft Azure Container Apps that allow Docker Compose to specify a serverless architecture. For example, with Google Cloud, you can deploy your agentic app directly to a serverless environment using the new `gcloud run compose up` command. And we’re working closely with Microsoft to bring this seamless experience to Azure as well.
From the first line of YAML to production deployment, Compose makes the entire journey consistent, portable, and effortless, just the way agentic development should be.
Let’s Compose the future. Together.
The future of software is agentic, where every developer builds goal-driven, multi-LLM agents that reason, plan, and act across a rich ecosystem of tools and services.
With Docker Compose, Docker Offload, Docker’s broader AI capabilities, and our partnerships with Google, Microsoft, and Agent SDKs, we’re making that future accessible to, and easy for, everyone.
In short: Docker is the easiest way to build, run, and scale intelligent agents, from development to production.
We can’t wait to see what you create.
Resources
- Docker is simplifying Agent Development
- Explore the capabilities of Docker Offload
- Learn more about our AI Agent: Ask Gordon
- Build Agentic Apps with Docker Compose
- Learn more about Docker Model Runner