Running a Chat UI Agent doesn’t have to involve a complicated setup. The Vercel AI SDK gives you a simple yet powerful framework for building conversational UIs, handling streaming responses, and managing multi-turn interactions. Pair it with Docker and you get a portable, production-ready Chat UI Agent that runs the same way on your laptop, in staging, and in production. Add Docker Compose, and the entire stack comes online with a single command, whether you’re experimenting locally or moving toward production.
We’ll start with the Next.js AI Chatbot template from Vercel, then containerize it using a battle-tested Dockerfile from a demo repo. That way you don’t just get a demo; you get a production-ready deployment.
One command, and your Chat UI is live.
Why this setup works
- Next.js 15: Modern App Router, API routes, and streaming.
- Vercel AI SDK: Simple React hooks and streaming utilities for chat UIs (sketched just below).
- Docker (standalone build): Optimized for production — lean image size, fast startup, and reliable deployments.
This stack covers both developer experience and production readiness.
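To make those hooks concrete, here’s a minimal client-side sketch. The import path and hook shape follow AI SDK 4.x (ai/react); newer major versions move the hook to @ai-sdk/react, so treat this as illustrative rather than a drop-in for the template:

'use client';
// Minimal chat component: useChat manages the input state, message history,
// and streaming updates, posting to /api/chat by default.
import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();
  return (
    <div>
      {messages.map((m) => (
        <p key={m.id}>
          {m.role}: {m.content}
        </p>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}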
Step 1: Clone the template
Start with the official Vercel chatbot template:
npx create-next-app@latest chat-ui-agent --example https://github.com/vercel/ai-chatbot
This scaffolds a full-featured chatbot using the Vercel AI SDK.
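On the server side, the chat endpoint is a streaming route handler built on the SDK. Here’s a rough sketch of its shape (helper names follow AI SDK 4.x and the model id is illustrative; the template’s actual route does considerably more, such as auth and persistence):

// app/api/chat/route.ts: a minimal AI SDK streaming route.
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();
  // Streams tokens from the model; the provider reads OPENAI_API_KEY
  // from the environment automatically.
  const result = streamText({
    model: openai('gpt-4o-mini'), // illustrative model id
    messages,
  });
  return result.toDataStreamResponse();
}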
Step 2: Configure API keys
Create a .env.local file in the root:
OPENAI_API_KEY=your_openai_key_here
Swap in your provider key if you’re using Anthropic or another backend.
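For example, the AI SDK’s Anthropic provider looks for ANTHROPIC_API_KEY by default, so switching backends is one variable in .env.local plus the matching provider import in your route:

ANTHROPIC_API_KEY=your_anthropic_key_here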
Step 3: Add the production Dockerfile
Instead of writing your own Dockerfile, grab the optimized version from Kristiyan Velkov’s repo and save it as Dockerfile in your project root (a condensed sketch of the pattern follows the list below).
This file:
- Uses multi-stage builds.
- Creates a standalone Next.js build.
- Keeps the image lightweight and fast for production.
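To give a feel for what you’re grabbing, here’s a condensed sketch of the standard multi-stage standalone pattern (simplified from Vercel’s with-docker example and assuming npm; the repo’s version is more thorough). It relies on output: 'standalone' being enabled in the Next.js config, covered at the end of this post:

# Stage 1: install dependencies (assumes npm; adjust for pnpm/yarn lockfiles)
FROM node:20-alpine AS deps
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci

# Stage 2: compile the standalone Next.js build
FROM node:20-alpine AS builder
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .
RUN npm run build

# Stage 3: copy only the standalone output into a slim runtime image
FROM node:20-alpine AS runner
WORKDIR /app
ENV NODE_ENV=production PORT=3000 HOSTNAME=0.0.0.0
COPY --from=builder /app/public ./public
COPY --from=builder /app/.next/standalone ./
COPY --from=builder /app/.next/static ./.next/static
EXPOSE 3000
CMD ["node", "server.js"]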
Step 4: Docker Compose Setup
Here’s a simple docker-compose.yml:
services:
  chat-ui-agent:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "3000:3000"
    environment:
      OPENAI_API_KEY: ${OPENAI_API_KEY}
Compose substitutes ${OPENAI_API_KEY} from your shell environment (or from a .env file next to docker-compose.yml), so the key reaches the container at run time without ever being baked into the image.
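One caveat: Compose only auto-loads a file named .env for that substitution, not the .env.local that Next.js reads in development. To feed the container the same file, point env_file at it explicitly:

services:
  chat-ui-agent:
    env_file:
      - .env.local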
Step 5: Build and Run
Spin up your chatbot:
docker-compose up --build
Open http://localhost:3000, and your Chat UI Agent is ready to roll.
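If the page doesn’t come up, two quick checks (shown with the Compose v2 docker compose syntax; the hyphenated docker-compose behaves the same):

docker compose ps                        # the container should be Up with port 3000 mapped
docker compose logs -f chat-ui-agent     # watch for Next.js startup errors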
Why the standalone Dockerfile matters
Using the standalone Next.js Dockerfile instead of a basic one gives you real advantages:
- Production-grade: Optimized builds, smaller image sizes, faster deploys.
- Best practices baked in: No need to reinvent Docker configs.
- Portable: Same setup runs on local dev, staging, or production servers.
This is the kind of Dockerfile you’d actually ship to production, not just test locally.
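One prerequisite worth calling out: the standalone output this Dockerfile copies only exists if it’s enabled in the Next.js config (next.config.js here; the .mjs or .ts variants work the same way):

// next.config.js
module.exports = {
  output: 'standalone',
};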
Final Thoughts
With the Next.js AI Chatbot template, the Vercel AI SDK, and a production-ready Dockerfile, spinning up a Chat UI Agent is not just quick; it’s deployment-ready from day one.
If you want to move fast without cutting corners, this setup strikes the perfect balance: modern frameworks, clean developer experience, and a solid production pipeline.