Docker Model Runner
-
Dec 11, 2025
Docker Model Runner now supports vLLM on Windows
Run vLLM with GPU acceleration on Windows using Docker Model Runner and WSL2. Fast AI inference is here.
-
Dec 5, 2025
Announcing vLLM v0.12.0, Ministral 3 and DeepSeek-V3.2 for Docker Model Runner
Run Ministral 3 and DeepSeek-V3.2 on Docker Model Runner with vLLM 0.12. Test-drive the latest open-weights models as soon as they’re released.
-
Dec 1, 2025
Run Embedding Models and Unlock Semantic Search with Docker Model Runner
In this guide, we’ll cover how to use embedding models for semantic search and how to run them with Docker Model Runner.
-
Nov 20, 2025
Docker Model Runner Integrates vLLM for High-Throughput Inference
New: vLLM in Docker Model Runner. High-throughput inference for safetensors models with auto engine routing for NVIDIA GPUs using Docker.
-
Nov 18, 2025
Docker + Unsloth: Build Custom Models, Faster
Building and Running Custom Models Is Still Hard. Running AI models locally is still hard. Even as open-source LLMs grow more capable, actually getting them to run on your machine, with the right dependencies, remains slow, fragile, and inconsistent. There are two sides to this challenge: model creation and optimization, making fine-tuning and quantization efficient; and model…
-
Nov 3, 2025
How to Use Multimodal AI Models With Docker Model Runner
Run multimodal AI models that understand text, images, and audio with Docker Model Runner. Explore CLI and API examples, run Hugging Face models, and try a real-time webcam vision demo.
-
Oct 31, 2025
Mr. Bones: A Pirate-Voiced Halloween Chatbot Powered by Docker Model Runner
How to turn a Home Depot skeleton into a live, pirate-voiced chatbot using a local LLM and Docker Model Runner—fast, low-cost, and fully in character.
-
Oct 21, 2025
Introducing a Richer "docker model run" Experience
New interactive prompt for docker model run: readline-style editing, command history, multi-line input, and Ctrl+C to stop responses. Try it today!