AI/ML
-
Llama.cpp Gets an Upgrade: Resumable Model Downloads
llama.cpp now supports resumable GGUF downloads. Learn how Docker Model Runner makes models versioned, shareable, and OCI-native for seamless dev-to-prod workflows.
-
Fine-Tuning Local Models with Docker Offload and Unsloth
Learn how to fine-tune models locally with Docker Offload and Unsloth and how smaller models can become practical assistants for real-world problems.
-
The Trust Paradox: When Your AI Gets Catfished
Learn how MCP prompt injection exploits trusted tools, and how to defend with context isolation, AI behavior checks, and human-in-the-loop review.
-
Run, Test, and Evaluate Models and MCP Locally with Docker + Promptfoo
Learn how Promptfoo and Docker help developers compare models, evaluate MCP servers, and even perform LLM red-teaming.
-
Beyond Containers: llama.cpp Now Pulls GGUF Models Directly from Docker Hub
Learn how llama.cpp is using Docker Hub as a powerful, versioned, and centralized repository for your AI models.
-
Build and Distribute AI Agents and Workflows with cagent
cagent is a new open-source project from Docker that makes it simple to build, run, and share AI agents without writing a single line of code. Instead of wrangling Python versions and dependencies, you define your agent’s behavior, tools, and persona in a single YAML file, making it…
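As a rough sketch of the YAML-first approach cagent takes, a single-file agent definition might look something like the following. The field names, model reference, and MCP wiring here are illustrative assumptions, not cagent’s exact schema:

```yaml
# Hypothetical cagent-style agent definition (field names are assumptions).
agents:
  root:
    model: openai/gpt-4o            # which model backs the agent
    description: Release-notes assistant
    instruction: |
      Summarize merged pull requests into concise, user-facing release notes.
    toolsets:
      - type: mcp                   # tools exposed via an MCP server
        command: docker
        args: ["mcp", "gateway", "run"]
```

You would then presumably launch the agent with something like `cagent run agent.yaml`; see the cagent repository for the actual schema and CLI.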
-
Docker Model Runner General Availability
Docker Model Runner offers a new way for developers to manage, run, and share local AI models with cutting-edge features and more on the way.
-
MCP Security: A Developer’s Guide
MCP security refers to the controls and risks that govern how agents discover, connect to, and invoke tools exposed by MCP servers.