Docker Blog
- Accelerate modernization and cloud migration
  In our recent report, we show that many enterprises today face a stark reality: despite years of digital transformation efforts, the majority of enterprise workloads, up to 80%, still run on legacy systems. This lag in modernization not only increases operational costs and security risks but also limits the agility needed to compete in a rapidly evolving…
- Retiring Docker Content Trust
  Docker Content Trust is being deprecated. Learn what this means for Docker Official Image (DOI) pulls and how to prepare for new image signing solutions like Sigstore.
- Beyond the Chatbot: Event-Driven Agents in Action
  Learn how to build an event-driven agentic application, without needing a chat interface, by leveraging Docker MCP Server and Mastra!
- Docker MCP Catalog: Finding the Right AI Tools for Your Project
  Learn what MCP is and how to find the right AI developer tools with the Docker MCP Catalog.
- Compose Editing Evolved: Schema-Driven and Context-Aware
  Every day, thousands of developers create and edit Compose files. At Docker, we are regularly adding new features to Docker Compose, such as the provider services capability that lets you run AI models as part of your multi-container applications with Docker Model Runner (see the sketch after this list). We know that providing a first-class editing experience for Compose…
- Docker Unveils the Future of Agentic Apps at WeAreDevelopers
  Let’s unpack our major announcements at WeAreDevelopers and how Docker makes it easier to build, ship, and run this next generation of AI-native software.
- GoFiber v3 + Testcontainers: Production-like Local Dev with Air
  Simplify local Go development with Fiber v3’s new Services API and Testcontainers. Run real databases in sync with your app: faster loops, fewer hacks.
- Powering Local AI Together: Docker Model Runner on Hugging Face
  Developers can use Docker Model Runner as the local inference engine for running models, and filter Hugging Face for models that Model Runner supports!
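
The Compose editing post above mentions the provider services capability for running AI models with Docker Model Runner. Below is a minimal sketch of what such a Compose file might look like, assuming a provider element with a model type and an options.model field; the service name, app image, and model reference are placeholders for illustration, not taken from the post.

    services:
      chat:
        # Hypothetical application container that talks to the model endpoint.
        image: my-chat-app
        depends_on:
          - llm

      llm:
        # Provider service: instead of starting a regular container image,
        # Compose delegates this service to Docker Model Runner.
        provider:
          type: model
          options:
            model: ai/smollm2   # placeholder model reference

With a layout like this, the model runs alongside the other services of the application, so bringing the stack up and tearing it down stays a single Compose workflow.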