Docker CISO Mark Lechner shares his vision of a future in which Docker not only supports the software supply chain but actively defends it.
Cybersecurity has reached a turning point. The most significant threats no longer exploit isolated systems; they move through the connections between them. The modern attack surface includes every dependency, every container, and every human interaction that connects them.
This interconnected reality is what drew me to Docker.
Over the past decade, I’ve defended banks, fintechs, crypto exchanges, and AI startups against increasingly sophisticated adversaries. Each showed how fragile trust becomes when a software supply chain spans thousands of components.
A significant portion of the world’s software now runs through Docker Hub. Containers have become the default unit of compute. And AI workloads are multiplying both innovation and risk at unprecedented speed.
This is a rare moment, one where getting security right at the foundation can change how the entire industry builds and deploys software.
Lessons from a decade on the supply chain frontline
The environments I worked in may seem unrelated (finance, fintech, crypto, AI) but together they trace how the software supply chain evolved and how security evolved with it.
During my time at neobanks and fintechs, control defined security. We protected finite, closed systems where every dependency was known and internally managed. It was a world built on ownership and predictability. But a transition was underway, and the internal walls between teams were being pulled down. Banking-as-a-Service meant inviting outside developers into what had always been a sealed environment. Suddenly, trust was not inherited; it had to be proven. That experience crystallized the idea that transparency and verifiability must replace assumptions.
Crypto transformed that lesson into urgency. In that world, the perimeter disappeared entirely. Dependencies, registries, and APIs became active battlefields, often targeted by nation-state actors. The pace of attack compressed from months to minutes.
The Shai Hulud worm that hit npm in September 2025 captures this new reality. It began with a single phishing email spoofing an npm alert. One compromised developer credential became a self-replicating worm that spread across 600+ package versions. The malware didn’t just steal tokens; it automated its own propagation, creating malicious GitHub Actions workflows, publishing private repositories, and moving laterally through the entire ecosystem at CI/CD speed.
Social engineering provided the entry point, and crucially, supply chain automation did the rest.
It was no longer enough to be secure; you had to be provably secure and capable of near-instant remediation.
AI has amplified that acceleration even further. Model supply chains, LLM agents, and the Model Context Protocol (MCP) have introduced entire new layers of exposure: model provenance, data lineage, and automated code generation at massive scale. Security practices are still catching up to the rate of change.
Across all these environments, one constant remained: everything ran in containers. Whether it was a financial risk engine, a crypto trading service, or an AI inference model, it was containerized.
That’s when it became clear to me that Docker isn’t simply part of the supply chain. Docker is the connective layer of modern software itself.
Why Docker is the right platform for this moment
There are three reasons why this moment matters for Docker and for security as a discipline:
Ubiquity with accountability
Every developer interacts with Docker. That ubiquity brings responsibility on a global scale. If Docker strengthens its security foundation, every connected system benefits. If we fall short, the consequences ripple worldwide. That scale is what makes this mission meaningful.
Our role extends beyond individual products. As steward of the container ecosystem, we have a responsibility to make it secure by default. That means setting clear expectations for how software is published, shared, and verified across Docker Hub and the Engine. Imagine a world where every image carries an SBOM and signed provenance by default, where digital signatures are standard, and where organizations can see and control the open source in their supply chain. The container ecosystem has matured, and Docker’s job now is to secure it for the next decade.
Security as a primitive
Virtualization, isolation, and portability are not just features; they are the security primitives of modern computing. Docker is embedding those primitives directly into the developer workflow.
This is reflected in Docker Hardened Images: secure, minimal containers with verifiable provenance and complete SBOMs that help organizations control supply chain risk. Through continuous review we scan, rebuild, and remediate these images at scale, raising the security baseline for the entire open-source ecosystem. Docker Scout complements that process by turning transparency into action, helping teams understand risk context and prioritize what matters most.
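The practical value of a complete SBOM is that it can be queried programmatically rather than read by hand. As a minimal sketch of that idea (not Docker Scout's actual implementation), the snippet below checks components in a CycloneDX-style SBOM against an internal advisory list; the SBOM contents and the advisory entries are illustrative only:

```python
# Minimal sketch: triage a CycloneDX-style SBOM against an advisory list.
# The SBOM dict and the advisories below are illustrative, not real data.

sbom = {
    "components": [
        {"name": "openssl", "version": "3.0.7", "purl": "pkg:generic/openssl@3.0.7"},
        {"name": "lodash", "version": "4.17.20", "purl": "pkg:npm/lodash@4.17.20"},
        {"name": "zlib", "version": "1.3.1", "purl": "pkg:generic/zlib@1.3.1"},
    ]
}

# Hypothetical internal advisory list: (name, vulnerable version) pairs.
advisories = {("lodash", "4.17.20"), ("openssl", "3.0.7")}

def flag_components(sbom: dict, advisories: set) -> list:
    """Return the purls of SBOM components matching a known advisory."""
    return [
        c["purl"]
        for c in sbom.get("components", [])
        if (c["name"], c["version"]) in advisories
    ]

flagged = flag_components(sbom, advisories)
print(flagged)  # purls of the two vulnerable components
```

A real pipeline would pull the SBOM from the image's attestations and match against a vulnerability database, but the shape of the query is the same: structured transparency makes risk searchable.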
Christian Dupuis, lead engineer for Docker Hardened Images, defines the foundation for how Docker builds trust in his recent blog: minimal attack surface, verifiable SBOMs, secure build provenance, exploitability context, and cryptographic verification. Docker Hardened Images bring those pillars to life at scale.
Security is not confined to containers alone. The MCP Gateway enables containerized AI-tool orchestration with isolation, unified control, and observability, extending the same secure container foundation into the AI era. By embedding policy as code into development, CI/CD, and runtime pipelines, governance becomes inherent: the same containers you trust also enforce the rules you need.
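To make "policy as code" concrete, here is a minimal sketch of the pattern: declarative rules evaluated against image metadata at admission time. The rule names and metadata fields are hypothetical, not a real Docker or MCP Gateway API:

```python
# Minimal sketch of policy as code: rules expressed as data, evaluated
# uniformly. Field names and rules are hypothetical examples.

def evaluate(image: dict, rules: list) -> list:
    """Run each (name, check) rule; return the names of rules that fail."""
    return [name for name, check in rules if not check(image)]

rules = [
    ("must-not-run-as-root", lambda img: img.get("user") not in (None, "root", "0")),
    ("must-carry-sbom", lambda img: img.get("sbom_attached", False)),
    ("must-have-signed-provenance", lambda img: img.get("provenance_signed", False)),
]

image = {"user": "app", "sbom_attached": True, "provenance_signed": False}

violations = evaluate(image, rules)
print(violations)  # only the provenance rule fails for this image
```

Because the rules are data, the same set can run in a pre-commit hook, a CI gate, and a runtime admission controller, which is what lets governance travel with the work instead of living in a separate review queue.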
Together, these secure-by-default investments make security self-reinforcing, automated, and aligned with developer speed.
AI as the next frontier in the supply chain
AI workloads are being containerized by default. As teams adopt MCP-based architectures and integrate AI agents into workflows, Docker’s role expands from developer enablement to securing AI infrastructure itself.
Everything we have built through Docker Hardened Images and Scout in the container domain now becomes foundational for this next chapter. The same principles of transparency, provenance, and continuous review will unlock a secure supply chain for AI workloads. Our goal is to provide a platform that scales with this new velocity, enabling innovation while keeping the risks contained.
My vision: From trust to proof
In thinking about the Docker opportunity, I kept returning to one phrase: Trust is not a control.
That is the essence of our approach here. In a modern software supply chain, you cannot simply trust components; you must prove their integrity. The future of security is built on proof: transparent, cryptographically verifiable, and automated.
Docker’s mission is to make that proof accessible to every developer and every organization, without slowing them down.
Here’s what that means in practice:
- Every component should carry its own origin story. Provenance must be verifiable, traceable, and inseparable from the artifact itself. When the history of a component is transparent, trust becomes evidence, not assumption.
- Transparency must be complete, not performative. An SBOM is more than a compliance record; it is a living map of dependencies that reveals how trust flows through a system.
- Policy belongs in the pipeline. When governance is expressed as code, it becomes repeatable and portable, scaling from local development to production without friction. This approach lets each organization apply controls where they fit best, from pre-commit hooks and CI templates to runtime admission checks, so developers can move quickly within guardrails that stay with their work.
- As AI reshapes development, isolation becomes the new perimeter. The ability to experiment safely, within bounded and observable environments, will define whether innovation can remain secure at scale.
These are the building blocks of a provable, scalable security model, one that developers can trust and auditors can verify.
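The first of those building blocks, provenance that is inseparable from the artifact, reduces at its core to a verifiable digest. As a minimal sketch (signature checking of the attestation itself is out of scope here), this shows the step where trust becomes evidence: recomputing an artifact's digest and comparing it to the digest its provenance records:

```python
import hashlib

# Minimal sketch: compare an artifact's recomputed digest against the
# digest recorded in its (already signature-verified) provenance record.
# The artifact bytes below are illustrative.

def artifact_digest(data: bytes) -> str:
    """Content-address the artifact, in the familiar sha256:<hex> form."""
    return "sha256:" + hashlib.sha256(data).hexdigest()

def matches_provenance(data: bytes, attested_digest: str) -> bool:
    """True only if the artifact bytes hash to the attested digest."""
    return artifact_digest(data) == attested_digest

artifact = b"example image layer bytes"
attested = artifact_digest(artifact)  # what a real attestation would record

print(matches_provenance(artifact, attested))         # unmodified artifact
print(matches_provenance(artifact + b"x", attested))  # tampered artifact
```

Any modification to the artifact, however small, changes the digest and breaks the match, which is why provenance bound to content can be verified mechanically instead of taken on faith.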
Security should not slow development down. It should enable velocity by removing uncertainty. When the system itself provides proof, developers can build with confidence and organizations can deploy with clarity.
Building the standard for software trust
Eighteen months from now, I want “secure by Docker” to be a recognized assurance.
When enterprises evaluate where to build their most sensitive workloads, Docker’s supply chain posture should be a differentiator, not a checkbox.
Docker Hardened Images will continue to evolve as the industry’s most transparent, source-built container foundation. Docker Scout will deepen visibility and context across dependencies. And our work on policy automation and AI sandboxing will extend those same assurances into new domains.
These aren’t incremental improvements. They are a shift toward verifiable, systemic security; security that is built in, measurable, and accessible to every developer.
If you are navigating supply chain risk, start with Docker Scout. If you want a trusted foundation, use Docker Hardened Images. And if you want to work on the problems that will define the next decade of software integrity, join us.
The world’s software supply chain runs through Docker.
Our mission is to ensure it is secured by Docker too.