Docker + Unsloth: Build Custom Models, Faster

Posted on November 18, 2025

Building and Running Custom Models Is Still Hard

Running AI models locally is still hard. Even as open-source LLMs grow more capable, actually getting them to run on your machine, with the right dependencies, remains slow, fragile, and inconsistent.

There are two sides to this challenge:

  • Model creation and optimization: making fine-tuning and quantization efficient.
  • Model execution and portability: making models reproducible, isolated, and universal.

Solving both lets developers actually use the models they build.

Docker + Unsloth: Making Iterating on Custom Models Faster

Many developers want to move from consuming an API to owning the model. They want to fine-tune models for their own use cases, but doing so remains hard.

We’re excited to be working together with Unsloth to make building, iterating, and running custom LLMs locally faster, simpler, and more accessible for every developer.

Unsloth lowers the barrier to building (and exporting) fine-tuned custom models. Docker lowers the barrier to running them anywhere.

You can now run any model, including Unsloth Dynamic GGUFs, on Mac, Windows, or Linux with Docker Model Runner. Together, they remove the friction between experimentation and execution: the dependency issues and reproducibility gaps that usually get in the way.

With Docker Model Runner (DMR), starting a model is as simple as docker model run. For example, running OpenAI’s open-weight model locally becomes incredibly easy:

docker model run ai/gpt-oss:20B
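
Once a model is running, Docker Model Runner also exposes an OpenAI-compatible API, so you can call it from code. Below is a minimal sketch using the openai Python client; it assumes DMR's host-side TCP access is enabled on its default port (12434) and that the ai/gpt-oss:20B model has already been pulled. Adjust the base URL if your setup differs.

# Minimal sketch: calling a model served by Docker Model Runner through its
# OpenAI-compatible API. Assumes host-side TCP access on the default port
# (12434) and that ai/gpt-oss:20B has been pulled with `docker model run`.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:12434/engines/v1",  # assumed default DMR endpoint
    api_key="not-needed",  # the local runner does not require a real API key
)

response = client.chat.completions.create(
    model="ai/gpt-oss:20B",
    messages=[{"role": "user", "content": "Summarize what a GGUF file is in one sentence."}],
)
print(response.choices[0].message.content)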

How It Works

  1. Fine-tune with Unsloth. Train and optimize your model efficiently.
  2. Export to GGUF. Quantize to a lightweight, portable format for fast local inference.
  3. Run with Docker. Launch instantly with docker model run. No manual setup.

Unsloth’s Dynamic GGUFs help you create compact fine-tuned models. Docker Model Runner lets you spin them up instantly and run them as easily as containers, without worrying about dependency issues.
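
To make the three steps above concrete, here is a minimal sketch of the fine-tune-and-export side, following the pattern in Unsloth's documentation. The base model, toy dataset, and hyperparameters are illustrative placeholders, and the exact trainer arguments vary with your trl version.

# Minimal sketch: fine-tune with Unsloth, then export a quantized GGUF.
# Model name, dataset, and hyperparameters below are illustrative only.
from unsloth import FastLanguageModel
from datasets import Dataset
from trl import SFTTrainer
from transformers import TrainingArguments

# 1. Load a base model in 4-bit and attach LoRA adapters for efficient fine-tuning.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Llama-3.2-3B-Instruct",  # illustrative base model
    max_seq_length=2048,
    load_in_4bit=True,
)
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# 2. Fine-tune on your own data (a toy in-memory dataset here).
dataset = Dataset.from_dict({"text": ["### Question: ...\n### Answer: ..."]})
trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        max_steps=60,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()

# 3. Export a quantized GGUF that Docker Model Runner can serve.
model.save_pretrained_gguf("my-model", tokenizer, quantization_method="q4_k_m")

From there, the exported GGUF is ready for step 3: publish or package it, then start it locally with docker model run, just like the example above.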

What's Next

Building and running AI should feel as natural as developing and shipping code. Just like containers standardized application deployment, we’re now doing the same for AI.

Unsloth + Docker marks one more step in that journey. Learn more in the docs.
