Model Distillation

Training a small model to mimic a large one

Definition

Model distillation is a technique for creating smaller, faster models by training them to mimic the behavior of a larger "teacher" model. Rather than learning only from raw training data with hard labels, the smaller "student" model learns from the teacher's outputs, typically its full probability distributions ("soft labels"), which carry more information than the correct answer alone. This allows the student to capture much of the teacher's knowledge at a fraction of the size and inference cost. Many efficient models available via API are distilled versions of larger models.
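
To make this concrete, here is a minimal sketch of the classic distillation objective (Hinton et al., 2015), assuming PyTorch; the temperature and weighting values are illustrative defaults, not prescribed by this entry:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend a soft-label loss (match the teacher) with a hard-label loss."""
    # Soften both output distributions with a temperature so the teacher's
    # relative confidence across wrong answers ("dark knowledge") is visible.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between student and teacher distributions, scaled by T^2
    # to keep gradient magnitudes comparable across temperatures.
    soft_loss = F.kl_div(soft_student, soft_targets,
                         reduction="batchmean") * temperature ** 2
    # Ordinary cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

In a training loop, the teacher runs in inference mode to produce `teacher_logits` for each batch, and only the student's parameters are updated against this combined loss.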
