AI Definitions: Knowledge distillation

Knowledge distillation (KD) - A machine learning technique that transfers the knowledge of a large pre-trained "teacher" model to a smaller "student" model. The student is trained to mimic the teacher's predictions, typically its softened output probabilities, rather than only the hard labels. The resulting smaller model is faster and more efficient, making it better suited to real-time inference, and its simpler structure can be easier to interpret. KD is widely used in deep learning, particularly to compress massive deep neural networks.
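
A minimal sketch of what this looks like in practice, assuming PyTorch. The teacher and student networks, layer sizes, temperature, and alpha weighting below are illustrative assumptions, not part of the definition above; the core idea is the combined loss, which blends the student's match to the teacher's softened outputs with ordinary cross-entropy on the true labels.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical models: a large pre-trained teacher and a much smaller student.
teacher = nn.Sequential(nn.Linear(784, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a 'soft' loss (mimic the teacher's softened predictions)
    with the usual 'hard' cross-entropy loss on the true labels."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        soft_targets,
        reduction="batchmean",
    ) * temperature ** 2  # rescale so gradients stay comparable across temperatures
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# One training step on a dummy batch.
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
x = torch.randn(32, 784)               # dummy inputs
labels = torch.randint(0, 10, (32,))   # dummy labels
with torch.no_grad():                  # the teacher is frozen during distillation
    teacher_logits = teacher(x)
loss = distillation_loss(student(x), teacher_logits, labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The temperature softens both probability distributions so the student can learn from the teacher's relative confidence across all classes, not just its top prediction.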

More AI definitions here.