Artificial Intelligence Deep Learning Objective
Aug 16, 2025

What is knowledge distillation?

Choose the correct answer:
A) Transferring knowledge from large to small models
B) Extracting insights from data
C) Purifying training datasets
D) Compressing model weights
Detailed Explanation
The correct answer is A. Knowledge distillation transfers knowledge from a large, high-capacity teacher model to a smaller student model. The student is trained to match the teacher's softened output probabilities (and sometimes its intermediate representations) in addition to the ground-truth labels, so it keeps much of the teacher's accuracy at a fraction of the size and inference cost. Option D describes weight compression techniques such as pruning or quantization, which operate on a single model rather than transferring knowledge between two models.
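For intuition, here is a minimal sketch of the common soft-target distillation loss, written in plain Python with NumPy. The temperature, mixing weight alpha, logits, and function names are illustrative assumptions, not something specified in the question.

```python
# Sketch of soft-target knowledge distillation (teacher -> student), NumPy only.
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher temperatures give softer distributions."""
    z = logits / temperature
    z = z - z.max()                      # subtract max for numerical stability
    exp_z = np.exp(z)
    return exp_z / exp_z.sum()

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.5):
    """Weighted sum of the soft-target loss (teacher vs. student) and the hard-label loss."""
    # Soft targets: cross-entropy between the teacher's and student's softened
    # distributions, scaled by T^2 so gradients keep a comparable magnitude.
    p_teacher = softmax(teacher_logits, temperature)
    log_p_student = np.log(softmax(student_logits, temperature))
    soft_loss = -np.sum(p_teacher * log_p_student) * temperature ** 2

    # Hard targets: ordinary cross-entropy against the ground-truth label at T = 1.
    hard_loss = -np.log(softmax(student_logits)[true_label])

    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Example: a 5-class problem where the teacher is confident about class 2.
teacher_logits = np.array([1.0, 2.0, 8.0, 0.5, -1.0])
student_logits = np.array([0.5, 1.0, 3.0, 0.2, -0.5])
print(distillation_loss(student_logits, teacher_logits, true_label=2))
```

In practice this loss is minimized with respect to the student's parameters during training; the temperature and alpha are tuned so the student balances imitating the teacher's soft probabilities against fitting the true labels.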