Data Science & Analytics — Subjective
Oct 14, 2025

Explain the bias-variance tradeoff in machine learning.

Detailed Explanation
The bias-variance tradeoff describes how model complexity splits generalization error into competing components.

• Bias: error from oversimplified assumptions (underfitting)
• Variance: error from sensitivity to training-data fluctuations (overfitting)
• Total Error = Bias² + Variance + Irreducible Error
• Tradeoff: reducing bias often increases variance, and vice versa

Example: Linear regression has high bias (it assumes a linear relationship) but low variance (stable predictions across training sets). Random forests have lower bias (they capture non-linear patterns) but higher variance (they are more sensitive to the particular training data). Use cross-validation to find the optimal complexity.
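The decomposition above can be checked numerically. The sketch below (pure Python, with an assumed toy target sin(2πx) and Gaussian noise) draws many training sets, fits two deliberately extreme models, a constant predictor (high bias) and a 1-nearest-neighbor predictor (high variance), and estimates bias² and variance of each at a fixed test point:

```python
import math
import random

random.seed(0)

def true_f(x):
    # Assumed ground-truth function for this illustration
    return math.sin(2 * math.pi * x)

def make_train(n=20, noise=0.3):
    xs = [random.random() for _ in range(n)]
    ys = [true_f(x) + random.gauss(0, noise) for x in xs]
    return xs, ys

def predict_constant(xs, ys, x0):
    # High-bias model: ignores x entirely, always predicts the label mean
    return sum(ys) / len(ys)

def predict_1nn(xs, ys, x0):
    # High-variance model: copies the label of the nearest training point
    i = min(range(len(xs)), key=lambda j: abs(xs[j] - x0))
    return ys[i]

def bias_variance(predict, x0=0.25, trials=2000):
    # Estimate bias^2 and variance at x0 over many resampled training sets
    preds = []
    for _ in range(trials):
        xs, ys = make_train()
        preds.append(predict(xs, ys, x0))
    mean_pred = sum(preds) / len(preds)
    bias2 = (mean_pred - true_f(x0)) ** 2
    var = sum((p - mean_pred) ** 2 for p in preds) / len(preds)
    return bias2, var

b_const, v_const = bias_variance(predict_constant)
b_1nn, v_1nn = bias_variance(predict_1nn)
print(f"constant model: bias^2={b_const:.3f}, variance={v_const:.3f}")
print(f"1-NN model:     bias^2={b_1nn:.3f}, variance={v_1nn:.3f}")
```

The constant model shows large bias² and small variance; 1-NN shows the reverse, which is the tradeoff in miniature. Real model selection sits between these extremes, which is why cross-validation is used to locate the complexity that minimizes their sum.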