ML Digest: Bias-Variance Trade-Off

AI/Data Science Digest
1 min read · Sep 24, 2022


Bias-Variance Tradeoff in one slide

Overfitting

If your model overfits, it has high variance: it picks up the noise in the training dataset and does not generalize to unseen test data.

Symptoms:

  • Training error is much lower than testing error
  • Training error is lower than the expected error

Remedies:

  • Add more training data
  • Use simpler models
  • Use bagging
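The overfitting symptoms above can be reproduced with a small synthetic sketch (the sine-curve data, polynomial degrees, and NumPy `polyfit` are my illustrative choices, not from the digest): a very flexible model drives training error near zero while test error stays high, and the simpler-model remedy narrows that gap.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples from an underlying sine curve (assumed toy problem)
x_train = rng.uniform(0, 1, 20)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.3, 20)
x_test = rng.uniform(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.3, 200)

def errors(degree):
    """Fit a least-squares polynomial and return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

# Flexible model: training error far below test error (overfitting symptom)
tr_hi, te_hi = errors(15)

# Remedy: a simpler model keeps train and test error closer together
tr_lo, te_lo = errors(3)
```

Printing the four errors shows the characteristic pattern: the degree-15 fit nearly interpolates the 20 training points, while its test error exceeds that of the cubic fit.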

Underfitting

If your model underfits, it has high bias (e.g., fitting a very simple linear model to complex data). The model fails to capture the patterns in the training dataset.

Symptoms:

  • Training error is higher than expected (often testing error is high as well)

Remedies:

  • Use more complex models
  • Use more features
  • Use boosting
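The high-bias symptom is just as easy to demonstrate (again with an illustrative sine target and NumPy `polyfit`, my own choices): a straight line cannot capture a nonlinear pattern, so even its training error stays high, and the more-complex-model remedy brings it down.

```python
import numpy as np

rng = np.random.default_rng(1)

# Nonlinear target: a straight line cannot capture a full sine period
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, 200)

def train_mse(degree):
    """Training MSE of a least-squares polynomial fit."""
    coeffs = np.polyfit(x, y, degree)
    return np.mean((np.polyval(coeffs, x) - y) ** 2)

linear_err = train_mse(1)  # underfits: high training error (high bias)
cubic_err = train_mse(3)   # remedy: a more complex model fits the pattern
```

Here the linear model's training error is dominated by the sine structure it cannot represent, whereas the cubic fit's residual error is close to the injected noise level.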

Bias-Variance Tradeoff

Bias and variance trade off against each other: reducing one tends to increase the other, so you need to strike a good balance between them in order to minimize the overall error of your model.
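The tradeoff follows from the standard decomposition, expected error = bias² + variance + irreducible noise. A sketch of estimating those two terms empirically (the sine target, evaluation point, and resampling scheme are my illustrative assumptions): refit each model on many resampled training sets and look at how the predictions at one point scatter around the truth.

```python
import numpy as np

rng = np.random.default_rng(2)

def true_f(x):
    return np.sin(2 * np.pi * x)

X0 = 0.25  # point at which we estimate bias and variance (true value = 1)

def bias_variance(degree, n_trials=200, n_samples=30):
    """Estimate squared bias and variance of a polynomial fit at X0
    by refitting on many independently drawn noisy training sets."""
    preds = []
    for _ in range(n_trials):
        x = rng.uniform(0, 1, n_samples)
        y = true_f(x) + rng.normal(0, 0.3, n_samples)
        coeffs = np.polyfit(x, y, degree)
        preds.append(np.polyval(coeffs, X0))
    preds = np.array(preds)
    bias_sq = (preds.mean() - true_f(X0)) ** 2  # systematic miss
    variance = preds.var()                      # spread across datasets
    return bias_sq, variance

b_lin, v_lin = bias_variance(1)    # simple model: high bias, low variance
b_poly, v_poly = bias_variance(9)  # complex model: low bias, high variance
```

The simple model misses the target in the same direction on every resample (bias), while the complex model tracks the target on average but its prediction swings dataset to dataset (variance); the total error is minimized somewhere in between.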

That’s it. An ML Digest to refresh an old ML concept or learn a new one.
