Bias Variance Trade-off


A good handshake between Bias and Variance. — Tapan

Hi there, I'm Tapan. I'll be sharing my knowledge of the bias-variance trade-off here, a term you've probably already come across while learning machine learning.

What is this, and when does it come into the picture?

Bias and variance come into the picture once the model has been trained. They are also known as prediction errors. Understanding these prediction errors can make your model more accurate and help you avoid overfitting and underfitting.
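For squared-error loss, this prediction error has a standard textbook decomposition (stated here for reference; the article itself does not spell it out):

$$\mathbb{E}\big[(y - \hat{f}(x))^2\big] = \text{Bias}\big[\hat{f}(x)\big]^2 + \text{Var}\big[\hat{f}(x)\big] + \sigma^2$$

The last term, $\sigma^2$, is the irreducible error coming from noise in the data itself, so the trade-off is really between the first two terms.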

A simpler way to understand it

I will use some references from day-to-day life, or from books, to make the concept easier to understand.

Imagine four students appearing for an exam:

  1. The first student has prepared well and answers the questions correctly and consistently, i.e. Low bias and Low variance: the ideal case.
  2. The second student attempts all the questions but is not sure about the answers, a kind of over-committing, i.e. Low bias and High variance, otherwise called overfitting.
  3. The third student just guesses the answers and fills in the answer sheet. He writes as many answers as he can, hoping for a better chance of scoring marks, but since he doesn't know the answers his errors are large, i.e. High bias and Low variance, otherwise called underfitting.
  4. The fourth student has no clue about the exam and is appearing just for the sake of it, i.e. High bias and High variance.

What models tend to where?

As a rule of thumb, simple linear algorithms such as linear and logistic regression tend toward high bias and low variance, while flexible non-linear algorithms such as decision trees and k-nearest neighbours tend toward low bias and high variance.

What are overfitting and underfitting?

Overfitting means the model has learned the training data too closely, noise included: it scores well on the training set but poorly on unseen data (low bias, high variance). Underfitting means the model is too simple to capture the underlying pattern: it scores poorly on both the training set and unseen data (high bias, low variance).
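Here is a minimal sketch in plain NumPy that shows the same trade-off in code. The toy sine data and the degree choices are my own illustration, not from the article: a degree-1 polynomial underfits (high bias), while a degree-15 polynomial chases the noise (high variance), so its training error collapses while its test error typically stays high.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: noisy samples from a sine curve
x_train = np.sort(rng.uniform(0, 1, 20))
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, 20)
x_test = np.sort(rng.uniform(0, 1, 20))
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.2, 20)

def fit_errors(degree):
    # Fit a polynomial of the given degree by least squares and
    # return (training MSE, test MSE)
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

for d in (1, 3, 15):
    tr, te = fit_errors(d)
    print(f"degree {d:2d}: train MSE {tr:.3f}, test MSE {te:.3f}")
```

Running this, the training error shrinks steadily as the degree grows, but the test error is smallest at a moderate degree: exactly the handshake between bias and variance the article describes.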

Conclusion

So we should always aim for a balance: a model with low enough bias to capture the pattern and low enough variance to generalise, keeping the total prediction error small.


Tapan Kumar Patro


📚 Machine learning | 🤖 Deep Learning | 👀 Computer vision | 🗣 Natural Language processing | 👂 Audio Data | 🖥 End to End Software Development | 🖌