Machine Learning: The Bias-Variance Tradeoff

#Artificial intelligence
Feb 4, 2022

What is the bias-variance tradeoff?

The bias-variance tradeoff is central whenever we assess the goodness of fit of a predictive model.

Let's start with the basics: both bias and variance are sources of prediction error.

The bias of a model is the difference between the model's average prediction and the correct value we are trying to predict. A model with high bias pays too little attention to the training data and oversimplifies the underlying relationship, which results in high error rates on both the training and the test data.
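Formally (this is the standard textbook definition, not spelled out in the original article), the bias of an estimator $\hat{f}$ at a point $x$, where $f$ is the true function, is the gap between the expected prediction and the truth:

$$\operatorname{Bias}\big[\hat{f}(x)\big] = \mathbb{E}\big[\hat{f}(x)\big] - f(x)$$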

The variance of a model describes how much its predictions for a given data point scatter when it is trained on different samples. A model with high variance fits the training data too closely and therefore generalizes poorly to new, unseen data: it predicts the training data very well, but its error rates on the test data are high.
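Analogously, the variance measures how far the predictions scatter around their own mean across training sets:

$$\operatorname{Var}\big[\hat{f}(x)\big] = \mathbb{E}\Big[\big(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\big)^{2}\Big]$$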

The bias-variance tradeoff means that bias and variance cannot be minimized at the same time. If a model is too simple, it has high bias and low variance; if it is too complex, it has low bias and high variance. It is therefore important to understand what bias and variance mean for the quality of a model and to strike a balance between the two.
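For squared-error loss, the two terms add up: the expected prediction error at $x$ decomposes into squared bias, variance, and irreducible noise $\sigma^{2}$ (again the standard result, stated here for context):

$$\mathbb{E}\Big[\big(y - \hat{f}(x)\big)^{2}\Big] = \operatorname{Bias}\big[\hat{f}(x)\big]^{2} + \operatorname{Var}\big[\hat{f}(x)\big] + \sigma^{2}$$

The tradeoff is easy to reproduce in a few lines. The following sketch (illustrative only; the data, polynomial degrees, and noise level are assumptions, not taken from the article) fits polynomials of increasing degree to noisy samples of a sine curve. The simplest model underfits (high training and test error, i.e. high bias), while the most complex one overfits (low training error, high test error, i.e. high variance):

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a smooth target function (chosen for this demo).
def f(x):
    return np.sin(2 * np.pi * x)

x_train = np.sort(rng.uniform(0, 1, 30))
x_test = np.sort(rng.uniform(0, 1, 30))
y_train = f(x_train) + rng.normal(0, 0.2, x_train.size)
y_test = f(x_test) + rng.normal(0, 0.2, x_test.size)

# Degree 1: too simple (high bias). Degree 15: too complex (high variance).
for degree in (1, 4, 15):
    coeffs = np.polyfit(x_train, y_train, degree)
    mse_train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    mse_test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE = {mse_train:.3f}, test MSE = {mse_test:.3f}")
```

In a typical run, the training error falls as the degree grows while the test error is U-shaped, which is exactly the balance the tradeoff asks us to find.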

The effects of bias and variance and their relationship are often visualized using a so-called bull's-eye diagram. In the diagram, the center of the target represents a model that perfectly predicts the correct values: predictions clustered tightly but away from the center indicate high bias, while predictions scattered widely around the target indicate high variance.

Diagram source (translated): Towards Data Science
