Bias and Variance
A training set is only a subset of the population of data. The bias-variance trade-off describes how predictions from the same algorithm behave when different subsets of the population are used as the training set.
Bias is the difference between the true value and the average prediction of models trained on different training sets.
Variance measures how much the prediction varies around its average when the training set changes.
Bias and variance are properties of a learning algorithm rather than of a single trained model.
Given a training set $D = \{(x_1, y_1), \dots, (x_n, y_n)\}$ drawn from the population, the algorithm produces a model $\hat{f}_D$.

For a sample $x$ with true value $f(x)$,

$$\mathrm{Bias}(x) = \mathbb{E}_D\big[\hat{f}_D(x)\big] - f(x), \qquad \mathrm{Var}(x) = \mathbb{E}_D\Big[\big(\hat{f}_D(x) - \mathbb{E}_D[\hat{f}_D(x)]\big)^2\Big].$$

Note that both measures are expectations over the choice of training set $D$, not over test samples.
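To make the definitions concrete, here is a minimal sketch that estimates bias and variance at a single test point by resampling training sets. The true function $f(x) = \sin(2\pi x)$ and polynomial least-squares fitting as the algorithm are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Hypothetical "true" function of the population (assumed for illustration).
    return np.sin(2 * np.pi * x)

def fit_and_predict(degree, x_train, y_train, x_test):
    # Train a polynomial model on one training set and predict at x_test.
    coeffs = np.polyfit(x_train, y_train, degree)
    return np.polyval(coeffs, x_test)

def bias_variance(degree, n_train=20, n_trials=500, x_test=0.3):
    # Draw many training sets from the population and record the prediction
    # of each trained model at the fixed test point x_test.
    preds = np.empty(n_trials)
    for t in range(n_trials):
        x_train = rng.uniform(0.0, 1.0, n_train)
        preds[t] = fit_and_predict(degree, x_train, f(x_train), x_test)
    bias = preds.mean() - f(x_test)   # E_D[f_D(x)] - f(x)
    var = preds.var()                 # E_D[(f_D(x) - E_D[f_D(x)])^2]
    return bias, var

for degree in (1, 3, 5):
    bias, var = bias_variance(degree)
    print(f"degree={degree}: bias={bias:+.4f}  variance={var:.4f}")
```

Both estimates are averages over repeated draws of the training set, matching the expectations over $D$ above.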
Bias-variance decomposition of the least-squares error

The least-squares error of the model $\hat{f}_D$ at a point $x$ with noise-free target $y = f(x)$, averaged over training sets, can be decomposed by adding and subtracting $\mathbb{E}_D[\hat{f}_D(x)]$:

$$\mathbb{E}_D\big[(y - \hat{f}_D(x))^2\big] = \big(y - \mathbb{E}_D[\hat{f}_D(x)]\big)^2 + \mathbb{E}_D\Big[\big(\hat{f}_D(x) - \mathbb{E}_D[\hat{f}_D(x)]\big)^2\Big],$$

where the cross term vanishes because $\mathbb{E}_D\big[\hat{f}_D(x) - \mathbb{E}_D[\hat{f}_D(x)]\big] = 0$.

Thus, for squared loss we have

$$\mathbb{E}_D\big[(y - \hat{f}_D(x))^2\big] = \mathrm{Bias}(x)^2 + \mathrm{Var}(x).$$
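A quick numerical check of this identity, under the same illustrative assumptions as above (a hypothetical true function, noise-free targets, polynomial least-squares fitting):

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # Hypothetical noise-free target function (assumed for illustration).
    return np.sin(2 * np.pi * x)

n_trials, n_train, degree, x_test = 2000, 20, 3, 0.3
preds = np.empty(n_trials)
for t in range(n_trials):
    x_train = rng.uniform(0.0, 1.0, n_train)
    preds[t] = np.polyval(np.polyfit(x_train, f(x_train), degree), x_test)

mse = np.mean((f(x_test) - preds) ** 2)       # E_D[(y - f_D(x))^2]
bias_sq = (preds.mean() - f(x_test)) ** 2     # Bias(x)^2
var = preds.var()                             # Var(x)
print(mse, bias_sq + var)  # the two values agree up to floating-point error
```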
Bias and Variance decomposition under uncertain measurements

Assume that there is some true function $f$, but we only observe noisy measurements $y = f(x) + \varepsilon$, where $\varepsilon$ has zero mean, variance $\sigma^2$, and is independent of the training set.

We use an algorithm to train a model $\hat{f}_D$ on a training set $D$. Averaging over both the training set and the measurement noise gives

$$\mathbb{E}_{D,\varepsilon}\big[(y - \hat{f}_D(x))^2\big] = \mathrm{Bias}(x)^2 + \mathrm{Var}(x) + \sigma^2,$$

where $\sigma^2$ is the irreducible error that no choice of algorithm can remove.
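The extra $\sigma^2$ term can be seen in the same kind of simulation once measurement noise is added to both the training targets and the test observation. This is again a sketch; the Gaussian noise with $\sigma = 0.3$ and the true function are assumed values for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 0.3  # measurement-noise standard deviation (an assumed value)

def f(x):
    # Hypothetical true function; we only ever observe y = f(x) + noise.
    return np.sin(2 * np.pi * x)

n_trials, n_train, degree, x_test = 5000, 20, 3, 0.3
preds = np.empty(n_trials)
for t in range(n_trials):
    x_train = rng.uniform(0.0, 1.0, n_train)
    y_train = f(x_train) + rng.normal(0.0, sigma, n_train)   # noisy training targets
    preds[t] = np.polyval(np.polyfit(x_train, y_train, degree), x_test)

# Independent noisy test observations y = f(x) + eps at the same test point.
y_test = f(x_test) + rng.normal(0.0, sigma, n_trials)

mse = np.mean((y_test - preds) ** 2)          # E_{D,eps}[(y - f_D(x))^2]
bias_sq = (preds.mean() - f(x_test)) ** 2     # Bias(x)^2
var = preds.var()                             # Var(x)
print(mse, bias_sq + var + sigma**2)          # approximately equal
```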