Consider bias as the error on the training data and variance as the error on the test data across different training samples.
Underfitting model:
High bias and low variance [if you fit a model that is too simple, most of the training data points won't be satisfied].
Overfitting model:
Low bias and high variance [if you fit a model such that most of the training data points are satisfied exactly, it will not generalize well].
So, it is important to build a balanced model that satisfies most of the training data points and also gives good results on the test data [low bias and low variance].
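To make this concrete, here is a minimal sketch (not taken from the linked notes) that fits polynomials of increasing degree to noisy data and compares training error with test error, assuming scikit-learn is available. A low-degree model shows high bias (high error on both sets), while a high-degree model shows high variance (low training error but worse test error).

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic data: a sine curve with noise
rng = np.random.RandomState(0)
X = rng.uniform(0, 1, size=(100, 1))
y = np.sin(2 * np.pi * X).ravel() + rng.normal(scale=0.2, size=100)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Degree 1 underfits (high bias), degree 15 tends to overfit (high variance)
for degree in (1, 4, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    train_err = mean_squared_error(y_train, model.predict(X_train))
    test_err = mean_squared_error(y_test, model.predict(X_test))
    print(f"degree={degree:2d}  train MSE={train_err:.3f}  test MSE={test_err:.3f}")
```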
For a detailed explanation, download the following notes on the "Bias & Variance" tradeoff:
https://github.com/ruthvikraja/Bias-Variance.git