
Super Kai (Kazuya Ito)

Overfitting vs Underfitting


*Memos:

  • My post explains Vanishing Gradient Problem, Exploding Gradient Problem and Dying ReLU Problem.
  • My post explains layers in PyTorch.
  • My post explains activation functions in PyTorch.
  • My post explains loss functions in PyTorch.
  • My post explains optimizers in PyTorch.


*Both overfitting and underfitting can be detected with the Holdout Method or Cross-Validation (K-Fold Cross-Validation), as shown in the sketch below. *Cross-Validation gives a more reliable estimate.
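A minimal PyTorch sketch of both detection methods, assuming a hypothetical `TensorDataset` of random tensors standing in for real data (the sizes, batch size and K=5 are arbitrary example values):

```python
import numpy as np
import torch
from torch.utils.data import TensorDataset, DataLoader, Subset, random_split
from sklearn.model_selection import KFold

# Hypothetical dataset: 1000 samples, 20 features, binary labels.
dataset = TensorDataset(torch.randn(1000, 20), torch.randint(0, 2, (1000,)))

# Holdout Method: set part of the data aside and compare the two scores after training.
# - train accuracy high but validation accuracy much lower -> overfitting
# - both accuracies low                                     -> underfitting
train_set, val_set = random_split(dataset, [800, 200])
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
val_loader = DataLoader(val_set, batch_size=32)

# K-Fold Cross-Validation (K=5): every sample is used for validation exactly once,
# so averaging the K validation scores is more reliable than a single holdout split.
for train_idx, val_idx in KFold(n_splits=5, shuffle=True).split(np.arange(len(dataset))):
    train_loader = DataLoader(Subset(dataset, train_idx.tolist()), batch_size=32, shuffle=True)
    val_loader = DataLoader(Subset(dataset, val_idx.tolist()), batch_size=32)
    # train a fresh model on train_loader and evaluate it on val_loader here
```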

Overfitting:

  • is the problem in which a model makes accurate predictions for train data but not for new data (including test data), so the model fits train data much better than new data.
  • occurs because:
    • train data is too small (not enough), so the model can only learn a small number of patterns.
    • train data is imbalanced (biased), having a lot of specific (limited), similar or identical samples but not a lot of varied data, so the model can only learn a small number of patterns.
    • train data has a lot of noise (noisy data), so the model learns the patterns of the noise but not the patterns of normal data. *Noise (noisy data) means outliers, anomalies or sometimes duplicated data.
    • the training time is too long because the number of epochs is too large.
    • the model is too complex.
  • can be mitigated by:
    1. larger train data.
    2. having a lot of varied data.
    3. reducing noise.
    4. shuffling the dataset.
    5. stopping training early.
    6. Ensemble learning.
    7. Regularization (e.g. Dropout, L1 and L2 regularization) to reduce model complexity. *5. and 7. are sketched in the code after this list.
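A minimal PyTorch sketch of mitigations 5. (early stopping) and 7. (Dropout and L2 regularization via weight_decay), assuming hypothetical random tensors in place of a real train/validation split; the layer sizes, p=0.5, weight_decay=0.01 and patience=5 are arbitrary example values:

```python
import torch
import torch.nn as nn

# Hypothetical random tensors standing in for a real train/validation split.
X_train, y_train = torch.randn(800, 20), torch.randint(0, 2, (800,))
X_val, y_val = torch.randn(200, 20), torch.randint(0, 2, (200,))

# Mitigation 7.: Dropout and L2 regularization (weight_decay) reduce effective model complexity.
model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # randomly zeroes 50% of activations during training
    nn.Linear(64, 2),
)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001, weight_decay=0.01)  # L2 penalty

# Mitigation 5.: early stopping - stop when the validation loss stops improving.
best_val_loss, patience, bad_epochs = float("inf"), 5, 0
for epoch in range(100):
    model.train()                   # enables Dropout
    optimizer.zero_grad()
    loss = loss_fn(model(X_train), y_train)
    loss.backward()
    optimizer.step()

    model.eval()                    # disables Dropout
    with torch.no_grad():
        val_loss = loss_fn(model(X_val), y_val).item()
    if val_loss < best_val_loss:
        best_val_loss, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:  # no improvement for 5 epochs in a row
            break
```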

Underfitting:

  • is the problem in which a model cannot make accurate predictions for either train data or new data (including test data), so the model fits neither train data nor new data.
  • occurs because:
    • the model is too simple (not complex enough).
    • the training time is too short because the number of epochs is too small.
    • excessive regularization (Dropout, L1 and L2 regularization) is applied.
  • can be mitigated by:
    1. Increasing model complexity.
    2. Increasing the training time with a larger number of epochs.
    3. Decreasing regularization. *1. and 3. are sketched in the code after this list.
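A minimal PyTorch sketch of mitigations 1. and 3., assuming a hypothetical 20-feature input and 2 output classes; the layer sizes and Dropout rate are arbitrary example values:

```python
import torch.nn as nn

# A model that is likely to underfit: a single linear layer with excessive Dropout.
underfitting_model = nn.Sequential(
    nn.Dropout(p=0.8),  # excessive regularization (a cause of underfitting)
    nn.Linear(20, 2),
)

# Mitigations 1. and 3.: more layers and units, and less (or no) Dropout.
better_model = nn.Sequential(
    nn.Linear(20, 128),
    nn.ReLU(),
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 2),
)

# Mitigation 2.: also train for more epochs, e.g. range(100) instead of range(5).
```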

Overfitting and Underfitting are a trade-off:

Too much overfitting mitigation (5., 6. and 7.) leads to underfitting with high bias and low variance, while too much underfitting mitigation (1., 2. and 3.) leads to overfitting with low bias and high variance, so their mitigation should be balanced as shown below:

*Memos:

  • You can also say Bias and Variance are a trade-off because reducing bias increases variance while reducing variance increases bias, so they should be balanced. *Increasing model complexity reduces bias but increases variance, while reducing model complexity reduces variance but increases bias.
  • Low bias means high accuracy while high bias means low accuracy.
  • Low variance means high precision while high variance means low precision.
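As a rough rule of thumb (the thresholds below are arbitrary assumptions, not standard values), the train and validation scores from the Holdout Method or Cross-Validation indicate which side of the trade-off a model is on:

```python
def diagnose(train_acc: float, val_acc: float,
             gap_threshold: float = 0.10, low_threshold: float = 0.70) -> str:
    # Accuracies are assumed to be in [0, 1]; the thresholds are arbitrary examples.
    if train_acc < low_threshold and val_acc < low_threshold:
        return "high bias (underfitting): increase complexity, train longer, reduce regularization"
    if train_acc - val_acc > gap_threshold:
        return "high variance (overfitting): more/varied data, regularization, early stopping"
    return "balanced: bias and variance are in a reasonable trade-off"

print(diagnose(train_acc=0.99, val_acc=0.75))  # high variance (overfitting)
print(diagnose(train_acc=0.60, val_acc=0.58))  # high bias (underfitting)
```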


