Overfitting
Overfitting is a well-known term in machine learning: it describes a model whose accuracy is good on the training data but poor on the testing data and on new data coming from users.
(Figure: a good model vs. an overfit model on the training data)
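To see the symptom concretely, here is a minimal sketch (assuming scikit-learn and a small synthetic dataset; the dataset and parameters are illustrative assumptions, not from the original post) in which an unconstrained decision tree scores almost perfectly on the training split but noticeably worse on the held-out split:

```python
# Minimal sketch of overfitting: a fully grown decision tree memorises
# noisy training data but generalises poorly to the test split.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic, noisy classification data (illustrative assumption)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           flip_y=0.2, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=42)

tree = DecisionTreeClassifier(random_state=42)  # no depth limit -> free to overfit
tree.fit(X_train, y_train)

print("Training accuracy:", tree.score(X_train, y_train))  # close to 1.0
print("Testing accuracy: ", tree.score(X_test, y_test))    # noticeably lower
```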
To understand this, we have to study the bias-variance tradeoff. Bias and variance are both components of a model's error: bias roughly corresponds to the error on the training data, while variance corresponds to how much the error grows on the testing data. They have an inverse relationship:

Bias ∝ 1 / Variance

In overfitting, bias is very low but variance is very high.
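As a rough illustration of this tradeoff (again a hedged sketch on synthetic data, assuming scikit-learn; the depths and data settings are illustrative), sweeping the depth of a decision tree shows the training error falling steadily while the test error eventually rises again, which is the overfitting region:

```python
# As model complexity (tree depth) grows, training error keeps shrinking
# while test error eventually climbs back up: low bias, high variance.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           flip_y=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

for depth in (1, 2, 4, 8, None):  # None lets the tree grow fully
    model = DecisionTreeClassifier(max_depth=depth, random_state=0)
    model.fit(X_train, y_train)
    train_err = 1 - model.score(X_train, y_train)
    test_err = 1 - model.score(X_test, y_test)
    print(f"max_depth={depth}: train error={train_err:.2f}, "
          f"test error={test_err:.2f}")
```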
Prevention
If this is happening with your model, a simple remedy is to use ensemble techniques, which combine many models so that their individual errors average out.
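As a hedged sketch of one such technique (bagging via a random forest in scikit-learn; the dataset and settings are illustrative assumptions), averaging many deep trees typically lowers the variance and narrows the train/test gap compared with a single tree:

```python
# Ensemble sketch: a random forest averages many overfit trees,
# which reduces variance and usually improves test accuracy.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           flip_y=0.2, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=1)

single_tree = DecisionTreeClassifier(random_state=1).fit(X_train, y_train)
forest = RandomForestClassifier(n_estimators=200,
                                random_state=1).fit(X_train, y_train)

print("Single tree   - test accuracy:", single_tree.score(X_test, y_test))
print("Random forest - test accuracy:", forest.score(X_test, y_test))
```

This connects back to the tradeoff above: each individual tree still has low bias, and averaging them brings the variance down.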