In this episode, we discuss the bane of many machine learning algorithms: overfitting. We also explain why it is an undesirable way to learn and how to combat it via L1 and L2 regularization.
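As a minimal illustrative sketch (not from the video), the snippet below contrasts an unregularized linear model with L1 (Lasso) and L2 (Ridge) regularization, which penalize the sum of absolute weights and the sum of squared weights, respectively. The synthetic data and the alpha values are assumptions chosen for illustration.

```python
# Sketch: L1 vs. L2 regularization on noisy data where only a few features matter.
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))           # 100 samples, 20 features
true_w = np.zeros(20)
true_w[:3] = [2.0, -1.5, 0.5]            # only 3 features actually matter
y = X @ true_w + rng.normal(scale=0.1, size=100)

plain = LinearRegression().fit(X, y)     # no penalty: free to fit the noise
lasso = Lasso(alpha=0.1).fit(X, y)       # L1 penalty: drives irrelevant weights to exactly zero
ridge = Ridge(alpha=1.0).fit(X, y)       # L2 penalty: shrinks all weights toward zero

print("non-zero weights, no regularization:", np.sum(np.abs(plain.coef_) > 1e-6))
print("non-zero weights, L1 (Lasso):       ", np.sum(np.abs(lasso.coef_) > 1e-6))
print("non-zero weights, L2 (Ridge):       ", np.sum(np.abs(ridge.coef_) > 1e-6))
```

The L1 penalty tends to produce sparse solutions (many weights exactly zero), which is the selection behavior described in the Lasso paper linked below, while the L2 penalty keeps all weights but makes them smaller.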
_____________________________
The paper “Regression Shrinkage and Selection via the Lasso” is available here:
Andrej Karpathy’s excellent lecture notes on neural networks and regularization:
The neural network demo is available here:
A playlist with our neural network and deep learning-related videos:
WE WOULD LIKE TO THANK OUR GENEROUS SUPPORTERS WHO MAKE TWO MINUTE PAPERS POSSIBLE:
Sunil Kim, Vinay S.
Subscribe if you would like to see more of these! –
The thumbnail image background was created by Tony Hisgett (CC BY 2.0). It has undergone recoloring. –
Splash screen/thumbnail design: Felícia Fehér –
Károly Zsolnai-Fehér’s links:
Patreon →
Facebook →
Twitter →
Web →