Lecture by Vladimir Vapnik in January 2020, part of the MIT Deep Learning Lecture Series.
OUTLINE:
0:00 – Introduction
0:46 – Overview: Complete Statistical Theory of Learning
3:47 – Part 1: VC Theory of Generalization
11:04 – Part 2: Target Functional for Minimization
27:13 – Part 3: Selection of Admissible Set of Functions
37:26 – Part 4: Complete Solution in Reproducing Kernel Hilbert Space (RKHS)
53:16 – Part 5: LUSI Approach in Neural Networks
59:28 – Part 6: Examples of Predicates
1:10:39 – Conclusion
1:16:10 – Q&A: Overfitting
1:17:18 – Q&A: Language