
Complete Statistical Theory of Learning (Vladimir Vapnik) | MIT Deep Learning Series

Lecture by Vladimir Vapnik in January 2020, part of the MIT Deep Learning Lecture Series.
Slides: http://bit.ly/2ORVofC
Associated podcast conversation: https://www.youtube.com/watch?v=bQa7hpUpMzM
Series website: https://deeplearning.mit.edu
Playlist: http://bit.ly/deep-learning-playlist

OUTLINE:
0:00 - Introduction
0:46 - Overview: Complete Statistical Theory of Learning
3:47 - Part 1: VC Theory of Generalization
11:04 - Part 2: Target Functional for Minimization
27:13 - Part 3: Selection of Admissible Set of Functions
37:26 - Part 4: Complete Solution in Reproducing Kernel Hilbert Space (RKHS)
53:16 - Part 5: LUSI Approach in Neural Networks
59:28 - Part 6: Examples of Predicates
1:10:39 - Conclusion
1:16:10 - Q&A: Overfitting
1:17:18 - Q&A: Language

CONNECT:
- If you enjoyed this video, please subscribe to this channel.
- Twitter: https://twitter.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/lexfridman
- Instagram: https://www.instagram.com/lexfridman

