Schedule
Wednesday, October 20, 2021, 1:00 AM (America/New_York)
Recording provided by the organiser.
Domain: neuro
Host: van Vreeswijk TNS
Duration: 70 minutes
Tali's work emphasized the tradeoff between compression and information preservation. In this talk I will explore this theme in the context of deep learning. Artificial neural networks have recently revolutionized the field of machine learning. However, we still do not have a sufficient theoretical understanding of how such models can be successfully learned. Two specific questions in this context are how neural nets can be learned despite the non-convexity of the learning problem, and how they can generalize well despite often having more parameters than training data. I will describe our recent work showing that gradient-descent optimization indeed leads to 'simpler' models, where simplicity is captured by lower weight norm and, in some cases, clustering of weight vectors. We demonstrate this for several teacher and student architectures, including learning linear teachers with ReLU networks, learning Boolean functions, and learning convolutional pattern-detection architectures.
Amir Globerson
Tel Aviv University
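
For readers unfamiliar with the teacher-student setup mentioned in the abstract, below is a minimal illustrative sketch (not the speaker's code): a one-hidden-layer ReLU student is trained by plain gradient descent on labels produced by a random linear teacher, and the student's total weight norm and the alignment of its hidden-unit weight vectors with the teacher direction are reported as rough "simplicity" diagnostics. All variable names, dimensions, and hyperparameters are illustrative assumptions.

# Illustrative sketch only: ReLU student learning a linear teacher by gradient descent.
import numpy as np

rng = np.random.default_rng(0)
d, k, n = 10, 50, 200                    # input dim, student width, training-set size

w_teacher = rng.normal(size=d)           # random linear teacher
X = rng.normal(size=(n, d))
y = X @ w_teacher                        # noiseless teacher labels

# Student: f(x) = sum_j a_j * relu(W_j . x), with small random initialization
W = 0.1 * rng.normal(size=(k, d))
a = 0.1 * rng.normal(size=k)

lr, steps = 1e-2, 20000
for _ in range(steps):
    pre = X @ W.T                        # (n, k) pre-activations
    h = np.maximum(pre, 0.0)             # ReLU
    err = h @ a - y                      # residuals, shape (n,)
    # gradients of 0.5 * mean squared error
    grad_a = h.T @ err / n
    grad_W = ((err[:, None] * a) * (pre > 0)).T @ X / n
    a -= lr * grad_a
    W -= lr * grad_W

# "Simplicity" diagnostics: total weight norm and clustering of the hidden-unit
# directions, measured by cosine similarity with the teacher direction.
norms = np.linalg.norm(W, axis=1)
cos = (W @ w_teacher) / (norms * np.linalg.norm(w_teacher) + 1e-12)
print("train MSE:", float(np.mean((np.maximum(X @ W.T, 0) @ a - y) ** 2)))
print("total weight norm:", float(np.sqrt((W ** 2).sum() + (a ** 2).sum())))
print("mean |cos(W_j, w_teacher)|:", float(np.abs(cos).mean()))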