Dr Laurence Aitchison
University of Bristol
Schedule
Wednesday, November 24, 2021
2:00 PM Europe/London
Recording provided by the organiser.
Format
Recorded Seminar
Recording
Available
Host
Sheffield ML
Deep neural networks (DNNs) with the flexibility to learn good top-layer representations have eclipsed shallow kernel methods without that flexibility. Here, we take inspiration from deep neural networks to develop a new family of deep kernel methods. In a deep kernel method, there is a kernel at every layer, and the kernels are jointly optimised to improve performance (with strong regularisation). We establish the representational power of deep kernel methods by showing that they perform exact inference in an infinitely wide Bayesian neural network or deep Gaussian process. Next, we conjecture that the deep kernel machine objective is unimodal, and give a proof of unimodality for linear kernels. Finally, we exploit the simplicity of the deep kernel machine loss to develop a new family of optimisers, based on a matrix equation from control theory, that converges in around 10 steps.
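As a rough illustration of the setup the abstract describes, here is a minimal numpy sketch of a deep-kernel-machine-style objective: one Gram matrix per layer, each regularised towards the kernel applied to the layer below, with a Gaussian-process marginal likelihood at the top. The specific kernel (an RBF built from Gram-matrix distances), the KL form of the regulariser, and the weight nu are illustrative assumptions, not the speaker's implementation, and the control-theory-based optimiser from the talk is not shown.

```python
import numpy as np

def rbf_from_gram(G, lengthscale=1.0):
    # Kernel matrix from a Gram matrix of inner products, via squared
    # distances d_ij = G_ii + G_jj - 2 G_ij (illustrative kernel choice).
    d = np.diag(G)[:, None] + np.diag(G)[None, :] - 2.0 * G
    return np.exp(-0.5 * d / lengthscale**2)

def gauss_kl(G, K, jitter=1e-6):
    # KL( N(0, G) || N(0, K) ): pulls a layer's Gram matrix G towards
    # the prior kernel K computed from the layer below.
    n = G.shape[0]
    K = K + jitter * np.eye(n)
    G = G + jitter * np.eye(n)
    Kinv = np.linalg.inv(K)
    _, logdet_K = np.linalg.slogdet(K)
    _, logdet_G = np.linalg.slogdet(G)
    return 0.5 * (np.trace(Kinv @ G) - n + logdet_K - logdet_G)

def dkm_objective(Gs, X, y, nu=1.0, noise=0.1):
    # Deep-kernel-machine-style objective: GP log marginal likelihood of y
    # under the top-layer kernel, minus KL terms that strongly regularise
    # each layer's Gram matrix (hypothetical form of the regulariser).
    n = X.shape[0]
    G_prev = X @ X.T
    reg = 0.0
    for G in Gs:
        reg += gauss_kl(G, rbf_from_gram(G_prev))
        G_prev = G
    K_top = rbf_from_gram(G_prev) + noise * np.eye(n)
    Kinv = np.linalg.inv(K_top)
    _, logdet = np.linalg.slogdet(K_top)
    loglik = -0.5 * (y @ Kinv @ y + logdet + n * np.log(2.0 * np.pi))
    return loglik - nu * reg

# Toy usage: two kernel layers, with Gram matrices initialised at the
# prior (so the KL terms start at zero) and treated as free parameters.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = np.sin(X[:, 0])
G1 = rbf_from_gram(X @ X.T)
G2 = rbf_from_gram(G1)
print(dkm_objective([G1, G2], X, y))
```

In the talk's framing, these Gram matrices would be optimised jointly over all layers, using the fixed-point updates derived from control theory rather than plain gradient descent on this sketch.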
Contact & Resources