
Deep Kernel Methods

Seminar · Recording Available · Neuroscience

Dr Laurence Aitchison

University of Bristol

Schedule

Thursday, November 25, 2021, 1:00 PM (Europe/London)


Seminar location

Not provided

Watch the seminar

Recording provided by the organiser.

Event Information

Format: Recorded Seminar
Recording: Available
Host: Sheffield ML
Duration: 70 minutes


Abstract

Deep neural networks (DNNs) with the flexibility to learn good top-layer representations have eclipsed shallow kernel methods without that flexibility. Here, we take inspiration from deep neural networks to develop a new family of deep kernel methods. In a deep kernel method, there is a kernel at every layer, and the kernels are jointly optimized to improve performance (with strong regularisation). We establish the representational power of deep kernel methods by showing that they perform exact inference in an infinitely wide Bayesian neural network or deep Gaussian process. Next, we conjecture that the deep kernel machine objective is unimodal, and give a proof of unimodality for linear kernels. Finally, we exploit the simplicity of the deep kernel machine loss to develop a new family of optimizers, based on a matrix equation from control theory, that converges in around 10 steps.
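As a rough illustration (not the speaker's implementation), the sketch below shows the "kernel at every layer" idea in NumPy: a Gram matrix is pushed through an arc-cosine (ReLU) kernel nonlinearity layer by layer, the map that arises in infinitely wide ReLU networks. The function names and the choice of arc-cosine kernel are assumptions for illustration; in the deep kernel machine described in the abstract, the per-layer Gram matrices are additionally treated as free parameters and jointly optimized under strong regularisation, rather than propagated deterministically as here.

    import numpy as np

    def relu_kernel(G):
        # Arc-cosine kernel of order 1 (Cho & Saul, 2009): maps the Gram
        # matrix of one layer to the Gram matrix of an infinitely wide
        # ReLU layer applied on top of it.
        d = np.sqrt(np.diag(G))
        cos = np.clip(G / np.outer(d, d), -1.0, 1.0)
        theta = np.arccos(cos)
        return np.outer(d, d) * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / np.pi

    def deep_kernel(X, n_layers=3):
        # "A kernel at every layer": compose the kernel map layer by layer,
        # starting from a linear kernel on the inputs.
        G = X @ X.T / X.shape[1]
        for _ in range(n_layers):
            G = relu_kernel(G)
        return G

    # Example: Gram matrix of a 3-layer deep kernel for 5 inputs in 10 dimensions.
    X = np.random.randn(5, 10)
    print(deep_kernel(X))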

Topics

Gaussian processes, Bayesian neural network, control, control theory, deep kernel methods, deep learning, deep neural networks, Gaussian process, kernel methods, kernel optimization, linear kernels, machine learning, neural networks, representational power, unimodal objective

About the Speaker

Dr Laurence Aitchison

University of Bristol

Contact & Resources

Personal Website: www.gatsby.ucl.ac.uk/~laurence/

Related Seminars

Seminar · 64% match · Relevant
Rethinking Attention: Dynamic Prioritization
neuro
Decades of research on understanding the mechanisms of attentional selection have focused on identifying the units (representations) on which attention operates in order to guide prioritized sensory p…
Jan 6, 2025 · George Washington University

Seminar · 64% match · Relevant
The Cognitive Roots of the Problem of Free Will
neuro
Jan 7, 2025 · Bielefeld & Amsterdam

Seminar · 64% match · Relevant
Memory Colloquium Lecture
neuro
Jan 8, 2025 · Keio University, Tokyo