
Neural Networks and Deep Kernel Shaping

Series
Department of Statistics
Rapid training of deep neural networks without skip connections or normalization layers using Deep Kernel Shaping.
Using an extended and formalized version of the Q/C map analysis of Poole et al. (2016), along with Neural Tangent Kernel theory, we identify the main pathologies present in deep networks that prevent them from training fast and generalizing to unseen data, and show how these can be avoided by carefully controlling the "shape" of the network's initialization-time kernel function. We then develop a method called Deep Kernel Shaping (DKS), which accomplishes this using a combination of precise parameter initialization, activation function transformations, and small architectural tweaks, all of which preserve the model class. In our experiments we show that DKS enables SGD training of residual networks without normalization layers on ImageNet and CIFAR-10 classification tasks at speeds comparable to standard ResNetV2 and Wide-ResNet models, with only a small decrease in generalization performance. When using K-FAC as the optimizer, we achieve similar results for networks without skip connections. Our results apply to a large variety of activation functions, including those which traditionally perform very badly, such as the logistic sigmoid. In addition to DKS, we contribute a detailed analysis of skip connections, normalization layers, special activation functions like ReLU and SELU, and various initialization schemes, explaining their effectiveness as alternative (and ultimately incomplete) ways of "shaping" the network's initialization-time kernel.
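To make the Q/C map idea concrete, below is a minimal NumPy sketch (an illustration for this page, not code from the talk or the DKS paper). It estimates the single-layer Q map (variance propagation) and C map (correlation propagation) of an activation function by Gauss-Hermite quadrature, and builds a DKS-style transformed activation of the form gamma * (phi(alpha * x + beta) + delta). The function names and the default transformation constants are placeholders; DKS itself solves for these constants so that the network's Q/C maps meet chosen targets, which this sketch does not attempt.

import numpy as np

# Gauss-Hermite rule: integral of exp(-x^2) f(x) dx ~= sum_i w_i f(x_i),
# so E_{z ~ N(0,1)}[f(z)] ~= (1/sqrt(pi)) * sum_i w_i f(sqrt(2) * x_i).
_NODES, _WEIGHTS = np.polynomial.hermite.hermgauss(60)

def gauss_expect(f):
    # Approximate E_{z ~ N(0,1)}[f(z)]; f must accept NumPy arrays.
    return np.sum(_WEIGHTS * f(np.sqrt(2.0) * _NODES)) / np.sqrt(np.pi)

def q_map(phi, q_in):
    # Single-layer Q map: output variance for an input of variance q_in.
    return gauss_expect(lambda z: phi(np.sqrt(q_in) * z) ** 2)

def c_map(phi, c_in, q_in=1.0):
    # Single-layer C map: output correlation for two inputs with common
    # variance q_in and correlation c_in, normalized by the output variance.
    s = np.sqrt(q_in)
    def inner(z1):
        # (u, v) = (z1, c*z1 + sqrt(1-c^2)*z2) has correlation c_in for iid N(0,1) z1, z2.
        return gauss_expect(
            lambda z2: phi(s * z1) * phi(s * (c_in * z1 + np.sqrt(1.0 - c_in ** 2) * z2)))
    return gauss_expect(np.vectorize(inner)) / q_map(phi, q_in)

def transformed(phi, alpha=1.0, beta=0.0, gamma=1.0, delta=0.0):
    # DKS-style activation transformation: phi_hat(x) = gamma * (phi(alpha * x + beta) + delta).
    # The defaults here are placeholders, not the constants DKS would solve for.
    return lambda x: gamma * (phi(alpha * x + beta) + delta)

if __name__ == "__main__":
    # Iterating the C map shows how correlations between distinct inputs drift
    # toward a fixed point with depth; this degeneracy of the initialization-time
    # kernel is one of the pathologies DKS is designed to control.
    phi = transformed(np.tanh)
    c = 0.5
    for layer in range(1, 6):
        c = c_map(phi, c)
        print(f"correlation after layer {layer}: {c:.4f}")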

More in this series

Department of Statistics
Introduction to Advanced Research Computing at Oxford

Andy Gittings and Dai Jenkins deliver a graduate lecture on Advanced Research Computing (ARC).

Department of Statistics
Joining Bayesian submodels with Markov melding

This seminar explains and illustrates the approach of Markov melding for joint analysis.
Licence
Creative Commons Attribution-NonCommercial-ShareAlike 2.0 UK: England & Wales (http://creativecommons.org/licenses/by-nc-sa/2.0/uk/)

Episode Information

Series
Department of Statistics
People
James Martens
Keywords
deep kernel shaping
neural networks
dks
computing
Department: Department of Statistics
Date Added: 05/04/2022
Duration: 00:55:17

