Multi-layer random features and the approximation power of neural networks

Research output: Contribution to journal › Conference article › peer-review

Abstract

A neural architecture with randomly initialized weights is, in the infinite-width limit, equivalent to a Gaussian random field whose covariance function is the so-called Neural Network Gaussian Process (NNGP) kernel. We prove that the reproducing kernel Hilbert space (RKHS) defined by the NNGP contains only functions that can be approximated by the architecture. The number of neurons required in each layer to achieve a given approximation error is determined by the RKHS norm of the target function. Moreover, the approximation can be constructed from a supervised dataset by a random multi-layer representation of an input vector, together with training of the last layer's weights. For a 2-layer NN and a domain equal to the (n − 1)-dimensional sphere in R^n, we compare the number of neurons required by Barron's theorem and by the multi-layer features construction. We show that if the eigenvalues of the integral operator of the NNGP decay slower than k^(−n−2/3), where k is the order of an eigenvalue, then our theorem guarantees a more succinct neural network approximation than Barron's theorem. We also report computational experiments verifying our theoretical findings. These experiments show that realistic neural networks easily learn target functions even when neither theorem gives any guarantees.
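The construction described above (a frozen, randomly initialized multi-layer representation followed by training only the last layer's weights) can be sketched as follows. This is an illustrative outline under assumed choices, not the paper's exact procedure: the widths, the He-style weight scaling, the toy target on the sphere, and the ridge-regularized least-squares fit are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_features(X, widths, rng):
    """Propagate inputs through frozen, randomly initialized ReLU layers.

    X: (num_samples, n) inputs; widths: hidden-layer sizes.
    Returns the last hidden layer's activations, i.e. the multi-layer
    random features (weights are never trained).
    """
    H = X
    for m in widths:
        # He-style scaling (an assumed choice) keeps activations at a stable scale.
        W = rng.normal(scale=np.sqrt(2.0 / H.shape[1]), size=(H.shape[1], m))
        H = np.maximum(H @ W, 0.0)  # ReLU
    return H

# Hypothetical supervised dataset on the unit sphere S^{n-1} in R^n.
n, N = 8, 512
X = rng.normal(size=(N, n))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # project samples onto the sphere
y = np.sin(3.0 * X[:, 0])                      # toy target function (assumption)

Phi = random_features(X, widths=[256, 256], rng=rng)

# Train only the last layer's weights: ridge-regularized least squares.
lam = 1e-3
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
train_err = np.mean((Phi @ w - y) ** 2)
```

Only `w` is fit to the data; the hidden-layer weights stay at their random initialization, which is what makes the representation a "random features" map.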

Original language: English
Pages (from-to): 3299-3322
Number of pages: 24
Journal: Proceedings of Machine Learning Research
Volume: 244
Publication status: Published - 2024
Event: 40th Conference on Uncertainty in Artificial Intelligence, UAI 2024 - Barcelona, Spain
Duration: Jul 15, 2024 - Jul 19, 2024

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
