Gradient descent fails to learn high-frequency functions and modular arithmetic

Rustem Takhanov, Maxat Tezekbayev, Artur Pak, Arman Bolatov, Zhenisbek Assylbekov

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Classes of target functions containing a large number of approximately orthogonal elements are known to be hard to learn by Statistical Query algorithms. Recently, this classical fact re-emerged in the theory of gradient-based optimization of neural networks. In this framework, the hardness of a class is usually quantified by the variance of the gradient with respect to a random choice of target function. The set of functions of the form x ↦ ax mod p, where a is taken from Z_p, has recently attracted attention from deep learning theorists and cryptographers. This class can be viewed as a subset of p-periodic functions on Z and is tightly connected with a class of high-frequency periodic functions on the real line. We present a mathematical analysis of the limitations and challenges of using gradient-based techniques to learn a high-frequency periodic function or modular multiplication from examples. We show that the variance of the gradient is negligibly small in both cases when either the frequency or the prime base p is large, which in turn prevents such a learning algorithm from succeeding.
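The vanishing-variance phenomenon the abstract describes can be illustrated numerically. Below is a minimal sketch, not the paper's actual construction: it fixes a one-parameter model cos(2πθx), draws random target frequencies a, and estimates the variance over a of the gradient of the population squared loss. The cosine model, the value θ = 0.7, and the frequency ranges are all illustrative assumptions; the printed variance shrinks as the frequency scale grows, mirroring the effect established in the paper.

```python
import numpy as np

# Midpoint-rule grid on [0, 1]; dense enough to resolve frequencies up to ~1e3.
N = 200_000
x = (np.arange(N) + 0.5) / N

theta = 0.7  # arbitrary fixed model parameter (an assumption for this sketch)

def loss_grad(a):
    """d/dtheta of the population squared loss
    E_x[(cos(2*pi*theta*x) - cos(2*pi*a*x))^2], approximated on the grid."""
    model = np.cos(2 * np.pi * theta * x)
    target = np.cos(2 * np.pi * a * x)
    dmodel = -2 * np.pi * x * np.sin(2 * np.pi * theta * x)
    return np.mean(2.0 * (model - target) * dmodel)

rng = np.random.default_rng(0)
for fmax in (10, 100, 1000):
    # Random choice of target function = random frequency in (1, fmax).
    freqs = rng.uniform(1.0, fmax, size=300)
    grads = np.array([loss_grad(a) for a in freqs])
    print(f"frequencies up to {fmax:4d}: Var_a[grad] ~ {grads.var():.3e}")
```

Because high-frequency targets are nearly orthogonal to the fixed, smooth function dmodel, the a-dependent part of each gradient is a rapidly decaying Fourier-type coefficient, so the gradients concentrate around a constant and their variance over a collapses.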

Original language: English
Article number: 117
Journal: Machine Learning
Volume: 114
Issue number: 4
DOIs
Publication status: Published - Apr 2025

Keywords

  • Barren plateau
  • Gradient-based optimization
  • Hardness of learning
  • High-frequency periodic functions
  • Modular multiplication
  • Statistical query

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
