Learning in Memristive Neural Network Architectures Using Analog Backpropagation Circuits

Olga Krestinskaya, Khaled Nabil Salama, Alex Pappachen James

Research output: Contribution to journal › Article › peer-review

20 Citations (Scopus)

Abstract

The on-chip implementation of learning algorithms would speed up the training of neural networks in crossbar arrays. The circuit-level design and implementation of a backpropagation algorithm using gradient descent for neural network architectures is an open problem. In this paper, we propose analog backpropagation learning circuits for various memristive learning architectures, such as deep neural networks, binary neural networks, multiple neural networks, hierarchical temporal memory, and long short-term memory. The circuit design and verification are done using TSMC 180-nm CMOS process models and TiO₂-based memristor models. The application-level validation of the system is performed on the XOR problem and the MNIST character and Yale face image databases.
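As a software reference for the learning rule the paper maps onto analog circuitry, the sketch below trains a small multilayer perceptron on the XOR problem with backpropagation and gradient descent. This is purely illustrative: the network size (2-4-1), sigmoid activations, learning rate, and random initialization are assumptions, not the paper's circuit parameters; in the memristive realization the weight matrices would correspond to crossbar conductances updated in the analog domain.

```python
# Minimal backpropagation / gradient-descent sketch on the XOR problem.
# Illustrative only: layer sizes, learning rate, and epoch count are
# assumptions, not taken from the paper's analog circuit design.
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 2-4-1 network; in a crossbar these weights map to memristor conductances
W1 = rng.normal(scale=1.0, size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=1.0, size=(4, 1))
b2 = np.zeros((1, 1))

lr = 0.5
for _ in range(20000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: mean-squared-error gradients through the sigmoids
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient-descent weight updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(pred.ravel())
```

The forward/backward structure above is the standard software formulation; the paper's contribution is realizing the multiplication, activation, and update steps with analog CMOS/memristor circuits rather than digital arithmetic.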

Original language: English
Journal: IEEE Transactions on Circuits and Systems I: Regular Papers
DOIs
Publication status: Accepted/In press - Sep 19 2018

Keywords

  • Analog circuits
  • backpropagation
  • binary neural network
  • crossbar
  • deep neural network
  • hierarchical temporal memory
  • learning
  • long short-term memory
  • memristor
  • multiple neural network

ASJC Scopus subject areas

  • Electrical and Electronic Engineering

