Overview of long short-term memory neural networks

Kamilya Smagulova, Alex James Pappachen

Research output: Chapter in Book/Report/Conference proceeding › Chapter

1 Citation (Scopus)

Abstract

Long Short-Term Memory (LSTM) was designed to avoid the vanishing and exploding gradient problems in recurrent neural networks. Over the last twenty years, various modifications of the original LSTM cell have been proposed. This chapter gives an overview of basic LSTM cell structures and demonstrates forward and backward propagation within the most widely used configuration, called the traditional LSTM cell. In addition, LSTM neural network configurations are described.
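For orientation only, the sketch below shows one forward step of the widely used ("traditional") LSTM cell referred to in the abstract. It is a minimal NumPy illustration of the standard gated update; the parameter names (W, U, b) and gate symbols (f, i, g, o) are chosen for this sketch and are not necessarily the chapter's own notation.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x_t, h_prev, c_prev, W, U, b):
        """One forward step of a traditional LSTM cell.
        W, U, b hold input, recurrent, and bias parameters for the
        forget (f), input (i), candidate (g), and output (o) gates."""
        f = sigmoid(W['f'] @ x_t + U['f'] @ h_prev + b['f'])   # forget gate
        i = sigmoid(W['i'] @ x_t + U['i'] @ h_prev + b['i'])   # input gate
        g = np.tanh(W['g'] @ x_t + U['g'] @ h_prev + b['g'])   # candidate cell state
        o = sigmoid(W['o'] @ x_t + U['o'] @ h_prev + b['o'])   # output gate
        c_t = f * c_prev + i * g                                # new cell state
        h_t = o * np.tanh(c_t)                                  # new hidden state
        return h_t, c_t

    # Example: one step with random parameters (input size 3, hidden size 2)
    rng = np.random.default_rng(0)
    W = {k: rng.standard_normal((2, 3)) for k in 'figo'}
    U = {k: rng.standard_normal((2, 2)) for k in 'figo'}
    b = {k: np.zeros(2) for k in 'figo'}
    h, c = lstm_step(rng.standard_normal(3), np.zeros(2), np.zeros(2), W, U, b)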

Original language: English
Title of host publication: Modeling and Optimization in Science and Technologies
Publisher: Springer Verlag
Pages: 139-153
Number of pages: 15
DOI: 10.1007/978-3-030-14524-8_11
Publication status: Published - Jan 1 2020

Publication series

Name: Modeling and Optimization in Science and Technologies
Volume: 14
ISSN (Print): 2196-7326
ISSN (Electronic): 2196-7334

ASJC Scopus subject areas

  • Modelling and Simulation
  • Medical Assisting and Transcription
  • Applied Mathematics

Cite this

Smagulova, K., & James Pappachen, A. (2020). Overview of long short-term memory neural networks. In Modeling and Optimization in Science and Technologies (pp. 139-153). (Modeling and Optimization in Science and Technologies; Vol. 14). Springer Verlag. https://doi.org/10.1007/978-3-030-14524-8_11