Overview of long short-term memory neural networks

Kamilya Smagulova, Alex Pappachen James

Research output: Chapter in Book/Report/Conference proceeding › Chapter

3 Citations (Scopus)

Abstract

Long Short-Term Memory (LSTM) was designed to avoid the vanishing and exploding gradient problems of recurrent neural networks. Over the last twenty years, various modifications of the original LSTM cell have been proposed. This chapter gives an overview of basic LSTM cell structures and demonstrates forward and backward propagation within the most widely used configuration, the traditional LSTM cell. In addition, LSTM neural network configurations are described.
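The forward propagation of the traditional LSTM cell referenced in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the chapter's own code; the gate ordering (input, forget, candidate, output) and the stacked parameter layout `W`, `U`, `b` are assumptions chosen for compactness.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_forward(x_t, h_prev, c_prev, W, U, b):
    """One forward step of a traditional LSTM cell.

    W, U, b stack the parameters of the four gates
    (input i, forget f, candidate g, output o), each of hidden
    size H: W is (4H, D), U is (4H, H), b is (4H,).
    """
    H = h_prev.shape[0]
    z = W @ x_t + U @ h_prev + b      # pre-activations, shape (4H,)
    i = sigmoid(z[0:H])               # input gate
    f = sigmoid(z[H:2*H])             # forget gate
    g = np.tanh(z[2*H:3*H])           # candidate cell state
    o = sigmoid(z[3*H:4*H])           # output gate
    c_t = f * c_prev + i * g          # new cell state
    h_t = o * np.tanh(c_t)            # new hidden state
    return h_t, c_t
```

Because the cell state `c_t` is updated additively (gated copy of `c_prev` plus a gated candidate), gradients can flow across many time steps without being repeatedly squashed, which is the mechanism that mitigates vanishing gradients.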

Original language: English
Title of host publication: Modeling and Optimization in Science and Technologies
Publisher: Springer Verlag
Pages: 139-153
Number of pages: 15
DOIs
Publication status: Published - Jan 1 2020

Publication series

Name: Modeling and Optimization in Science and Technologies
Volume: 14
ISSN (Print): 2196-7326
ISSN (Electronic): 2196-7334

ASJC Scopus subject areas

  • Modelling and Simulation
  • Medical Assisting and Transcription
  • Applied Mathematics
