Overview of long short-term memory neural networks

Kamilya Smagulova, Alex James Pappachen

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

Long Short-Term Memory (LSTM) was designed to avoid the vanishing and exploding gradient problems of recurrent neural networks. Over the last twenty years, various modifications of the original LSTM cell have been proposed. This chapter gives an overview of basic LSTM cell structures and demonstrates forward and backward propagation within the most widely used configuration, the traditional LSTM cell. In addition, LSTM neural network configurations are described.
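
The abstract refers to forward propagation through the traditional LSTM cell. As a point of reference, the sketch below implements one forward step of the standard gated formulation (forget, input and output gates plus a tanh candidate update) in NumPy. It is a minimal illustration of that standard formulation, not code from the chapter; the function name lstm_cell_forward, the parameter layout and the stacked-input convention are assumptions made here for compactness.

import numpy as np

def sigmoid(z):
    # Logistic activation used by the three gates.
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell_forward(x_t, h_prev, c_prev, params):
    # One time step of a traditional LSTM cell (standard formulation).
    # x_t: input at time t, shape (n_x,); h_prev, c_prev: previous hidden
    # and cell states, shape (n_h,); params: weights W_* of shape
    # (n_h, n_x + n_h) and biases b_* of shape (n_h,) for the gates
    # f, i, o and the candidate update g.
    z = np.concatenate([x_t, h_prev])                 # stacked [x_t; h_{t-1}]
    f = sigmoid(params["W_f"] @ z + params["b_f"])    # forget gate
    i = sigmoid(params["W_i"] @ z + params["b_i"])    # input gate
    o = sigmoid(params["W_o"] @ z + params["b_o"])    # output gate
    g = np.tanh(params["W_g"] @ z + params["b_g"])    # candidate cell update
    c_t = f * c_prev + i * g                          # new cell state
    h_t = o * np.tanh(c_t)                            # new hidden state
    return h_t, c_t

if __name__ == "__main__":
    # Tiny usage example with random weights, purely for illustration.
    rng = np.random.default_rng(0)
    n_x, n_h = 4, 3
    params = {k: 0.1 * rng.standard_normal((n_h, n_x + n_h))
              for k in ("W_f", "W_i", "W_o", "W_g")}
    params.update({b: np.zeros(n_h) for b in ("b_f", "b_i", "b_o", "b_g")})
    h_t, c_t = lstm_cell_forward(rng.standard_normal(n_x),
                                 np.zeros(n_h), np.zeros(n_h), params)
    print("h_t:", h_t, "c_t:", c_t)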

Original language: English
Title of host publication: Modeling and Optimization in Science and Technologies
Publisher: Springer Verlag
Pages: 139-153
Number of pages: 15
DOIs: 10.1007/978-3-030-14524-8_11
Publication status: Published - Jan 1 2020

Publication series

Name: Modeling and Optimization in Science and Technologies
Volume: 14
ISSN (Print): 2196-7326
ISSN (Electronic): 2196-7334

Fingerprint

Long Short-Term Memory
Recurrent Neural Networks
Neural Networks
Cell
Configuration
Propagation
Gradient

ASJC Scopus subject areas

  • Modelling and Simulation
  • Medical Assisting and Transcription
  • Applied Mathematics

Cite this

Smagulova, K., & James Pappachen, A. (2020). Overview of long short-term memory neural networks. In Modeling and Optimization in Science and Technologies (pp. 139-153). (Modeling and Optimization in Science and Technologies; Vol. 14). Springer Verlag. https://doi.org/10.1007/978-3-030-14524-8_11

Overview of long short-term memory neural networks. / Smagulova, Kamilya; James Pappachen, Alex.

Modeling and Optimization in Science and Technologies. Springer Verlag, 2020. p. 139-153 (Modeling and Optimization in Science and Technologies; Vol. 14).

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Smagulova, K & James Pappachen, A 2020, Overview of long short-term memory neural networks. in Modeling and Optimization in Science and Technologies. Modeling and Optimization in Science and Technologies, vol. 14, Springer Verlag, pp. 139-153. https://doi.org/10.1007/978-3-030-14524-8_11
Smagulova K, James Pappachen A. Overview of long short-term memory neural networks. In Modeling and Optimization in Science and Technologies. Springer Verlag. 2020. p. 139-153. (Modeling and Optimization in Science and Technologies). https://doi.org/10.1007/978-3-030-14524-8_11
Smagulova, Kamilya ; James Pappachen, Alex. / Overview of long short-term memory neural networks. Modeling and Optimization in Science and Technologies. Springer Verlag, 2020. pp. 139-153 (Modeling and Optimization in Science and Technologies).
@inbook{e682bed11c2345999e42b94de51afd91,
title = "Overview of long short-term memory neural networks",
abstract = "Long Short-Term Memory (LSTM) was designed to avoid the vanishing and exploding gradient problems of recurrent neural networks. Over the last twenty years, various modifications of the original LSTM cell have been proposed. This chapter gives an overview of basic LSTM cell structures and demonstrates forward and backward propagation within the most widely used configuration, the traditional LSTM cell. In addition, LSTM neural network configurations are described.",
author = "Kamilya Smagulova and {James Pappachen}, Alex",
year = "2020",
month = "1",
day = "1",
doi = "10.1007/978-3-030-14524-8_11",
language = "English",
series = "Modeling and Optimization in Science and Technologies",
publisher = "Springer Verlag",
pages = "139--153",
booktitle = "Modeling and Optimization in Science and Technologies",
address = "Germany",

}

TY - CHAP

T1 - Overview of long short-term memory neural networks

AU - Smagulova, Kamilya

AU - James Pappachen, Alex

PY - 2020/1/1

Y1 - 2020/1/1

N2 - Long Short-Term Memory (LSTM) was designed to avoid the vanishing and exploding gradient problems of recurrent neural networks. Over the last twenty years, various modifications of the original LSTM cell have been proposed. This chapter gives an overview of basic LSTM cell structures and demonstrates forward and backward propagation within the most widely used configuration, the traditional LSTM cell. In addition, LSTM neural network configurations are described.

AB - Long Short-Term Memory (LSTM) was designed to avoid the vanishing and exploding gradient problems of recurrent neural networks. Over the last twenty years, various modifications of the original LSTM cell have been proposed. This chapter gives an overview of basic LSTM cell structures and demonstrates forward and backward propagation within the most widely used configuration, the traditional LSTM cell. In addition, LSTM neural network configurations are described.

UR - http://www.scopus.com/inward/record.url?scp=85064768191&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85064768191&partnerID=8YFLogxK

U2 - 10.1007/978-3-030-14524-8_11

DO - 10.1007/978-3-030-14524-8_11

M3 - Chapter

T3 - Modeling and Optimization in Science and Technologies

SP - 139

EP - 153

BT - Modeling and Optimization in Science and Technologies

PB - Springer Verlag

ER -