Reproducing and Regularizing the SCRN Model

Research output: Contribution to conference › Paper › peer-review


We reproduce the Structurally Constrained Recurrent Network (SCRN) model and then regularize it using existing widespread techniques such as naïve dropout, variational dropout, and weight tying. We show that, when regularized and optimized appropriately, the SCRN model can achieve performance comparable with the ubiquitous LSTM model in language modeling on English data, while outperforming it on non-English data.
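For reference, the SCRN recurrence being reproduced can be sketched as follows. This is a minimal illustration based on the original SCRN formulation (a slowly changing context state combined with a standard sigmoid hidden recurrence), not the exact implementation or hyperparameters used in this paper; all dimensions and parameter names below are illustrative.

```python
import numpy as np

def scrn_step(x, s_prev, h_prev, params):
    """One step of a Structurally Constrained Recurrent Network (SCRN) cell.

    Sketch of the SCRN recurrence: the context state s is an exponential
    moving average of projected inputs (it changes slowly, controlled by
    alpha), while the hidden state h is a standard sigmoid recurrence that
    also reads the context. Parameter names here are illustrative.
    """
    A, B, P, R, alpha = (params[k] for k in ("A", "B", "P", "R", "alpha"))
    # Slowly changing context layer: EMA of the projected input
    s = (1.0 - alpha) * (B @ x) + alpha * s_prev
    # Fast hidden layer conditioned on input, context, and previous hidden state
    h = 1.0 / (1.0 + np.exp(-(A @ x + P @ s + R @ h_prev)))
    return s, h

# Toy dimensions and randomly initialized parameters (illustrative only)
d_in, d_s, d_h = 5, 3, 4
rng = np.random.default_rng(0)
params = {
    "A": rng.normal(size=(d_h, d_in)) * 0.1,
    "B": rng.normal(size=(d_s, d_in)) * 0.1,
    "P": rng.normal(size=(d_h, d_s)) * 0.1,
    "R": rng.normal(size=(d_h, d_h)) * 0.1,
    "alpha": 0.95,  # high alpha => slowly drifting context state
}
s, h = np.zeros(d_s), np.zeros(d_h)
for _ in range(3):
    s, h = scrn_step(rng.normal(size=d_in), s, h, params)
```

The structural constraint is visible in the update for `s`: it is a fixed-coefficient moving average rather than a fully learned recurrence, which is what lets the context layer retain longer-range information cheaply.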
Original language: English
Number of pages: 12
Publication status: Published - Aug 20, 2018
Event: International Conference on Computational Linguistics - Santa Fe, New Mexico, United States
Duration: Aug 20, 2018 - Aug 26, 2018
Conference number: 27


Conference: International Conference on Computational Linguistics
Abbreviated title: COLING
Country: United States
