TY - GEN
T1 - Reproducing and Regularizing the SCRN Model
AU - Kabdolov, Olzhas
AU - Assylbekov, Zhenisbek
AU - Takhanov, Rustem
N1 - Publisher Copyright:
© 2018 COLING 2018 - 27th International Conference on Computational Linguistics, Proceedings. All rights reserved.
PY - 2018
Y1 - 2018
N2 - We reproduce the Structurally Constrained Recurrent Network (SCRN) model, and then regularize it using existing widespread techniques such as naïve dropout, variational dropout, and weight tying. We show that, when regularized and optimized appropriately, the SCRN model can achieve performance comparable with the ubiquitous LSTM model on the language modeling task on English data, while outperforming it on non-English data.
AB - We reproduce the Structurally Constrained Recurrent Network (SCRN) model, and then regularize it using existing widespread techniques such as naïve dropout, variational dropout, and weight tying. We show that, when regularized and optimized appropriately, the SCRN model can achieve performance comparable with the ubiquitous LSTM model on the language modeling task on English data, while outperforming it on non-English data.
UR - http://www.scopus.com/inward/record.url?scp=85119441329&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85119441329&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85119441329
T3 - COLING 2018 - 27th International Conference on Computational Linguistics, Proceedings
SP - 1705
EP - 1716
BT - COLING 2018 - 27th International Conference on Computational Linguistics, Proceedings
A2 - Bender, Emily M.
A2 - Derczynski, Leon
A2 - Isabelle, Pierre
PB - Association for Computational Linguistics (ACL)
T2 - 27th International Conference on Computational Linguistics, COLING 2018
Y2 - 20 August 2018 through 26 August 2018
ER -