TY - GEN
T1 - Syllable-aware Neural Language Models
T2 - 2017 Conference on Empirical Methods in Natural Language Processing, EMNLP 2017
AU - Assylbekov, Zhenisbek
AU - Takhanov, Rustem
AU - Myrzakhmetov, Bagdat
AU  - Washington, Jonathan N.
PY - 2017/1/1
Y1 - 2017/1/1
N2 - Syllabification does not seem to improve word-level RNN language modeling quality when compared to character-based segmentation. However, our best syllable-aware language model, achieving performance comparable to the competitive character-aware model, has 18%–33% fewer parameters and is trained 1.2–2.2 times faster.
UR - http://www.scopus.com/inward/record.url?scp=85060581134&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85060581134&partnerID=8YFLogxK
M3 - Conference contribution
T3 - EMNLP 2017 - Conference on Empirical Methods in Natural Language Processing, Proceedings
SP - 1866
EP - 1872
BT - EMNLP 2017 - Conference on Empirical Methods in Natural Language Processing, Proceedings
PB - Association for Computational Linguistics (ACL)
Y2 - 9 September 2017 through 11 September 2017
ER -