Abstract
This paper presents a study that compares the non-manual markers of polar and wh-questions with those of statements in Kazakh-Russian Sign Language (KRSL), using a dataset collected for NLP tasks. The primary focus of the study is to demonstrate the utility of computer vision solutions for the linguistic analysis of non-manuals in sign languages, although additional corrections are required to account for biases in the output. To this end, we analyzed recordings of 10 triplets of sentences produced by 9 native signers using both manual annotation and computer vision solutions (such as OpenFace). We describe how we applied and improved the computer vision solution, and briefly present the results of the linguistic analysis.
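As a rough illustration of the kind of analysis the abstract describes, the sketch below shows one way OpenFace output could be aggregated by sentence type. This is not the authors' code: the file names and grouping are hypothetical, and only the general shape of OpenFace 2.x per-frame CSVs (action-unit intensity columns such as `AU01_r`/`AU02_r`, head-pose columns such as `pose_Rx`, and a `success` flag) is assumed.

```python
# Minimal sketch: compare brow-raise intensity and head pitch across
# statement, polar-question, and wh-question recordings using OpenFace
# per-frame CSV output. File names below are illustrative assumptions.
import pandas as pd

files = {
    "statement": ["signer1_statement.csv"],
    "polar":     ["signer1_polar.csv"],
    "wh":        ["signer1_wh.csv"],
}

summaries = {}
for sent_type, paths in files.items():
    frames = []
    for path in paths:
        df = pd.read_csv(path)
        df.columns = df.columns.str.strip()   # OpenFace headers often carry leading spaces
        df = df[df["success"] == 1]           # keep only frames where tracking succeeded
        frames.append(df)
    all_frames = pd.concat(frames, ignore_index=True)
    # Mean inner/outer brow raiser intensity and head pitch per sentence type.
    summaries[sent_type] = all_frames[["AU01_r", "AU02_r", "pose_Rx"]].mean()

print(pd.DataFrame(summaries).round(3))
```

In practice, such raw averages would still need the bias corrections mentioned in the abstract before drawing linguistic conclusions.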
Original language | English |
---|---|
Pages | 49-59 |
Number of pages | 11 |
Publication status | Published - 2021 |
Event | 1st International Workshop on Automatic Translation for Signed and Spoken Languages, AT4SSL 2021 - Virtual, Online, United States; Duration: Aug 16 2021 → Aug 20 2021 |
Conference
Conference | 1st International Workshop on Automatic Translation for Signed and Spoken Languages, AT4SSL 2021 |
---|---|
Country/Territory | United States |
City | Virtual, Online |
Period | 8/16/21 → 8/20/21 |
ASJC Scopus subject areas
- Language and Linguistics
- Artificial Intelligence
- Software
- Linguistics and Language