Attention-Based BiGRU Model for Real-time Sign Language Translation Applications

Sam Xuan Nguyen

Abstract


Sign language applications are a key to solving communication problems between the deaf community and hearing people. Current research usually focuses on improving communication access between deaf and hearing people. In this study, we consider the real-time, deaf-to-hearing communication context, and thus we propose an attention-based bidirectional gated recurrent unit (A-BiGRU) model that demonstrates a trade-off between precision and computational efficiency, including training time, testing time, and system resources, on the extended American Sign Language Gloss (E-ASLG-PC12) dataset. The results show that our proposal achieves significant improvements in training time, testing time, and system-resource usage compared to the attention-based bidirectional long short-term memory (A-BiLSTM) model and other modern sequence-to-sequence models. Moreover, the precision of our proposed model approaches that of the more complex A-BiLSTM architecture. We therefore believe that our proposed model is a suitable and promising candidate for real-time translation applications as well as for lower-power computing devices that address deaf-to-hearing communication.
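To make the architecture concrete, the following is a minimal NumPy sketch of a bidirectional GRU encoder with additive attention, the general mechanism named in the abstract. It is not the authors' implementation: the weights are random, the dimensions are illustrative, and all function and variable names are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, W, U, b):
    # Standard GRU equations: update gate z, reset gate r, candidate state n.
    z = sigmoid(x @ W[0] + h @ U[0] + b[0])
    r = sigmoid(x @ W[1] + h @ U[1] + b[1])
    n = np.tanh(x @ W[2] + (r * h) @ U[2] + b[2])
    return (1 - z) * n + z * h

def init_params(d_in, d_h):
    # Randomly initialized weights; stacked for the three gates.
    W = rng.normal(0, 0.1, (3, d_in, d_h))
    U = rng.normal(0, 0.1, (3, d_h, d_h))
    b = np.zeros((3, d_h))
    return W, U, b

def bigru_attention(xs, d_h):
    # Run one GRU forward and one backward over the sequence,
    # concatenate the hidden states, then pool them with attention.
    d_in = xs.shape[1]
    fw, bw = init_params(d_in, d_h), init_params(d_in, d_h)
    h_f, h_b = np.zeros(d_h), np.zeros(d_h)
    fwd, bwd = [], []
    for x in xs:                      # forward pass over time
        h_f = gru_cell(x, h_f, *fw)
        fwd.append(h_f)
    for x in xs[::-1]:                # backward pass over time
        h_b = gru_cell(x, h_b, *bw)
        bwd.append(h_b)
    H = np.concatenate([np.stack(fwd), np.stack(bwd[::-1])], axis=1)  # (T, 2*d_h)
    # Additive attention: score each timestep, softmax, weighted sum.
    Wa = rng.normal(0, 0.1, (2 * d_h, d_h))
    v = rng.normal(0, 0.1, d_h)
    scores = np.tanh(H @ Wa) @ v
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()
    context = alpha @ H               # attention-weighted summary vector
    return context, alpha

T, d_in, d_h = 5, 8, 16
context, alpha = bigru_attention(rng.normal(size=(T, d_in)), d_h)
```

The attention weights `alpha` sum to one and select which timesteps dominate the summary vector `context`, which a decoder would then consume; the GRU's two gates (versus the LSTM's three) are what reduce the parameter count and computation relative to an A-BiLSTM of the same width.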

 

Keywords– computational efficiency (CE), attention-based bidirectional gated recurrent unit (A-BiGRU), sign language translation applications (SLTA).






DOI: http://dx.doi.org/10.21553/rev-jec.364

Copyright (c) 2024 REV Journal on Electronics and Communications


ISSN: 1859-378X
