Research on Intelligent Translation Method for Short Texts Based on Improved RNN Algorithm
DOI: https://doi.org/10.5755/j01.itc.53.3.34619

Keywords: Improved RNN, Short text, Intelligent translation, Attention mechanism, Encoder-decoder

Abstract
As internationalization accelerates and communication between countries and peoples grows more important, the need for language translation becomes more urgent. Machine translation has attracted wide attention because it requires less labor and material cost than human translation, yet current systems are still far from fully automatic, high-quality translation. This study takes characters as the input to the translation model and proposes CRNN-embed, a word-vector generation method with an embedded convolutional recurrent neural network (CRNN). The model adopts a bidirectional GRU encoder-decoder structure and introduces two attention mechanisms, CA-CrossAtt and MC-SelfAtt. After the attention mechanisms were introduced, the BLEU score of the CRNN-embed model improved by 2.57 percentage points over the baseline system. The model's BLEU scores exceeded those of the RNN-search and RNN-embed models by 0.43 and 0.96 percentage points on char1, and by 2.02 and 3.06 percentage points on char2, respectively. As the dataset grew, the model's BLEU scores and n-gram accuracy also increased, and its translations improved markedly. The model's accuracy and fluency are higher than those of traditional neural machine translation models, and it outperformed comparable translation systems.
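The BLEU gains quoted above rest on the standard modified n-gram precision metric. As a minimal illustration only (not the paper's evaluation code, which is not shown here), a sentence-level BLEU with add-one smoothing and a brevity penalty can be sketched in plain Python:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token sequence."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU: geometric mean of smoothed n-gram precisions
    (n = 1..max_n) times a brevity penalty. Add-one smoothing is an
    illustrative choice, not necessarily the smoothing used in the paper."""
    cand = candidate.split()
    ref = reference.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        c_ngrams = ngrams(cand, n)
        r_ngrams = ngrams(ref, n)
        # clipped overlap: each candidate n-gram counts at most as often
        # as it appears in the reference
        overlap = sum(min(c, r_ngrams[g]) for g, c in c_ngrams.items())
        total = sum(c_ngrams.values())
        log_prec += math.log((overlap + 1) / (total + 1))  # +1 smoothing avoids log(0)
    # brevity penalty discourages overly short candidates
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(log_prec / max_n)
```

A perfect match scores 1.0, and a "percentage point" improvement as reported in the abstract corresponds to a 0.01 change in this score when BLEU is expressed on a 0-100 scale.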
License
Copyright terms are indicated in the Republic of Lithuania Law on Copyright and Related Rights, Articles 4-37.