Research on Intelligent Translation Method for Short Texts Based on Improved RNN Algorithm

Authors

  • Yan Wang Chinese Language and Literature Department, Northwest Minzu University, Lanzhou 730030, Gansu, China; School of Foreign Languages, Lanzhou University of Arts and Science, Lanzhou 730000, Gansu, China
  • Ying Wang Foreign Language Department, Lanzhou Technology and Business College, Lanzhou, 730101, Gansu, China

DOI:

https://doi.org/10.5755/j01.itc.53.3.34619

Keywords:

Improved RNN, Short text, Intelligent translation, Attention mechanism, Encoder-decoder

Abstract

As internationalization accelerates and communication between countries and peoples grows in importance, the need for language translation becomes more urgent. Machine translation has received much attention because it requires less labor and material than human translation; however, current machine translation is still far from fully automated, high-quality output. This study takes characters as the input of the translation model and proposes a word vector generation method that embeds a CRNN, namely CRNN-embed. The model adopts a bidirectional GRU encoder-decoder structure and introduces two attention mechanisms, CA-Cross Att and MC-SefAtt. After the attention mechanisms were introduced, the BLEU score of the CRNN-embed model improved by 2.57 percentage points over the baseline system. The BLEU scores of the proposed model were higher than those of the RNN-search and RNN-embed models, by 0.43 and 0.96 percentage points on char1, and by 2.02 and 3.06 percentage points on char2, respectively. As the size of the dataset increased, the model's BLEU scores and n-gram accuracy also increased, and its translations improved noticeably. The accuracy and fluency of the proposed model exceed those of the traditional neural machine translation model, and it achieved better translation results than comparable translation models.
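The abstract outlines a character-level encoder-decoder: CRNN-embed word vectors feed a bidirectional GRU encoder, and a GRU decoder attends over the encoder states. The paper's exact CRNN-embed block and its CA-Cross Att / MC-SefAtt mechanisms are not detailed here, so the following is only a minimal PyTorch sketch under stated assumptions: a single Conv1d stands in for the CRNN-embed stage, and a generic additive attention stands in for the paper's attention mechanisms; all class names, dimensions, and hyperparameters are illustrative, not the authors' implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CharEncoder(nn.Module):
        """Character-level encoder: embedding -> 1D conv (simplified stand-in for
        the CRNN-embed word-vector stage) -> bidirectional GRU."""
        def __init__(self, vocab_size, emb_dim=64, hidden=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            # One Conv1d layer as a hypothetical stand-in for the CRNN-embed block.
            self.conv = nn.Conv1d(emb_dim, emb_dim, kernel_size=3, padding=1)
            self.gru = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)

        def forward(self, char_ids):                     # (B, T) character ids
            x = self.embed(char_ids)                     # (B, T, E)
            x = F.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)  # (B, T, E)
            outputs, _ = self.gru(x)                     # (B, T, 2H), both directions
            return outputs

    class AttnDecoder(nn.Module):
        """GRU decoder with additive (Bahdanau-style) attention over encoder states."""
        def __init__(self, vocab_size, emb_dim=64, hidden=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            self.attn_W = nn.Linear(2 * hidden + hidden, hidden)
            self.attn_v = nn.Linear(hidden, 1, bias=False)
            self.gru = nn.GRU(emb_dim + 2 * hidden, hidden, batch_first=True)
            self.out = nn.Linear(hidden, vocab_size)

        def forward(self, prev_token, prev_state, enc_outputs):
            # prev_token: (B, 1), prev_state: (1, B, H), enc_outputs: (B, T, 2H)
            query = prev_state[-1].unsqueeze(1).expand(-1, enc_outputs.size(1), -1)
            scores = self.attn_v(torch.tanh(
                self.attn_W(torch.cat([enc_outputs, query], dim=-1))))  # (B, T, 1)
            weights = F.softmax(scores, dim=1)
            context = (weights * enc_outputs).sum(dim=1, keepdim=True)  # (B, 1, 2H)
            emb = self.embed(prev_token)                                # (B, 1, E)
            output, state = self.gru(torch.cat([emb, context], dim=-1), prev_state)
            return self.out(output.squeeze(1)), state, weights

    # Toy forward pass on random character ids (one decoding step).
    if __name__ == "__main__":
        enc, dec = CharEncoder(100), AttnDecoder(100)
        src = torch.randint(1, 100, (2, 15))         # batch of 2, 15 characters each
        enc_out = enc(src)
        state = torch.zeros(1, 2, 128)               # initial decoder hidden state
        logits, state, attn = dec(torch.full((2, 1), 1), state, enc_out)
        print(logits.shape, attn.shape)              # (2, 100) and (2, 15, 1)

Because the encoder is bidirectional, each encoder state is twice the hidden width, which is why the decoder's attention layer and GRU input are sized for 2H; the real model would additionally stack the paper's two attention mechanisms and the full CRNN-embed vector generator in place of these simplified components.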

Published

2024-09-25

Issue

Section

Articles