  • Bilingual Continuous-Space ...
    Rui Wang; Hai Zhao; Bao-Liang Lu; Masao Utiyama; Eiichiro Sumita

    IEEE/ACM Transactions on Audio, Speech, and Language Processing, July 2015, Volume 23, Issue 7
    Journal Article

    Larger n-gram language models (LMs) perform better in statistical machine translation (SMT). However, existing approaches to constructing larger LMs have two main drawbacks: 1) it is not easy to obtain large additional corpora in the same domain as the bilingual parallel corpora used in SMT; 2) most previous studies focus on monolingual information from the target corpora only, and redundant n-grams have not been fully utilized in SMT. Recently, continuous-space language models (CSLMs), and neural network language models (NNLMs) in particular, have shown great improvements in the accuracy of the probability estimates for predicting target words. However, most of these CSLM and NNLM approaches still consider monolingual information only or require additional corpora. In this paper, we propose a novel neural-network-based bilingual LM growing method. Compared with existing approaches, the proposed method enables us to use the bilingual parallel corpus itself for LM growing in SMT. The results show that our method significantly outperforms existing approaches in both SMT performance and computational efficiency.
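
    The abstract stays at a high level; as a rough illustration of the kind of continuous-space LM it refers to, the sketch below shows how a minimal Bengio-style feed-forward NNLM assigns a smoothed probability to a target word given its n-gram context. This is not the authors' method: the toy vocabulary, dimensions, and all names here are hypothetical, and training is omitted.

    ```python
    # Minimal feed-forward NNLM sketch (Bengio-style): embed the (n-1)-word
    # context, pass it through a tanh hidden layer, and softmax over the
    # vocabulary. Hypothetical toy setup, untrained; for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)

    VOCAB = ["<s>", "the", "cat", "sat", "</s>"]
    V = len(VOCAB)   # vocabulary size
    N = 3            # n-gram order: predict a word from 2 context words
    D = 8            # embedding dimension
    H = 16           # hidden layer size

    # Randomly initialized parameters (training loop omitted for brevity).
    C = rng.normal(0, 0.1, (V, D))              # word embedding table
    W_h = rng.normal(0, 0.1, ((N - 1) * D, H))  # context -> hidden weights
    W_o = rng.normal(0, 0.1, (H, V))            # hidden -> output weights

    def next_word_probs(context_ids):
        """P(w | context) for every word w: embed -> tanh -> softmax."""
        x = np.concatenate([C[i] for i in context_ids])  # concat embeddings
        h = np.tanh(x @ W_h)
        logits = h @ W_o
        e = np.exp(logits - logits.max())                # stable softmax
        return e / e.sum()

    # Score P(sat | the cat): the model returns a smoothed probability even
    # for n-grams never seen in training, which is the advantage over
    # count-based n-gram LMs that the abstract alludes to.
    ctx = [VOCAB.index("the"), VOCAB.index("cat")]
    p = next_word_probs(ctx)
    print(f"P(sat | the cat) = {p[VOCAB.index('sat')]:.4f}")
    ```

    The paper's contribution, per the abstract, is to drive such a model with the bilingual parallel corpus already used in SMT rather than with additional monolingual data; the sketch above only illustrates the underlying continuous-space scoring step.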