Peer-reviewed · Open access
  • Evaluating Neural Networks’...
    Mathur, Vidhu; Dadu, Tanvi; Aggarwal, Swati

    Applied Sciences, 07/2024, Volume: 14, Issue: 13
    Journal Article

    Cross-lingual transfer learning using multilingual models has shown promise for improving performance on natural language processing tasks with limited training data. However, translation can introduce superficial patterns that negatively impact model generalization. This paper evaluates two state-of-the-art multilingual models, the Cross-Lingual Model based on the Robustly Optimized BERT Pretraining Approach (XLM-RoBERTa) and the Multilingual Bidirectional and Auto-Regressive Transformer (mBART), on the Cross-lingual Natural Language Inference (XNLI) task using both original and machine-translated evaluation sets. Our analysis demonstrates that translation can facilitate cross-lingual transfer learning, but maintaining linguistic patterns is critical. The results provide insights into the strengths and limitations of state-of-the-art multilingual natural language processing architectures for cross-lingual understanding.
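
    For readers unfamiliar with the XNLI setup the abstract describes, the sketch below shows how a premise/hypothesis pair can be scored with an XNLI-fine-tuned XLM-RoBERTa checkpoint via Hugging Face Transformers. This is a minimal illustration, not the authors' evaluation pipeline: the checkpoint name "joeddav/xlm-roberta-large-xnli" and the French example sentences are assumptions chosen only to show the mechanics.

    ```python
    # Minimal sketch (assumed setup, not the paper's pipeline): score one XNLI-style
    # premise/hypothesis pair with an XNLI-fine-tuned XLM-RoBERTa checkpoint.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    model_name = "joeddav/xlm-roberta-large-xnli"  # assumed public checkpoint
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name)
    model.eval()

    premise = "Le chat dort sur le canapé."          # French premise
    hypothesis = "Un animal est en train de dormir."  # hypothesis (expected: entailment)

    # Encode the sentence pair and run a forward pass without gradient tracking.
    inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits

    # Label names/order come from the checkpoint's config; do not hardcode them.
    probs = logits.softmax(dim=-1).squeeze()
    labels = [model.config.id2label[i] for i in range(probs.shape[0])]
    print(dict(zip(labels, probs.tolist())))
    ```

    Looping this scoring step over the original XNLI evaluation set and over a machine-translated copy, then comparing accuracies, is one straightforward way to probe the translation effects the abstract discusses.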