Peer-reviewed Open Access
  • Multi-Self-Attention for As...
    Zhang, Xuelei; Song, Xinyu; Feng, Ao; Gao, Zhengjie

Mathematical Problems in Engineering, 11/2021, Volume: 2021
    Journal Article

    Multilabel classification is one of the most challenging tasks in natural language processing, posing greater technical difficulties than single-label classification while also arising more often in real applications. For each individual label, the text as a whole has a different focus or component distribution, so the model must make full use of local information in the sentence. Attention, a mechanism widely adopted in natural language processing, is a natural choice for this issue. This paper proposes a multilayer self-attention model that handles aspect-category attention and word attention at different granularities. Combined with the BERT pretraining model, it achieves competitive performance on aspect category detection and on the classification of electronic medical records.
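    The abstract's core ingredients — self-attention over token representations followed by independent per-label (sigmoid) scoring — can be sketched minimally as follows. This is an illustrative reconstruction, not the paper's actual model: the embedding dimensions, weight matrices, and mean-pooling step are all assumptions, and a real system would take the token vectors from a pretrained BERT encoder rather than random data.

    ```python
    import numpy as np

    def softmax(x, axis=-1):
        # Numerically stable softmax.
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def self_attention(X, Wq, Wk, Wv):
        # Scaled dot-product self-attention over a sequence of token vectors X.
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        scores = Q @ K.T / np.sqrt(K.shape[-1])
        return softmax(scores, axis=-1) @ V

    rng = np.random.default_rng(0)
    n_tokens, d, n_labels = 5, 8, 3          # hypothetical sizes
    X = rng.normal(size=(n_tokens, d))       # stand-in for BERT token embeddings
    Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

    H = self_attention(X, Wq, Wk, Wv)        # contextualized token representations
    W_out = rng.normal(size=(d, n_labels))
    logits = H.mean(axis=0) @ W_out          # pool, then score each label
    probs = 1 / (1 + np.exp(-logits))        # sigmoid: labels are not mutually exclusive
    ```

    The sigmoid (rather than softmax) output is what makes this multilabel: each label's probability is computed independently, so any subset of labels can be active for one text.
    
    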