E-resources
Peer-reviewed
Open access
Wang, Qinxuan; Qin, Peinuan; Zhang, Yue; Wei, Xueyao; Gao, Meiguo
IEEE Access, 2022, Volume: 10. Journal Article
In this paper, we propose a "Multi-Level Attention Network" (MLAN), which defines a multi-level structure, including layer, block, and group levels, to obtain hierarchical attention, and combines the corresponding residual information for better feature extraction. We also construct a shared mask attention (SMA) module, which significantly reduces the number of parameters compared with conventional attention methods. Based on the MLAN and SMA, we further investigate a variety of information fusion modules for better feature fusion at different levels. We conducted classification experiments on ResNet backbones of different depths, and the results show that our method yields a significant performance improvement over the backbone on the CIFAR10 and CIFAR100 datasets. Meanwhile, compared with mainstream attention methods, our MLAN achieves higher accuracy with fewer parameters and lower computational complexity. We also visualize some intermediate feature maps and explain why our MLAN performs well.
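The abstract's core ideas, a spatial attention mask whose parameters are shared across levels (the SMA module) and residual fusion of attended and original features, can be illustrated with a minimal NumPy sketch. Note this is an assumption-laden toy, not the paper's implementation: the function names (`shared_mask_attention`, `mlan_like_forward`), the sigmoid gating, and the simple additive residual fusion are all hypothetical stand-ins for the architecture the abstract only names.

```python
import numpy as np

def shared_mask_attention(feature, mask_weights):
    """Gate a (C, H, W) feature map with a spatial mask produced by
    1x1 channel weights. The SAME `mask_weights` vector is reused at
    every level, sketching how parameter sharing can cut the cost of
    per-level attention (hypothetical simplification of SMA)."""
    # Collapse channels with the shared weights into one spatial map.
    mask = np.tensordot(mask_weights, feature, axes=(0, 0))  # (H, W)
    mask = 1.0 / (1.0 + np.exp(-mask))                       # sigmoid gate in (0, 1)
    return feature * mask                                    # broadcast over channels

def mlan_like_forward(levels, mask_weights):
    """Toy multi-level pass: attend each level's features with the shared
    mask, then add back the ungated features as residual information."""
    return [shared_mask_attention(f, mask_weights) + f for f in levels]

# Three "levels" of 4-channel 8x8 features, gated by one shared weight vector.
levels = [np.ones((4, 8, 8)) for _ in range(3)]
outputs = mlan_like_forward(levels, np.zeros(4))
```

Because the mask parameters are a single vector reused everywhere, the attention cost stays constant as levels are added, which is the parameter-saving property the abstract attributes to SMA.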