E-resources
Peer reviewed
Open access
Yu, Chao; Zhou, Yang; Cui, Xiaolong
Mathematics (Basel), 11/2023, Volume 11, Issue 21. Journal article.
With continuous progress in science and technology, large amounts of data are produced in every field, at any time and in any place. These data are unlabeled, and manual labeling is time-consuming and laborious. This paper therefore introduces a distributed semi-supervised labeling framework. The framework addresses the issue of missing data by proposing an attribute-filling method based on subspace learning. It further presents a distributed semi-supervised learning strategy that trains sub-models (private models) within each sub-system. Finally, it develops a distributed graph convolutional neural network fusion technique with enhanced interpretability, grounded in the attention mechanism: importance weights are assigned to the edges of each layer of the graph neural network based on the sub-models and public data, enabling distributed and interpretable graph convolutional attention. Extensive experiments on public datasets demonstrate the superiority of the proposed scheme over state-of-the-art baselines, achieving a 50% reduction in loss compared to the original approach.
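The abstract's core mechanism, assigning learned importance weights to graph edges before aggregating neighbor features, is the idea behind graph attention layers. The paper's exact distributed architecture is not reproduced here; the following is only a minimal single-layer sketch of attention-weighted neighbor aggregation, with all names (`attention_layer`, the weight matrix `W`, the attention vector `a`) being illustrative assumptions rather than the authors' code.

```python
import numpy as np

def attention_layer(H, A, W, a, leaky=0.2):
    """One graph-attention layer: each edge (i, j) receives an importance
    score computed from the transformed features of its endpoints; scores
    are softmax-normalized per neighborhood and used to aggregate features."""
    Z = H @ W                          # linearly transformed node features
    n = A.shape[0]
    scores = np.full((n, n), -np.inf)  # -inf masks non-edges in the softmax
    for i in range(n):
        for j in range(n):
            if A[i, j]:
                e = a @ np.concatenate([Z[i], Z[j]])
                scores[i, j] = e if e > 0 else leaky * e  # LeakyReLU
    # softmax over each node's neighborhood -> attention weights per edge
    alpha = np.exp(scores - scores.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return alpha, np.maximum(alpha @ Z, 0)  # ReLU on aggregated features

# toy graph: 3 nodes, adjacency with self-loops (assumed, so every
# neighborhood is non-empty and the softmax is well defined)
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]])
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))   # input node features
W = rng.normal(size=(4, 2))   # feature transform
a = rng.normal(size=4)        # attention vector over concatenated pairs
alpha, H_next = attention_layer(H, A, W, a)
print(np.allclose(alpha.sum(axis=1), 1.0))  # each neighborhood's weights sum to 1
```

The per-edge weights in `alpha` are what makes such a layer interpretable: inspecting a row shows how much each neighbor contributed to a node's updated representation, which is the property the abstract attributes to its fusion technique.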