  • Zhang, Hongwei; Tao, Meixia; Shi, Yuanming; Bi, Xiaoyan

    ICC 2022 - IEEE International Conference on Communications, 2022-May-16
    Conference Proceeding

    Federated multi-task learning (FMTL) is a promising edge learning framework for fitting data with non-independent and non-identical distributions (non-i.i.d.) by exploiting the correlations among personalized models. In many practical wireless systems, the sensory data distribution is not only heterogeneous but also non-stationary due to the mobility of terminals and the randomness of link connections. Non-stationary heterogeneous data may lead to model divergence and staleness in the training stage and poor accuracy in the inference stage. In this paper, we design an adaptive FMTL framework that can work in a non-stationary environment. We propose to optimize the model update scheme and the cluster splitting scheme in the training stage to accelerate model convergence when the training data are non-stationary. We further design a low-complexity model selection scheme for both the training and inference stages to choose the best model for fitting the current data. The proposed framework is validated in two scenarios: linear regression and graph neural network (GNN)-based power control in wireless device-to-device (D2D) networks. Both sets of numerical results demonstrate that the proposed framework can accelerate model training convergence and reduce computational complexity while ensuring model accuracy.
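
    The abstract does not specify the framework's algorithmic details; the sketch below is only an illustration of the general idea of loss-based model selection over a non-stationary data stream. The toy linear-regression setting, the fixed pool of cluster models, the learning rate, and the single-process simulation (no actual federation or cluster splitting) are all assumptions for illustration, not taken from the paper.

    import numpy as np

    # Toy sketch (not the paper's algorithm): a few "cluster" models are
    # maintained; for each batch from a non-stationary stream, the model with
    # the lowest loss on that batch is selected and updated by one SGD step.

    rng = np.random.default_rng(0)
    dim, n_clusters, lr = 4, 3, 0.05

    # Hypothetical cluster models (one weight vector per cluster).
    models = [rng.normal(size=dim) for _ in range(n_clusters)]

    def batch(true_w, n=32):
        """Draw a batch from a ground-truth linear model (simulated drift source)."""
        x = rng.normal(size=(n, dim))
        y = x @ true_w + 0.01 * rng.normal(size=n)
        return x, y

    def mse(w, x, y):
        return float(np.mean((x @ w - y) ** 2))

    # Non-stationary stream: the underlying model changes halfway through.
    truths = [rng.normal(size=dim), rng.normal(size=dim)]

    for t in range(200):
        x, y = batch(truths[0] if t < 100 else truths[1])

        # Model selection: pick the cluster model that best fits the current data.
        losses = [mse(w, x, y) for w in models]
        k = int(np.argmin(losses))

        # Model update: one gradient step on the selected cluster model only.
        grad = 2.0 * x.T @ (x @ models[k] - y) / len(y)
        models[k] -= lr * grad

        if t % 50 == 0:
            print(f"t={t:3d} selected cluster {k}, loss {losses[k]:.3f}")

    Selecting and updating only the best-fitting model keeps per-step cost low, which is in the spirit of the low-complexity selection scheme the abstract describes, though the actual selection and splitting rules in the paper may differ.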