  • Distributed Gaussian Proces...
    Xie, Ang; Yin, Feng; Xu, Yue; Ai, Bo; Chen, Tianshi; Cui, Shuguang

    IEEE Signal Processing Letters, 08/2019, Volume: 26, Issue: 8
    Journal Article

    Hyperparameter optimization remains the core issue in Gaussian processes (GPs) for machine learning. The classical hyperparameter optimization scheme based on maximum likelihood estimation is impractical for big data processing, as its computational complexity is cubic in the number of data points. With the rapid development of efficient parallel data processing on ever cheaper and more powerful hardware, distributed models and algorithms will become ubiquitous. In this letter, we propose an alternative distributed GP hyperparameter optimization scheme using the efficient proximal alternating direction method of multipliers (ADMM) proposed by Hong et al. in 2016, and we derive a closed-form solution for the local sub-problems. In contrast to existing schemes of a similar kind, the proposed scheme balances the computational load on each local machine against the communication overhead required for global consensus on the local hyperparameter estimates. The scheme can work in either a synchronous or an asynchronous manner and is therefore flexible enough to be deployed on different computing facilities. Experimental results with both synthetic and real datasets validate the outstanding performance of the proposed scheme.
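
    The abstract rests on two computational facts that a small sketch can make concrete: evaluating the GP log marginal likelihood costs O(n^3) because of a Cholesky factorization of the n-by-n kernel matrix, and the distributed scheme has a consensus structure in which each machine works on its own data shard. The NumPy sketch below is not the authors' algorithm; it is a minimal, generic consensus-style proximal-ADMM loop, assuming an RBF kernel, a fixed noise variance, numerical gradients in place of the paper's analytic ones, and illustrative values for the penalty parameters rho and tau. Because the local likelihood is linearized at the current iterate, the local sub-problem is quadratic and admits a one-step closed-form minimizer, mirroring the kind of closed-form local solution the letter derives.

    import numpy as np

    def rbf_kernel(X1, X2, log_theta):
        # Squared-exponential kernel; log_theta = [log signal variance, log lengthscale].
        sig2, ell = np.exp(log_theta)
        d2 = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
              - 2.0 * X1 @ X2.T)
        return sig2 * np.exp(-0.5 * d2 / ell**2)

    def local_nll(log_theta, X, y, noise_var=0.1):
        # Negative log marginal likelihood of a GP on one machine's data shard.
        # The Cholesky factorization below is the O(n^3) step that makes
        # centralized maximum likelihood impractical for big data.
        n = len(y)
        K = rbf_kernel(X, X, log_theta) + noise_var * np.eye(n)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        return 0.5 * y @ alpha + np.sum(np.log(np.diag(L))) + 0.5 * n * np.log(2 * np.pi)

    def num_grad(f, x, eps=1e-5):
        # Central finite differences; the paper uses analytic gradients instead.
        g = np.zeros_like(x)
        for j in range(x.size):
            e = np.zeros_like(x); e[j] = eps
            g[j] = (f(x + e) - f(x - e)) / (2.0 * eps)
        return g

    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(400, 1))
    y = np.sin(2.0 * X[:, 0]) + 0.3 * rng.standard_normal(400)
    shards = np.array_split(np.arange(400), 4)   # four "local machines"

    rho, tau = 5.0, 5.0                          # illustrative penalty parameters
    theta = [np.zeros(2) for _ in shards]        # local log-hyperparameters
    lam = [np.zeros(2) for _ in shards]          # dual variables
    z = np.zeros(2)                              # global consensus variable

    for k in range(200):
        for i, idx in enumerate(shards):
            g = num_grad(lambda t, idx=idx: local_nll(t, X[idx], y[idx]), theta[i])
            # Closed-form minimizer of the linearized, proximally regularized
            # local sub-problem: quadratic in theta_i, so solvable in one step.
            theta[i] = (rho * z + tau * theta[i] - g - lam[i]) / (rho + tau)
        z = np.mean([theta[i] + lam[i] / rho for i in range(len(shards))], axis=0)
        for i in range(len(shards)):
            lam[i] += rho * (theta[i] - z)

    print("consensus log-hyperparameters:", z)

    This sketch runs the synchronous variant: every machine finishes its local update before the consensus variable z is refreshed. The asynchronous mode the letter mentions would instead let each machine update against the latest z it has seen, trading some consensus accuracy per round for less idle time.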