Full text
Peer reviewed
  • DFS: Joint data formatting ...
    Yang, Cheng; Zhao, Yangming; Zhao, Gongming; Xu, Hongli

    Computer Networks (Amsterdam, Netherlands : 1999), June 2023, Volume: 229
    Journal Article

    Efficient communication is crucial to Distributed Machine Learning (DML). In this work, we propose an approach that jointly performs Data Formatting and Sparsification (DFS) to optimize communication in DML systems based on the parameter server framework. By doing so, we can reduce the time to transmit (aggregated) gradients between the parameter server and workers, and consequently the time to complete training jobs. More specifically, in DFS, every worker first tries to derive as many blocks with all-zero gradients as possible via sparsification, and then transmits gradients block by block in a streaming fashion. By skipping blocks with all-zero gradients, we reduce the communication cost of gradient transmission. Different from previous work on optimizing communication in DML systems, DFS has three distinct features: (i) it dynamically determines the gradient block size; (ii) it takes into consideration both the data transfer from workers to the parameter server and that from the parameter server to workers; and (iii) it jointly optimizes data formatting and sparsification. In other words, it performs sparsification in a way that helps form more (or larger) all-zero blocks and thus saves more communication cost. By implementing DFS on a real testbed, we find that it can reduce the time to train a ResNet-18 model by 74.12%. Through extensive simulations, we demonstrate that DFS outperforms the state-of-the-art technique, i.e., OmniReduce (Fei et al., 2021), by up to 87.17% in terms of reducing communication cost in DML systems.
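
    The sketch below is only an illustration of the general idea the abstract describes: zeroing small gradient entries, splitting the gradient vector into blocks, and transmitting only the blocks that are not all zero. The block size, threshold, and message format here are hypothetical choices for the example, not the DFS implementation from the paper (which, per the abstract, chooses the block size dynamically and co-optimizes formatting with sparsification).

        # Illustrative sketch: block-wise gradient sparsification with
        # all-zero blocks skipped before transmission. Not the authors' code.
        import numpy as np

        def sparsify_and_pack(gradients: np.ndarray, block_size: int, threshold: float):
            """Zero out small entries, split into fixed-size blocks, and
            return only non-zero blocks together with their block indices."""
            sparse = np.where(np.abs(gradients) >= threshold, gradients, 0.0)
            packed = []
            for start in range(0, len(sparse), block_size):
                block = sparse[start:start + block_size]
                if np.any(block):                  # skip all-zero blocks entirely
                    packed.append((start // block_size, block.copy()))
            return packed

        def unpack(packed, total_len: int, block_size: int) -> np.ndarray:
            """Reconstruct the full (sparsified) gradient vector on the receiver."""
            full = np.zeros(total_len)
            for idx, block in packed:
                start = idx * block_size
                full[start:start + len(block)] = block
            return full

        # Example: most entries fall below the threshold, so most blocks are skipped.
        grads = np.random.randn(1024) * 0.01
        grads[::97] = 1.0                          # a few large gradients survive
        msgs = sparsify_and_pack(grads, block_size=64, threshold=0.1)
        print(f"transmitting {len(msgs)} of {1024 // 64} blocks")
        recovered = unpack(msgs, total_len=1024, block_size=64)

    In this toy setup only the blocks containing a surviving gradient are sent, which is the mechanism by which skipping all-zero blocks cuts the worker-to-server (and, for aggregated gradients, server-to-worker) traffic.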