Weingram A, Li Y, Qi H et al. xCCL: A survey of industry-led collective communication libraries for deep learning. JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY 38(1): 166−195 Jan. 2023. DOI: 10.1007/s11390-023-2894-6.

xCCL: A Survey of Industry-Led Collective Communication Libraries for Deep Learning

Machine learning techniques have become ubiquitous in both industry and academic applications. Increasing model sizes and training data volumes necessitate fast and efficient distributed training approaches. Collective communications greatly simplify inter- and intra-node data transfer and are an essential part of the distributed training process, as information such as gradients must be shared between processing nodes. In this paper, we survey the current state-of-the-art collective communication libraries (namely xCCL, including NCCL, oneCCL, RCCL, MSCCL, ACCL, and Gloo), with a focus on the industry-led ones for deep learning workloads. We investigate the design features of these xCCLs, discuss their use cases in industry deep learning workloads, compare their performance with industry-made benchmarks (i.e., NCCL Tests and PARAM), and discuss key takeaways and interesting observations. We believe our survey sheds light on potential research directions for future xCCL designs.
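The collective most central to gradient sharing in distributed training is all-reduce, and the bandwidth-optimal ring algorithm is a common choice in the libraries surveyed here. The sketch below is a minimal, single-process simulation of a ring all-reduce in plain Python; it is illustrative only and does not reflect the internal implementation of any particular xCCL. Each of `n` ranks holds a vector split into `n` chunks (one scalar per chunk here, for brevity), and after a reduce-scatter phase followed by an all-gather phase, every rank holds the element-wise sum.

```python
def ring_allreduce(data):
    """Simulate a ring all-reduce.

    data[r][c] is the value of chunk c held by rank r (one scalar per
    chunk for simplicity). Returns a new table in which every rank
    holds the element-wise sum across all ranks.
    """
    n = len(data)                       # number of ranks == number of chunks
    data = [list(row) for row in data]  # work on a copy

    # Phase 1: reduce-scatter. In n-1 steps, rank r sends chunk
    # (r - step) mod n to its right neighbour, which accumulates it.
    # Afterwards, rank r holds the full sum for chunk (r + 1) mod n.
    for step in range(n - 1):
        sends = [(r, (r - step) % n, data[r][(r - step) % n])
                 for r in range(n)]     # snapshot sends, then apply
        for r, c, val in sends:
            data[(r + 1) % n][c] += val

    # Phase 2: all-gather. In n-1 steps, each rank forwards the chunk
    # it has a completed sum for, overwriting the neighbour's stale copy.
    for step in range(n - 1):
        sends = [(r, (r + 1 - step) % n, data[r][(r + 1 - step) % n])
                 for r in range(n)]
        for r, c, val in sends:
            data[(r + 1) % n][c] = val

    return data


# Example: 3 ranks, each holding a 3-chunk vector.
result = ring_allreduce([[1, 2, 3],
                         [4, 5, 6],
                         [7, 8, 9]])
# Every rank ends up with the sum [12, 15, 18].
```

Each rank sends roughly 2(n-1)/n of its data in total, independent of the number of ranks, which is why this pattern scales well for large gradient tensors.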