Dual-level Adaptive Correction for Subgraph-based GNN Training with Efficient Strategy Search
Abstract
Graph neural networks (GNNs) have demonstrated remarkable performance across a variety of complex graph-based tasks. Subgraph-wise sampling (SS) methods substantially improve epoch efficiency on large-scale graphs by avoiding the recursive neighbor expansion inherent in node-wise sampling (NS), but they incur higher gradient variance, which hinders convergence and accuracy. To address this challenge, we introduce ECHO, a novel framework designed to achieve rapid training acceleration with accuracy comparable to NS. In the preprocessing stage, ECHO employs a strategy search guided by lightweight variance estimation to curtail SS variance. In the training stage, it adaptively inserts correction epochs to mitigate the residual negative impact of SS variance. To further accelerate training, we present ECHO+, which augments ECHO with an early stopping mechanism within correction epochs, triggered by batch-loss dynamics. Empirical evaluations show that ECHO attains rapid convergence with high accuracy, delivering up to a 3.5x speedup over NS while preserving comparable performance. Furthermore, ECHO surpasses existing SS baselines, achieving state-of-the-art accuracy and up to 11x faster convergence. The enhanced ECHO+ provides an additional acceleration of up to 3.5x over the base ECHO.
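To make the ECHO+ early-stopping idea concrete, the sketch below shows one plausible trigger based on batch-loss dynamics: stop a correction epoch once the mean batch loss over a recent window has stopped improving. The function name, window size, and tolerance are illustrative assumptions, not the paper's actual criterion.

```python
from collections import deque

def make_early_stopper(window=5, tol=1e-3):
    """Return a callable that flags early stopping when the mean batch loss
    over the last `window` batches improves by less than `tol` (relative)
    compared with the preceding window. Illustrative sketch only."""
    losses = deque(maxlen=2 * window)  # rolling history of batch losses

    def should_stop(batch_loss):
        losses.append(float(batch_loss))
        if len(losses) < 2 * window:
            return False  # not enough history to compare two windows yet
        prev = sum(list(losses)[:window]) / window   # older window mean
        curr = sum(list(losses)[window:]) / window   # newer window mean
        # Stop when the loss curve has effectively flattened.
        return (prev - curr) / max(prev, 1e-12) < tol

    return should_stop
```

In use, `should_stop` would be called once per batch inside a correction epoch; a steadily decreasing loss keeps it returning `False`, while a plateau trips it to `True`.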