
GBA: Global Batch Gradients Aggregation for Search Advertising Training

GBA (Global Batch Gradients Aggregation) is a training mode that switches seamlessly between synchronous and asynchronous learning for search-advertising models. By keeping the global batch size constant and combining token-controlled gradient aggregation with staleness management, it retains synchronous-level accuracy while preserving asynchronous efficiency and eliminating manual hyperparameter tuning.

Alimama Tech

This paper introduces GBA (Global Batch Gradients Aggregation), a novel training mode that enables seamless switching between synchronous and asynchronous training for search-advertising models. By maintaining a consistent global batch size during gradient aggregation, GBA avoids the precision loss and parameter divergence that otherwise occur during mode transitions. The approach combines token-controlled gradient aggregation with staleness management to ensure stability in mixed-resource environments. Experiments show that GBA matches the accuracy of synchronous training while preserving asynchronous efficiency in large-scale sparse-model training. The framework switches modes automatically without hyperparameter adjustment, a significant advantage for industrial deployment.
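The core idea of a constant global batch can be sketched in a few lines: each worker's local gradient is weighted by the number of samples (tokens) it was computed on, so the aggregated update is equivalent to one computed over a single batch of fixed global size. This is a minimal illustration under assumed conventions (local gradients in mean form, token counts summing exactly to the global batch), not the paper's exact implementation:

```python
import numpy as np

def aggregate_global_batch(worker_grads, worker_tokens, global_batch_size):
    """Combine per-worker mean gradients, weighting each by its share of
    the global batch, so the result equals the mean gradient over one
    global batch of `global_batch_size` samples.

    worker_grads  : list of np.ndarray, local mean gradients
    worker_tokens : list of int, local batch sizes behind each gradient
    """
    # Token control: aggregation fires only once the buffered gradients
    # together cover exactly one global batch.
    assert sum(worker_tokens) == global_batch_size, "tokens must cover the global batch"
    return sum(g * (t / global_batch_size)
               for g, t in zip(worker_grads, worker_tokens))

# Two workers, 2 samples each, global batch of 4:
g = aggregate_global_batch([np.array([1.0]), np.array([4.0])], [2, 2], 4)
# g is the 4-sample mean gradient, here [2.5]
```

Because the weights sum to one, the update magnitude is independent of how many workers contributed, which is what lets the mode switch without retuning the learning rate.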

Key contributions include convergence analysis proving GBA's theoretical equivalence to synchronous training under specific conditions, engineering implementation details for token management and gradient buffering, and empirical validation across multiple advertising models. The method enables efficient iteration in both high-performance and standard computing environments, reducing dependency on legacy parameters and accelerating model development cycles.
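The gradient-buffering and staleness-management machinery can be illustrated with a small sketch: asynchronous gradients are buffered together with the model version they were computed against, and down-weighted by their staleness before joining the aggregation. The exponential-decay rule and class interface here are hypothetical placeholders, not the paper's actual policy:

```python
from collections import deque

class GradientBuffer:
    """Buffer asynchronous gradients tagged with the model version they
    were computed against; discount stale ones at drain time.
    Hypothetical sketch: the decay rule is an assumption."""

    def __init__(self, decay=0.5):
        self.decay = decay          # per-version-step discount factor
        self.buffer = deque()       # FIFO of (gradient, version) pairs

    def push(self, grad, version):
        """Store a gradient along with the parameter version it used."""
        self.buffer.append((grad, version))

    def drain(self, current_version):
        """Return all buffered gradients, each scaled by decay**staleness."""
        out = []
        while self.buffer:
            grad, version = self.buffer.popleft()
            staleness = current_version - version
            out.append(grad * (self.decay ** staleness))
        return out

buf = GradientBuffer(decay=0.5)
buf.push(1.0, version=0)   # computed one version ago
buf.push(2.0, version=1)   # fresh
drained = buf.drain(current_version=1)  # → [0.5, 2.0]
```

Discounting rather than dropping stale gradients keeps the token count (and hence the global batch size) intact while limiting the damage a delayed update can do.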

Alibaba · Machine Learning · Deep Learning · Search Advertising · GBA · Training Optimization
Written by

Alimama Tech

Official Alimama tech channel, showcasing all of Alimama's technical innovations.
