
Tencent's AutoML Research for Advertising Recommendation Systems

This article outlines Tencent's AutoML research, presenting several recent papers that introduce novel neural architecture search, feature selection, pooling, embedding size, and hyper‑parameter optimization techniques to improve the efficiency, accuracy, and scalability of large‑scale advertising recommendation systems.

Tencent Advertising Technology

The performance of advertising recommendation systems directly impacts both user experience and commercial revenue. To address challenges such as data sparsity and cold start, Tencent's Machine Learning Platform team has conducted extensive AutoML research and published a series of innovative academic papers.

AutoML (automated machine learning) aims to simplify and automate the model development process by providing tools for feature evaluation, automated architecture search, and hyper‑parameter tuning, thereby lowering the expertise barrier and increasing engineering efficiency.

In the advertising recommendation scenario, AutoML automates model selection, hyper‑parameter tuning, and feature engineering. This reduces manual effort, optimizes compute resources, accelerates response to changing user demands, and enables continuous, interpretable model updates.

Tencent's platform seeks to deliver a universal AutoML capability that streamlines model development, allowing users to quickly build and optimize recommendation models for more precise ad delivery and better user experience.

AutoML Research Highlights

1. BiGNAS (AAAI'25) – Behavior Importance‑Aware Graph Neural Architecture Search for Cross‑Domain Recommendation introduces a cross‑domain customized super‑network and a graph‑based behavior importance perceptron to automatically search optimal GNN architectures and assess source‑domain behavior importance, achieving superior results on benchmark and large‑scale industry datasets.

2. One‑Shot NAS (WWW'23) – Automatic Feature Selection by One‑Shot Neural Architecture Search proposes a framework that builds a candidate feature set and uses a shared‑weight one‑shot NAS to evaluate many feature combinations in a single training run, dramatically reducing computation while selecting the most impactful features for recommendation tasks.

3. AutoPooling (WSDM'24) – Automated Pooling Search for Multi‑valued Features presents an RL/EA‑driven search mechanism that automatically discovers optimal pooling strategies for multi‑valued features, improving representation learning and overall recommendation performance across multiple datasets.

4. AdaS&S – A One‑Shot Supernet Approach for Automatic Embedding Size Search introduces a supernet that trains multiple embedding size configurations simultaneously, enabling fast evaluation and adaptive search of optimal embedding dimensions, leading to significant performance gains on various benchmarks.

5. FlexHB – A More Efficient and Flexible Framework for Hyperparameter Optimization combines Bayesian optimization with Hyperband to dynamically allocate resources, supporting diverse models and tasks while reducing time and computational cost, and demonstrating notable improvements on several benchmark datasets.
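The one‑shot, weight‑sharing idea behind the feature‑selection work in item 2 can be illustrated with a deliberately tiny sketch: a single shared model is trained once with learnable per‑feature gates, and candidate features are then ranked by their effective gate magnitude instead of retraining one model per feature subset. Everything below (the toy data, the gate parameterization, the training loop) is a hypothetical simplification for intuition, not the paper's actual formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 8 candidate features, but only features 0 and 1 drive the label.
n, d = 2000, 8
X = rng.normal(size=(n, d))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# One shared logistic model whose inputs pass through per-feature gates g.
# Training this single model once yields gate-scaled weights that rank
# feature importance, mimicking the "many subsets, one training run" idea.
w = np.zeros(d)          # shared model weights
g = np.full(d, 0.5)      # learnable per-feature gates
b = 0.0
lr = 0.1

for _ in range(300):
    z = (X * g) @ w + b
    p = 1.0 / (1.0 + np.exp(-z))
    err = p - y                        # gradient of log-loss w.r.t. z
    grad_w = (X * g).T @ err / n       # dz/dw_j = g_j * X_j
    grad_g = (X * w).T @ err / n       # dz/dg_j = w_j * X_j
    w -= lr * grad_w
    g -= lr * grad_g
    b -= lr * err.mean()

# Rank features by effective weight magnitude after the single run.
ranking = np.argsort(-np.abs(g * w))
print("top-2 selected features:", sorted(ranking[:2].tolist()))
```

In this toy setting the two informative features end up with the largest gate‑scaled weights; the real framework searches over far richer feature combinations within one supernet training.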
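The pooling search in item 3 can be caricatured with a naive enumerate‑and‑evaluate loop: each candidate pooling operator for a multi‑valued feature is scored on a downstream objective, and the best one is kept. The actual AutoPooling search is learned (RL/EA‑driven) rather than exhaustive; the data, operator set, and scoring function below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# A multi-valued feature: each sample carries a bag of k scalar values
# (think: embeddings of clicked items). The label here depends on the
# bag maximum, so max-pooling should win the search.
n, k = 500, 6
bags = rng.normal(size=(n, k))
y = bags.max(axis=1)

POOLING_OPS = {
    "mean": lambda b: b.mean(axis=1),
    "max":  lambda b: b.max(axis=1),
    "sum":  lambda b: b.sum(axis=1),
}

def validation_error(pooled, y):
    # Fit a 1-D linear model on the pooled feature, report MSE.
    # This stands in for training the downstream recommender.
    a = np.vstack([pooled, np.ones_like(pooled)]).T
    coef, *_ = np.linalg.lstsq(a, y, rcond=None)
    return float(((a @ coef - y) ** 2).mean())

scores = {name: validation_error(op(bags), y) for name, op in POOLING_OPS.items()}
best = min(scores, key=scores.get)
print("selected pooling op:", best)
```

Because the label is exactly the bag maximum, max‑pooling fits it with near‑zero error and is selected; in practice the search must trade off many features, each with its own best pooling strategy.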
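The Hyperband half of FlexHB (item 5) rests on successive halving: evaluate many configurations with a small budget, discard the worst performers, and re‑evaluate the survivors with a larger budget. A minimal sketch, assuming a made‑up objective whose loss shrinks as the training budget grows (the Bayesian‑optimization side, which proposes the candidates, is omitted here):

```python
import random

random.seed(0)

def objective(lr, budget):
    # Hypothetical validation loss: minimized at lr = 0.1, and the
    # 1/budget term models loss improving as training runs longer.
    return (lr - 0.1) ** 2 + 1.0 / budget

def successive_halving(configs, min_budget=1, eta=2, rounds=3):
    """Keep the top 1/eta of configs each round, multiplying the budget."""
    budget = min_budget
    for _ in range(rounds):
        scored = sorted((objective(lr, budget), lr) for lr in configs)
        configs = [lr for _, lr in scored[: max(1, len(configs) // eta)]]
        budget *= eta
    return configs[0]

# 16 random learning-rate candidates; cheap early rounds prune most of them.
candidates = [random.uniform(0.001, 1.0) for _ in range(16)]
best_lr = successive_halving(candidates)
print(f"best learning rate found: {best_lr:.3f}")
```

Since every configuration shares the same budget within a round, the ranking is never distorted and the best candidate always survives pruning; FlexHB's contribution is making this budget allocation more flexible and combining it with model‑based candidate proposal.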

Overall, these studies demonstrate how automated techniques such as neural architecture search, feature selection, pooling optimization, embedding size search, and flexible hyperparameter tuning can advance large‑scale advertising recommendation systems.

Machine Learning · Recommendation Systems · AutoML · Neural Architecture Search · Hyperparameter Optimization · Embedding Size Search
Written by

Tencent Advertising Technology

Official hub of Tencent Advertising Technology, sharing the team's latest cutting-edge achievements and advertising technology applications.
