
Participant Experiences and Lessons from the Tencent Social Ads Algorithm Competition

The article compiles ten participants' reflections on the Tencent Social Ads university algorithm contest, detailing their data analysis, feature engineering, model selection, challenges faced, and practical advice, while encouraging readers to vote for the most helpful contributions.

Tencent Advertising Technology

Thank you to Tencent for hosting the Social Ads university algorithm competition, which provided a platform for students and enthusiasts in machine learning and data mining to exchange ideas and learn from each other.

圣骑士: Describes discovering the competition, an initial zero-score submission, using XGBoost, and key observations such as ID relationships, statistical features, temporal information, feature combinations, and stability between offline and online results.
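The statistical, temporal, and combination features mentioned above can be sketched in a few lines of pandas. The column names (`userID`, `creativeID`, `clickTime`, `label`) are assumptions modeled loosely on the contest's click-log schema, not the exact fields:

```python
import pandas as pd

# Toy click log; clickTime uses a DDHHMM-style integer timestamp (assumption).
df = pd.DataFrame({
    "userID":     [1, 1, 2, 2, 2, 3],
    "creativeID": [10, 10, 10, 20, 20, 20],
    "clickTime":  [170001, 170005, 170002, 170103, 170104, 170201],
    "label":      [0, 1, 0, 0, 1, 0],
})

# Statistical feature: how often each user appears (activity level).
df["user_count"] = df.groupby("userID")["userID"].transform("count")

# Combination feature: joint frequency of (userID, creativeID).
df["user_creative_count"] = (
    df.groupby(["userID", "creativeID"])["label"].transform("count")
)

# Temporal feature: hour of day extracted from the DDHHMM timestamp.
df["hour"] = df["clickTime"] // 100 % 100
```

Features like these feed directly into an XGBoost model; the point is that counts and time-of-day signals are cheap to compute and often stable between offline and online evaluation.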

永不理解: Discusses feature selection based on importance and on train/test distribution consistency, feature generation strategies including rule-based and model-derived features, and emphasizes understanding model fundamentals before extensive tuning.
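One simple way to check distribution consistency for a categorical feature is to measure how much of the test set falls into categories seen during training, and drop features with poor overlap. This is an illustrative sketch, not the participant's exact criterion:

```python
def value_overlap(train_vals, test_vals):
    """Fraction of test values whose category was seen in training."""
    seen = set(train_vals)
    return sum(v in seen for v in test_vals) / len(test_vals)

train_feature = [1, 2, 2, 3, 3, 3]
test_feature  = [2, 3, 3, 9]   # category 9 never appears in training

# Keep the feature only if most test values are covered by training
# (0.8 is an arbitrary illustrative threshold).
keep = value_overlap(train_feature, test_feature) >= 0.8
```

A feature that is important offline but whose values shift between train and test often hurts the online score, which is why both checks are applied together.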

君溪竹: Shares a strategy of splitting negative samples, handling categorical variables, encoding challenges, and the pitfalls of training on limited data versus the full dataset.
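Splitting negative samples is a common trick for heavily imbalanced ad data: partition the negatives into chunks and pair each chunk with all positives, training one model per subset. A minimal sketch (the 3-way split and class ratio are illustrative; in practice the negatives would be shuffled first):

```python
# Imbalanced toy data: 3 positives, 30 negatives.
data = [{"label": 1}] * 3 + [{"label": 0}] * 30

pos = [r for r in data if r["label"] == 1]
neg = [r for r in data if r["label"] == 0]

# Split negatives into k chunks; each training subset is all positives
# plus one chunk of negatives. One model is trained per subset and the
# predictions are averaged.
k = 3
chunks = [neg[i::k] for i in range(k)]
subsets = [pos + c for c in chunks]
```

Each subset sees every positive example, so no positive signal is wasted, while the per-model class ratio becomes far less skewed.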

Gnieur: Recommends starting with the Kaggle Titanic tutorial to practice data cleaning, feature extraction, and model building using pandas and matplotlib, then progressing to competitions and community interaction.
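The Titanic starter workflow boils down to a few pandas operations: impute missing values, encode categoricals, and assemble a feature matrix. A minimal sketch on toy rows (the real data comes from Kaggle's `train.csv`):

```python
import pandas as pd

# Toy rows mimicking Titanic columns (Sex, Age, Fare, Survived).
df = pd.DataFrame({
    "Sex":      ["male", "female", "female", "male"],
    "Age":      [22.0, None, 26.0, 35.0],
    "Fare":     [7.25, 71.28, 7.92, 8.05],
    "Survived": [0, 1, 1, 0],
})

df["Age"] = df["Age"].fillna(df["Age"].median())   # simple imputation
df["Sex"] = (df["Sex"] == "female").astype(int)    # binary encoding

X = df[["Sex", "Age", "Fare"]]  # feature matrix for any sklearn classifier
y = df["Survived"]
```

From here, any scikit-learn classifier can be fit on `X` and `y`, which is exactly the cleaning-to-model loop the tutorial is meant to teach.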

行者: Highlights the difficulty of feature engineering, the importance of time-series handling, avoiding label leakage, and aligning offline validation with online performance.
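The leakage-safe validation idea is to split by time rather than at random: hold out the last days as the validation set, mirroring the online test period, so no future information leaks into training. A sketch with illustrative day numbers:

```python
# Thirty days of time-ordered rows (labels here are arbitrary toy values).
rows = [{"day": d, "label": d % 2} for d in range(1, 31)]

# Hold out the final days as validation, mimicking the online test window
# (the cutoff day is an illustrative assumption).
split_day = 28
train = [r for r in rows if r["day"] < split_day]
valid = [r for r in rows if r["day"] >= split_day]
```

With this split, a feature computed on `train` can never see `valid`'s future, so the offline metric moves with the online one instead of being optimistically inflated by a shuffled split.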

王浩: Notes that adding many features can hurt performance and that careful feature engineering is essential for good results.

黑山羊: Emphasizes avoiding time-series leakage, handling the final-day label issue, and ensuring offline and online metrics move together.

李孟禹: Describes building a simple baseline, switching from XGBoost to a simpler model, and using a high-dimensional feature set to achieve a notable score.
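The "simpler model over high-dimensional features" pattern usually means one-hot encoding every categorical ID into a huge sparse indicator space and fitting a linear model (e.g. logistic regression) on top. A sketch of the encoding step, with hypothetical field names:

```python
def one_hot_indices(row, vocab):
    """Map each field=value pair to a sparse one-hot index,
    growing the vocabulary as new pairs appear."""
    idx = []
    for field, value in row.items():
        key = f"{field}={value}"
        if key not in vocab:
            vocab[key] = len(vocab)
        idx.append(vocab[key])
    return idx

vocab = {}
r1 = one_hot_indices({"userID": 1, "creativeID": 10}, vocab)
r2 = one_hot_indices({"userID": 2, "creativeID": 10}, vocab)
```

With millions of such indicator features, a regularized logistic regression is fast to train and often competitive with tree models on sparse ID-heavy ad data.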

姚易辰: Points out the sparsity of user data and the need for thorough data cleaning and feature mining for click-through-rate prediction.

东丽传奇: Explains focusing on XGBoost, iterative parameter tuning, feature-importance-driven pruning, and the artistic nature of feature engineering, thanking the competition for providing a learning platform.
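Importance-driven pruning is an iterative loop: train, read off feature importances, drop the weakest features, retrain. The importance scores below are stand-in numbers; in practice they would come from XGBoost (e.g. `Booster.get_fscore()` or `feature_importances_`):

```python
# Stand-in importance scores; real values come from a trained XGBoost model.
importances = {"user_count": 120, "hour": 45, "rand_noise": 2, "age": 80}

def prune(importances, min_score):
    """Keep only features whose importance clears the threshold."""
    return {f: s for f, s in importances.items() if s >= min_score}

kept = prune(importances, min_score=10)  # drops near-useless features
```

Each pruning round shrinks the feature set and speeds up the next tuning iteration; the loop stops when removing the weakest features starts hurting the validation score.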

The article concludes by urging readers to vote for the most helpful contribution, with voting ending on May 30, and provides links to the competition website and official WeChat account.

Machine Learning · data mining · feature engineering · AI · XGBoost · algorithm competition
Written by

Tencent Advertising Technology

Official hub of Tencent Advertising Technology, sharing the team's latest cutting-edge achievements and advertising technology applications.
