Deep Neural Auction (DNA): End-to-End Optimization of Multi-Objective E‑commerce Advertising Auctions
Deep Neural Auction (DNA) integrates deep learning with mechanism design to optimize multi-objective e‑commerce ad auctions end to end. It uses differentiable sorting and set encoding while preserving incentive compatibility and individual rationality, and it achieves superior revenue, CTR, CVR, and conversion metrics versus GSP variants in experiments on Alibaba's platform.
The paper addresses the core problem of auction mechanism design in computational advertising by deeply integrating machine learning and mechanism design. It proposes a Deep Neural Auction (DNA) framework that models the auction using deep neural networks and optimizes multiple stakeholder objectives (users, advertisers, platform) in an end‑to‑end fashion while preserving incentive compatibility.
Abstract
In e‑commerce ad systems, traditional auctions such as GSP/VCG focus on a single objective and may be sub‑optimal for multi‑stakeholder optimization. DNA encodes the allocation process into a differentiable neural network, relaxes the non‑differentiable sorting operation with a differentiable operator, and explicitly incorporates incentive‑compatible (IC) and individually rational (IR) constraints. Offline and online experiments on Alibaba's display ad platform show superior performance over GSP, uGSP, and DeepGSP across revenue, click‑through rate, conversion rate, and other metrics.
Problem Modeling
The multi‑objective optimization is formulated as a weighted sum of stakeholder metrics subject to IC and IR constraints. Advertiser bids, platform revenue, user engagement, and other KPIs are combined with pre‑defined importance weights. The value‑maximizer advertiser model is adopted, reflecting modern e‑commerce advertisers who prioritize marketing outcomes under budget or cost‑per‑action constraints.
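In symbols, the weighted-sum formulation can be written as follows (notation ours, a paraphrase rather than the paper's exact statement):

```latex
\max_{\mathcal{M}} \;\; \sum_{j} \lambda_j \, \mathbb{E}_{b \sim \mathcal{D}} \big[ f_j(\mathcal{M}; b) \big]
\qquad \text{s.t.} \quad \mathcal{M} \text{ satisfies IC and IR},
```

where each \(f_j\) is a stakeholder metric (e.g., revenue per mille, CTR, CVR), \(\lambda_j \ge 0\) are the pre‑defined importance weights, and \(b\) denotes the submitted bids and associated features drawn from the traffic distribution \(\mathcal{D}\).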
Model Design
1. Set Encoder: A DeepSet‑based encoder processes the unordered candidate ad set, producing a permutation‑invariant representation of the whole auction context.
2. Context‑Aware Rank Score Function: A partially monotone min‑max network takes each ad's features and the set representation to output a rank score that satisfies monotonicity and invertibility, ensuring IC/IR properties.
3. Differentiable Sorting Engine: Using the NeuralSort technique, the discrete sorting operation is relaxed into a soft permutation matrix parameterized by a temperature hyper‑parameter, enabling gradient‑based training of the entire pipeline.
4. Training Process
• Sample construction uses bids, predicted CTR/CVR, ad and user attributes, and contextual signals, with real user feedback (click, add‑to‑cart, purchase) as supervision.
• Loss consists of (1) a multi‑objective revenue‑oriented term that maximizes expected offline metrics via the relaxed sorting matrix, and (2) a classification‑style loss that aligns the soft permutation with the optimal ranking derived from logged data. Balancing these losses yields stable training and improves both allocation quality and revenue.
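As a concrete illustration of the first two components, here is a minimal NumPy sketch of a permutation‑invariant set encoder and a bid‑monotone min‑max rank score. The function names, layer sizes, and the exp‑reparameterization of the bid weights are our own illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

def deepset_encode(ads, W_phi, W_rho):
    """Permutation-invariant set encoding: rho(sum_i phi(x_i)).
    `ads` is an (n_ads, n_features) matrix of candidate-ad features."""
    h = np.maximum(ads @ W_phi, 0.0)        # phi: shared per-ad layer (ReLU)
    pooled = h.sum(axis=0)                  # sum-pooling makes ad order irrelevant
    return np.maximum(pooled @ W_rho, 0.0)  # rho: layer applied after pooling

def rank_score(bid, group_weights, group_biases):
    """Min-max network that is strictly increasing in `bid`: each bid weight
    passes through exp(), so it is positive, and min/max of increasing
    functions is increasing. In DNA the bias terms would be produced by a
    network conditioned on the ad's features and the set embedding
    (omitted here for brevity)."""
    group_outputs = []
    for w_g, b_g in zip(group_weights, group_biases):
        group_outputs.append(max(np.exp(w) * bid + b for w, b in zip(w_g, b_g)))
    return min(group_outputs)
```

Monotonicity plus invertibility of the rank score in the bid is what allows a next‑price‑style payment rule to keep the mechanism IC and IR.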
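The differentiable sorting step can likewise be sketched with the published NeuralSort relaxation (Grover et al., 2019). This minimal NumPy version implements the operator itself, not DNA's full training pipeline:

```python
import numpy as np

def neural_sort(scores, tau=1.0):
    """NeuralSort relaxation: returns a row-stochastic soft permutation
    matrix that approaches the hard descending-sort permutation as tau -> 0."""
    s = scores.reshape(-1, 1)                       # column vector, shape (n, 1)
    n = s.shape[0]
    A = np.abs(s - s.T)                             # pairwise distances |s_j - s_k|
    B = A @ np.ones((n, 1))                         # per-element row sums of A
    # (n + 1 - 2i) * s_j term for each target rank i and candidate j
    C = (n + 1 - 2 * np.arange(1, n + 1)).reshape(-1, 1) * s.T
    logits = (C - B.T) / tau                        # temperature-scaled logits
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)         # row-wise softmax
```

As tau → 0, each row concentrates on the true descending‑sort position; larger tau gives smoother gradients. During training, this soft matrix stands in for the hard argsort so that gradients flow through allocation.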
Experimental Results
Offline experiments on large-scale industrial datasets compare DNA with GSP, uGSP, and DeepGSP. When jointly optimizing any two objectives (e.g., RPM + X), DNA achieves a superior Pareto front, offering better trade‑offs between the paired metrics. Incentive compatibility is evaluated with an empirical regret metric based on bid perturbations: DNA's regret is near zero, whereas non‑IC baselines exhibit substantially higher regret.
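The regret metric can be illustrated with a toy sketch. This is our own simplification, using a single‑slot second‑price auction where truthful bidding is optimal (so regret should be zero); the paper's evaluation scans bid perturbations in the same spirit, but over the learned mechanism:

```python
import numpy as np

def empirical_regret(utility_fn, true_bid, multipliers):
    """IC regret: the largest utility gain an advertiser could obtain by
    misreporting, estimated over a grid of multiplicative bid perturbations.
    Near-zero regret is evidence the mechanism is (approximately) IC."""
    u_truth = utility_fn(true_bid)
    gains = [utility_fn(true_bid * m) - u_truth for m in multipliers]
    return max(0.0, max(gains))

def make_second_price_utility(value, competing_bid):
    """Toy single-slot second-price auction: win iff bid exceeds the
    competitor, pay the competing bid."""
    def utility(bid):
        return (value - competing_bid) if bid > competing_bid else 0.0
    return utility
```

Because the second‑price auction is truthful, perturbing the bid never increases utility and the measured regret is zero; a non‑IC mechanism plugged into `utility_fn` would instead show positive regret.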
Online A/B tests demonstrate that, under equal RPM levels, DNA improves all other KPIs (CTR, CVR, conversion, etc.) and outperforms GSP across the board, confirming its practical impact on the Alibaba display ad ecosystem.
Conclusion and Outlook
The study shows that deep learning can effectively model and optimize auction mechanisms for multi‑stakeholder objectives while preserving economic guarantees. Future directions include designing richer long‑term optimization goals, integrating data‑driven bidding agents with auction agents, and studying the dynamic game between learned auction mechanisms and autonomous bidders.
References (selected) – Wilkens et al., 2017; Zaheer et al., 2017 (DeepSets); Daniels & Velikova, 2010; Grover et al., 2019 (NeuralSort); Lahaie & Pennock, 2007; Bachrach et al., 2014; Zhang et al., 2021; Feng et al., 2019.
Alimama Tech
Official Alimama tech channel, showcasing all of Alimama's technical innovations.