iQIYI AB Testing Platform: Architecture, Workflow, and Statistical Practices
iQIYI’s AB testing platform combines a layered traffic‑splitting architecture, real‑time SDK and API delivery, log‑replay data collection, and T‑test‑based statistical analysis to support fast, reliable product, algorithm, and operations experiments. A UI redesign validated on the platform, for example, increased average full‑episode watch time by 17.85%.
Background: As internet companies diversify their products and services, data‑driven decision making becomes essential. AB testing is a method that uses data metrics to evaluate the impact of product features and operational strategies by comparing two or more experimental groups under the same conditions.
Typical AB testing steps include:
(1) Identify optimization metrics (e.g., conversion rate);
(2) Formulate hypotheses (e.g., change UI interaction);
(3) Create the experiment;
(4) Measure results (e.g., Group A conversion 23%, Group B 11%);
(5) Optimize further or terminate the experiment based on the outcome.
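The measurement step above boils down to asking whether the gap between two group metrics is statistically significant. As a minimal sketch, a two‑proportion z‑test applied to the step‑4 conversion rates (23% vs. 11%) looks like this; the per‑group sample size of 1,000 is a hypothetical assumption, not a figure from the article:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Z-statistic for comparing two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis of no difference
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Step-4 rates from the article, with an assumed 1,000 users per group:
z = two_proportion_z_test(230, 1000, 110, 1000)  # z ≈ 7.14
significant = abs(z) > 1.96  # two-sided test at the 5% level
```

With a gap this large the result is significant at any conventional threshold, which is what justifies step 5’s rollout-or-terminate decision.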
iQIYI applies AB testing in three main categories:
• Algorithm experiments (search, recommendation, advertising) to validate new strategies such as recall or ranking.
• Product‑feature experiments to safely test UI or functional changes on a small traffic slice before full rollout.
• Operations experiments (user acquisition, retention, membership, content placement) to determine the most effective operational tactics.
Splitting Model: Based on Google’s “Overlapping Experiment Infrastructure”, the model consists of domains (vertical traffic partitions), layers (orthogonal sets of experiments), and experiments (traffic allocation and metric configuration). Each domain can contain multiple layers, and each layer contains mutually exclusive experiments.
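The key property of this model is that a user's bucket in one layer is statistically independent of their bucket in any other layer, so experiments in different layers can overlap on the same traffic. A minimal sketch of per‑layer bucketing follows; the MD5 hashing scheme and function names are illustrative assumptions, not iQIYI’s actual implementation:

```python
import hashlib

def bucket(user_id: str, layer_id: str, num_buckets: int = 100) -> int:
    """Hash user and layer together so bucket assignments are independent across layers."""
    digest = hashlib.md5(f"{user_id}:{layer_id}".encode()).hexdigest()
    return int(digest, 16) % num_buckets

def assign(user_id: str, layer_id: str, experiments: list) -> str:
    """experiments: list of (name, traffic_percent) pairs that partition the layer.

    Buckets are walked in order; traffic not claimed by any experiment
    falls through to the control group.
    """
    b = bucket(user_id, layer_id)
    start = 0
    for name, pct in experiments:
        if start <= b < start + pct:
            return name
        start += pct
    return "control"
```

Because the layer ID is part of the hash input, a user lands in exactly one experiment per layer (experiments within a layer stay mutually exclusive), while their assignments in different layers remain uncorrelated.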
Architecture Implementation: iQIYI’s platform adds several improvements over common industry solutions:
1) An initialization interface that delivers latency‑sensitive experiment configs directly to the app at startup via a cloud‑control platform, reducing SDK call time.
2) Both SDK‑based real‑time traffic splitting and an API service for precise audience experiments.
3) AB logs are collected via a logging service and AB SDK replay rather than front‑end instrumentation, enabling more reliable data capture.
The platform comprises three core modules: Experiment Management Platform, AB SDK Traffic Splitting, and Effect Evaluation.
Experiment Management Platform handles experiment configuration, indicator management, traffic allocation, and whitelist control. It also provides an AA test feature to verify that experimental groups show no significant baseline differences before an experiment begins.
Traffic Splitting offers two delivery methods:
• HTTP service – flexible but incurs network latency on each request.
• Java/C++ SDK – periodically pulls experiment configs and performs client‑side splitting, eliminating per‑request network overhead.
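The SDK trade‑off above can be sketched as a client that refreshes its config on a timer and decides locally, so the hot path makes no network call. All class and method names here are illustrative assumptions, not the actual iQIYI SDK API:

```python
import hashlib
import time

class ABClient:
    """Sketch of SDK-style splitting: pull configs periodically, decide locally."""

    def __init__(self, fetch_config, refresh_seconds: float = 60.0):
        self._fetch = fetch_config        # callable returning {layer: [(experiment, pct), ...]}
        self._refresh = refresh_seconds
        self._config = {}
        self._last_pull = float("-inf")   # force a refresh on first use

    def _maybe_refresh(self):
        # Config pull happens at most once per refresh window, off the hot path
        if time.monotonic() - self._last_pull > self._refresh:
            self._config = self._fetch()
            self._last_pull = time.monotonic()

    def variant(self, user_id: str, layer: str) -> str:
        """Pure local computation: hash into a bucket, walk the layer's allocations."""
        self._maybe_refresh()
        b = int(hashlib.md5(f"{user_id}:{layer}".encode()).hexdigest(), 16) % 100
        start = 0
        for experiment, pct in self._config.get(layer, []):
            if start <= b < start + pct:
                return experiment
            start += pct
        return "control"
```

The HTTP service makes the same decision server‑side, which keeps logic centralized but pays the round‑trip latency the article notes on every request.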
Effect Evaluation includes data collection, statistical analysis, and result reporting. Traditional data collection relied on front‑end event tracking, which proved fragile at scale. iQIYI switched to a log‑replay approach using a data‑center SDK, reducing dependency on front‑end instrumentation.
Statistical testing methods discussed are the Z‑test (large samples), the T‑test (small samples, unknown population variance), and the Chi‑square test (goodness‑of‑fit). iQIYI adopts the T‑test for its experiments, evaluating metrics such as average watch time and CTR over at least a seven‑day window to determine significance.
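As a minimal sketch of this T‑test step, the following computes Welch's two‑sample t‑statistic (the variant that does not assume equal variances) over synthetic seven‑day watch‑time samples; the numbers are invented for illustration, and in practice a library routine such as `scipy.stats.ttest_ind` would be used:

```python
import math
import statistics

def welch_t(sample_a, sample_b):
    """Welch's two-sample t-statistic and approximate degrees of freedom."""
    ma, mb = statistics.mean(sample_a), statistics.mean(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)  # sample variances
    na, nb = len(sample_a), len(sample_b)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    # Welch–Satterthwaite approximation for degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Synthetic daily average watch time (minutes) over a seven-day window:
control = [41.2, 39.8, 40.5, 42.1, 40.0, 41.7, 39.9]
treatment = [47.3, 48.1, 46.5, 49.0, 47.8, 46.9, 48.4]
t, df = welch_t(treatment, control)
```

The resulting t‑statistic is then compared against the t‑distribution with `df` degrees of freedom to decide significance at the chosen level.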
Practical Example: A UI redesign of the iQIYI TV channel page was tested. The experimental group (new UI) achieved a 17.85% increase in average full‑episode watch time, confirming the redesign’s success and providing data‑driven justification for rollout.
Future Directions: Incorporate power analysis algorithms into effect evaluation, and add sample‑size estimation tools to guide experiment design.
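The sample‑size estimation mentioned above typically uses the standard normal‑approximation formula n = 2(z_{1-α/2} + z_{1-β})²σ²/δ² per group. A minimal sketch, with illustrative parameter values not taken from the article:

```python
import math
from statistics import NormalDist

def sample_size_per_group(sigma: float, mde: float,
                          alpha: float = 0.05, power: float = 0.80) -> int:
    """Per-group sample size to detect an absolute effect `mde` on a
    continuous metric with standard deviation `sigma`, using the
    normal approximation n = 2 * (z_{1-a/2} + z_{1-b})^2 * (sigma/mde)^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired power
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * (sigma / mde) ** 2)

# e.g. detecting a 1-minute lift in watch time when sigma = 10 minutes:
n = sample_size_per_group(sigma=10, mde=1)  # 1570 per group
```

Running such an estimate before an experiment launches tells the owner how long to run it before the seven‑day significance check is even worth reading.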
Overall, the iQIYI AB testing platform demonstrates a comprehensive, data‑centric approach to product experimentation, combining robust traffic‑splitting mechanisms, reliable data collection, and rigorous statistical validation.
iQIYI Technical Product Team