A/B Testing Process Improvement and Validation Guide
This article outlines a comprehensive A/B testing workflow, covering historical issues, business test process improvements, detailed implementation steps, SQL validation scripts, data verification in analytics platforms, and practical notes to ensure accurate experiment data collection and analysis.
Rather than adding to the many existing introductions to A/B testing, the article focuses on how to test and validate A/B requirements.
It lists historical problems such as missing requirement documents, unclear business scenarios, inconsistent understanding among developers, incomplete test coverage, and lack of data verification.
To improve the testing workflow, the author proposes a refined process that includes business scenario testing, instrumentation testing, and A/B data verification, illustrated with a flowchart.
Improvement notes: Requirement documents must clearly define business scenarios, A/B experiment IDs, related fields, and validation SQL; product, development, and testing must all align on the requirements.
Requirement reviews must include big‑data product representatives; if the big‑data product side has not confirmed the requirements, the review fails.
Instrumentation documents must specify the test fields and their associated business scenarios.
The A/B test scope and the related pages or entry points must be defined.
During development, design documents or interface fields should include A/B experiment scenario descriptions, and provide data‑validation SQL or field explanations.
Testing includes designing A/B test case scenarios, using the analytics system (e.g., Sensors Analytics) to verify data correctness, and conducting internal reviews of test case coverage, data reporting, and validation.
Acceptance steps add dedicated reviewers for instrumentation (business product, big‑data product) and QA for test process acceptance, with a strict rule that failed test‑environment acceptance blocks deployment.
The implementation plan details:
1. Instrumentation Testing
A. Verify parameters and field descriptions in requirement documents.
B. Write test SQL based on time windows and trigger conditions.
-- Example test SQL: pull the 10 most recent matching events
SELECT page_id, event_xxx, goods_id AS "Reported Product ID", time
FROM events
WHERE date <= '2020-07-17' AND page_id = 10 AND event_xxx = 'pagexxxxxx'
ORDER BY time DESC
LIMIT 10;
C. Follow the instrumentation documentation to execute the corresponding test steps (note the roughly one‑minute reporting delay).
D. Access the analytics system (hosts configuration may be required) and run custom queries to compare query results with test outcomes.
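Steps B–D can be automated as a small polling check: because reports arrive with a delay of about a minute, the comparison should retry the analytics query rather than fail on the first empty result. A minimal sketch, where `query_events` is a hypothetical stand‑in for whatever client executes the custom query against the analytics system:

```python
import time

def wait_for_event(query_events, expected, timeout_s=180, poll_s=15):
    """Poll the analytics store until an event matching `expected` appears.

    `query_events` is a hypothetical callable standing in for the
    analytics system's query API; it returns a list of event dicts.
    Reporting is delayed by roughly one minute, so we retry until
    `timeout_s` elapses instead of failing on the first empty result.
    Returns the matching row, or None on timeout.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        for row in query_events():
            # A row matches when every expected field has the expected value.
            if all(row.get(k) == v for k, v in expected.items()):
                return row
        time.sleep(poll_s)
    return None
```

In a real run, `query_events` would wrap the SQL above; here the retry loop is the point, since a single immediate query routinely misses events still in the reporting pipeline.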
2. A/B Experiment Testing
A. Requirement documents must specify A/B experiment pages, experiment IDs, and reporting nodes.
B. Development confirms reporting interface parameters.
C. Test steps include navigating to the homepage, enabling developer mode, switching server buckets, and using tools like Fiddler to capture network traffic and locate the _md_data field.
D. Access the analytics system (consider a 1–2 minute reporting delay) and verify that event='ab_event' and either test_name or test_content contain values; inconsistencies indicate reporting issues.
SELECT "$os", "$app_version", zlj_device_id, distinct_id, "$device_id", test_name, test_content, group_id, "$lib", channel_id, event, event_xxxxx, time, page_id, page_xxxx, operation_xxxx
FROM events
WHERE date = '2020-08-12' AND time >= '2020-08-12 00:01:11' AND zlj_device_id = '66155xxxxxxxxxxxxxxB9A'
ORDER BY zlj_device_id, time DESC;
Additional validation steps cover backend response formats, reporting timing, multi‑operation triggers, and version isolation to ensure high‑version releases report correctly while low‑version scenarios do not.
Key takeaways include confirming the backend returns correct data formats, aligning reporting timing with product specifications, verifying consistent reporting across Android and iOS, and ensuring instrumentation scripts are correctly integrated into the main business flow.
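The version‑isolation rule (high‑version builds report, low‑version builds stay silent) reduces to a numeric comparison of dotted version strings. A minimal sketch, assuming semantic‑style versions; the cut‑off value in the test is an invented example, not from the article.

```python
def should_report(app_version, min_version):
    """Version gate for an experiment.

    Returns True when app_version is at or above the release that
    introduced the experiment, i.e. only high-version builds report.
    Assumes dotted numeric versions like '5.8.0'.
    """
    def parts(v):
        return [int(x) for x in v.split(".")]
    # Python compares lists element by element, which matches
    # component-wise version ordering for numeric segments.
    return parts(app_version) >= parts(min_version)
```

During acceptance, the same comparison can be checked from both sides: a build below the cut‑off must produce no ab_event rows at all, while a build at or above it must produce them.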
转转QA