Chinese Software Testing Professionals Survey: Company Attributes and Practices
Based on a survey of roughly 2,000 Chinese software testing professionals, this report analyses company characteristics, testing‑development ratios, career advancement, training frequency, testing tools, and process metrics, offering insights into current industry trends and recommendations for QA practitioners.
This report presents findings from a survey of about 2,000 Chinese software testing professionals, highlighting key attributes of the companies they work for and offering industry‑wide observations.
1. Software testing domain distribution: Based on the domain breakdown, the survey recommends that testers consider internet‑focused companies working with emerging technologies such as AI, big data, AR/VR, cloud platforms, and fintech, for better exposure to new tech.
2. Tester‑developer ratio: The survey links to a related analysis of testing vs. development headcounts and stresses the need to balance product iteration speed, quality, and user experience.
3. Promotion prospects for non‑managerial testers: 61% of respondents see limited promotion prospects outside management, so most QA staff advance along a technical track (junior → senior → test expert).
4. Training frequency and content: 49% of companies provide no formal training; among those that do, 1‑2 sessions per year are most common, with internal courses focusing mainly on testing techniques.
5. Types of applications under test: B/S web systems dominate (62%), followed by mobile apps (50%) and C/S systems (34%).
6. Product release cycles: Two‑week and longer‑than‑four‑week cycles are most common (31% and 30% respectively), varying by product type.
7. Factors deciding product delivery: Meeting acceptance criteria is the primary driver (65%), followed by test coverage, user approval, and schedule adherence.
8. Regression effort per round: Over half of companies allocate 10 person‑days for a full regression cycle.
9. Test case volume: 37% of teams manage over 100 cases, with a smaller share handling 10,000+ cases.
10. Test metrics collected: Teams track requirements, case counts, defect counts, schedule variance, and defect attribute analysis to visualize testing effectiveness.
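As a language‑neutral illustration (not taken from the survey), metrics like these reduce to simple arithmetic over raw counts; the function names and sample numbers below are assumptions for the sketch.

```python
# Hypothetical sketch: deriving two common test metrics from raw counts.
# Names and sample figures are illustrative, not from the survey.

def defect_density(defects_found: int, cases_executed: int) -> float:
    """Defects found per executed test case."""
    return defects_found / cases_executed if cases_executed else 0.0

def schedule_variance(planned_days: int, actual_days: int) -> int:
    """Positive when the testing phase overran the plan."""
    return actual_days - planned_days

print(round(defect_density(42, 350), 2))  # 0.12 defects per case
print(schedule_variance(10, 13))          # 3 days over plan
```

Tracking a small, stable set of such numbers per release is usually enough to visualize the trends the survey describes.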
11. Test‑case design approaches: The majority base case creation on thorough requirement analysis and coverage considerations (≈63%).
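One classic way to turn a requirement into concrete cases is boundary‑value analysis. The sketch below is an illustration of that general technique, not the survey respondents' specific methodology; the age range is an assumed example requirement.

```python
# Hypothetical sketch: boundary-value analysis for a numeric requirement
# such as "accept ages 18-65". The range is an assumed example.

def boundary_values(low: int, high: int) -> list[int]:
    """Return values just outside, on, and just inside each boundary
    of a valid [low, high] range -- the classic six test inputs."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

print(boundary_values(18, 65))  # [17, 18, 19, 64, 65, 66]
```

Pairing such derived values with equivalence partitioning gives broad coverage from relatively few cases, which matches the coverage‑driven design the survey reports.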
12. Static testing practices: Project reviews (38%) and no static testing (33%) are the most reported activities.
13. Dynamic testing practices: System testing is the most common (68%), followed by stress, integration, and unit testing.
14. PC‑side automation tools: Selenium leads (31%), followed by QTP, proprietary tools, and others.
15. Mobile automation tools: Appium (25%) and MonkeyRunner (19%) are the most popular, with many teams using custom solutions.
16. Performance testing tools: LoadRunner dominates (52%) with JMeter holding a significant share (35%).
17. Test management tools: ZenTao (34%) and Jira (23%) are the top choices.
18. Unit testing frameworks: JUnit (31%) and TestNG (10%) are prevalent, reflecting Java’s dominance in test automation.
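JUnit and TestNG are Java frameworks; as a self‑contained illustration of the same xUnit style, here is an equivalent sketch using Python's built‑in `unittest` (a stand‑in chosen for brevity, not a tool the survey mentions).

```python
import unittest

# Hypothetical sketch of the xUnit style popularized by JUnit/TestNG,
# shown with Python's built-in unittest module.

def add(a: int, b: int) -> int:
    return a + b

class AddTests(unittest.TestCase):
    def test_positive(self):
        self.assertEqual(add(2, 3), 5)

    def test_negative(self):
        self.assertEqual(add(-2, -3), -5)

if __name__ == "__main__":
    unittest.main()
```

The structure maps directly onto JUnit: test classes group related cases, each `test_*` method is an independent case, and assertion failures are reported per method.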
The article concludes with a preview of the next report focusing on the basic attributes of Chinese software testing professionals.
DevOps Engineer
DevOps engineer, Pythonista and FOSS contributor. Created cpp-linter, commit-check, etc.; contributed to PyPA.