
Practical Lessons from Client Performance Optimization in a Mobile Game Project

This article shares seven practical insights gained from client performance testing and optimization in a mobile basketball game, covering long‑term performance planning, data‑driven decisions, feature toggles, build configurations, realistic testing, careful trade‑off evaluation, and the importance of team communication.

NetEase LeiHuo Testing Center

The author reflects on client performance testing in a mobile basketball game, highlighting personal experiences and valuable lessons that can be transferred across teams, without delving into basic theory or specific tool tutorials.

01 Recognizing the Long‑Term Nature of Performance Optimization – Early attempts to set strict production standards failed, as performance knowledge evolves and project goals (e.g., targeting low‑end Philippine devices) shift, making performance standards fluid and optimization a continuous effort.

02 Adding Data Points for Major Performance Decisions – Logs and lightweight data points differ: logs are verbose and rarely collected post‑release, while targeted data points capture essential device and performance information, enabling informed decisions such as choosing OpenGLES 3.0 and ETC2 compression for low‑end devices.
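The distinction between a verbose log and a targeted data point can be sketched as follows. This is an illustrative sketch only, not the project's actual telemetry: the field names (`gles_version`, `avg_fps`) and the decision helper are hypothetical, though the underlying fact that ETC2 texture decoding requires OpenGL ES 3.0 is real.

```python
def make_data_point(device_model: str, gles_version: int, avg_fps: float) -> dict:
    """Capture only the fields needed for a decision, not a full log line."""
    return {
        "device": device_model,
        "gles_version": gles_version,
        "avg_fps": round(avg_fps, 1),
    }

def choose_texture_compression(points: list[dict]) -> str:
    """Pick ETC2 only if every reported device supports OpenGL ES 3.0+,
    since ETC2 decoding is mandatory only from GLES 3.0 onward."""
    if all(p["gles_version"] >= 3 for p in points):
        return "ETC2"
    return "ETC1"  # fall back for GLES 2.0-only devices

points = [
    make_data_point("low-end-A", 3, 28.4),
    make_data_point("low-end-B", 3, 31.0),
]
print(choose_texture_compression(points))  # ETC2
```

Because each data point is a few small fields rather than a log stream, it is cheap enough to collect from released builds, which is what makes decisions like the GLES 3.0 / ETC2 choice possible after launch.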

03 Using Switches for Performance Features – Feature toggles allow developers to enable or disable specific logic during profiling, facilitate rapid testing on low‑end hardware, and provide a safety net for unverified optimizations in production builds.
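The toggle pattern described above can be sketched minimally as below. The class and toggle names are illustrative, not the project's actual API; the key design point from the article is the safe default: an unverified optimization stays off until explicitly enabled.

```python
class FeatureToggles:
    """Minimal feature-toggle registry for gating optimization code paths."""

    def __init__(self):
        self._flags: dict[str, bool] = {}

    def set(self, name: str, enabled: bool) -> None:
        self._flags[name] = enabled

    def enabled(self, name: str) -> bool:
        # Unknown toggles default to off, so unverified optimizations
        # stay disabled in production until explicitly switched on.
        return self._flags.get(name, False)

toggles = FeatureToggles()
toggles.set("gpu_skinning", True)  # hypothetical optimization toggle

def update_characters() -> str:
    if toggles.enabled("gpu_skinning"):
        return "skin on GPU"   # new, unverified optimization path
    return "skin on CPU"       # known-good fallback

print(update_characters())               # skin on GPU
toggles.set("gpu_skinning", False)       # flip off if profiling shows a regression
print(update_characters())               # skin on CPU
```

During profiling, flipping a single toggle isolates the cost of one feature on low-end hardware; in production, the same switch acts as the safety net the article mentions.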

04 Selecting Appropriate Test Packages – Different build configurations (development build, performance‑test build, performance‑test‑with‑log build) affect frame rates and data accuracy; using non‑development builds yields more realistic performance metrics.
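The effect of build flavor on measurement can be sketched as below. The per-frame overhead numbers are invented for illustration, not measurements from the project; the point is only that instrumentation overhead inflates measured frame times, so a development build makes the game look slower than it is.

```python
from enum import Enum

class Build(Enum):
    DEVELOPMENT = "development"           # full debug hooks, highest overhead
    PERF_TEST_WITH_LOG = "perf-test+log"  # profiler markers plus on-device logging
    PERF_TEST = "perf-test"               # profiler markers only, closest to release

# Hypothetical per-frame instrumentation overhead, in milliseconds.
OVERHEAD_MS = {
    Build.DEVELOPMENT: 4.0,
    Build.PERF_TEST_WITH_LOG: 1.5,
    Build.PERF_TEST: 0.5,
}

def measured_frame_ms(true_frame_ms: float, build: Build) -> float:
    """What the profiler reports = real frame cost + build instrumentation cost."""
    return true_frame_ms + OVERHEAD_MS[build]

# The same 30 ms frame looks noticeably worse in a development build:
print(measured_frame_ms(30.0, Build.DEVELOPMENT))  # 34.0
print(measured_frame_ms(30.0, Build.PERF_TEST))    # 30.5
```

This is why the article recommends reading frame-rate numbers from a non-development (performance-test) build whenever the goal is a realistic metric rather than deep debugging.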

05 Making Test Behavior Close to Real Users – Continuous multi‑round testing, realistic session lengths, and monitoring performance across several matches reveal issues (e.g., memory growth, unexpected function calls) that single‑run tests miss.
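A multi-round harness of the kind described can be sketched as below. `play_match` is a hypothetical stand-in for driving one full game session; the simulated per-match memory readings (with a deliberate small leak) are invented for illustration, since a real harness would query the device.

```python
def play_match(round_index: int) -> float:
    """Simulated memory reading (MB) after one match; the 12 MB/round
    growth stands in for a slow leak a real session would exhibit."""
    return 500.0 + 12.0 * round_index

def detect_memory_growth(rounds: int, max_growth_mb: float) -> bool:
    """Play several consecutive matches and flag total memory growth
    beyond the allowed budget."""
    readings = [play_match(i) for i in range(rounds)]
    growth = readings[-1] - readings[0]
    return growth > max_growth_mb

# A single match shows nothing; five consecutive matches reveal the leak.
print(detect_memory_growth(rounds=1, max_growth_mb=30.0))  # False
print(detect_memory_growth(rounds=5, max_growth_mb=30.0))  # True
```

The same multi-round structure also surfaces the other issues the article mentions, such as functions that unexpectedly start firing only after several matches.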

06 Cautiously Evaluating Costly Optimizations – Trade‑offs such as pre‑loading assets improve frame stability but increase memory usage and load time; careful measurement and selective application prevent regressions, as illustrated by optimizing ball‑trajectory loading.
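The selective-application idea can be sketched as a budgeted preload. The asset names, sizes, and use counts below are hypothetical stand-ins for the ball-trajectory resources mentioned in the article; the design point is that only the hottest assets are pre-loaded, keeping the memory and load-time cost bounded.

```python
# Hypothetical trajectory assets: size in MB and how often each is used per match.
ASSET_SIZES_MB = {"traj_3pt": 2.0, "traj_layup": 1.5, "traj_dunk": 4.0}
USE_COUNTS = {"traj_3pt": 120, "traj_layup": 90, "traj_dunk": 8}

def select_preload(budget_mb: float) -> list[str]:
    """Greedily pre-load the most frequently used assets that fit
    within a fixed memory budget; everything else stays lazy-loaded."""
    chosen, used = [], 0.0
    for name in sorted(ASSET_SIZES_MB, key=lambda n: USE_COUNTS[n], reverse=True):
        size = ASSET_SIZES_MB[name]
        if used + size <= budget_mb:
            chosen.append(name)
            used += size
    return chosen

print(select_preload(budget_mb=4.0))  # ['traj_3pt', 'traj_layup']
```

Measuring before and after such a change, as the article urges, confirms whether the frame-stability gain actually outweighs the added memory and load time on target devices.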

07 Synchronizing Optimization Information – Clear communication of performance changes (e.g., disabling background rendering for full‑screen UI) prevents bugs like black screens and reduces the time spent troubleshooting by the wider team.

08 Conclusion – The article summarizes scattered experiences, emphasizing that sharing practical testing and optimization insights can help other teams avoid similar pitfalls.

Tags: Performance Testing, Build Configuration, Mobile Gaming, Team Communication, Game Optimization, Client Performance, Feature Toggles
Written by

NetEase LeiHuo Testing Center

LeiHuo Testing Center provides high-quality, efficient QA services, striving to become a leading testing team in China.
