
Game Testing Practices: Visible and Invisible Aspects, Recording, and Delivery

The article shares a QA newcomer’s one‑year reflection on game testing, covering visible vs invisible aspects, the importance of diff analysis, combining black‑box and white‑box testing, systematic test case recording, bug acceptance, and final delivery practices to improve quality in game development.

NetEase LeiHuo Testing Center

The author, a QA newcomer after one year at NetEase, reflects on the workflow and methods of game testing, emphasizing both visible (game content) and invisible (implementation logic) aspects, which correspond to black‑box and white‑box testing.

1. Visible and Invisible – Diff First

By reviewing version diffs from planners and developers, testers can quickly identify concrete changes, translate each diff into test items, and design test cases that cover both functional details and hidden logic. This helps avoid missing critical features, especially when planner documentation is incomplete.

Resources, scenes, and flow‑chart changes also require careful attention; testers must verify resource references, model updates, and potential cascade effects caused by minor edits.
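The "translate each diff into test items" step can be sketched as a small script. This is purely illustrative: the table layout (a dict of entry ID to field dict) and the skill/field names are assumptions, not the team's actual config format.

```python
# Sketch: turn a config-table diff into a checklist of test items.
# The table layout and field names are hypothetical examples.

def diff_to_test_items(old_table, new_table):
    """Compare two versions of a config table and emit test items."""
    items = []
    for key in new_table.keys() - old_table.keys():
        items.append(f"NEW entry {key}: verify full behavior")
    for key in old_table.keys() - new_table.keys():
        items.append(f"REMOVED entry {key}: verify no dangling references")
    for key in new_table.keys() & old_table.keys():
        for field, new_val in new_table[key].items():
            old_val = old_table[key].get(field)
            if old_val != new_val:
                items.append(
                    f"CHANGED {key}.{field}: {old_val!r} -> {new_val!r}, retest"
                )
    return items

old = {"skill_101": {"damage": 50, "cooldown": 3}}
new = {"skill_101": {"damage": 65, "cooldown": 3},
       "skill_102": {"damage": 20, "cooldown": 1}}
print(diff_to_test_items(old, new))
```

Each emitted line is a candidate test item, so nothing in the diff is silently skipped even when the design document lags behind the actual change.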

2. Understanding Implementation Logic

Testers need to read code and use GM commands to uncover hidden bugs. Detailed verification of skill effects, damage values, and edge‑case scenarios (e.g., extreme boss attack values) relies on understanding both configuration tables and underlying code.
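The kind of edge-case check this enables can be sketched as follows. The damage formula here is a hypothetical stand-in (the article does not give one); the point is that knowing the implementation tells you which extreme inputs, reachable via GM commands, are worth probing.

```python
# Sketch of white-box edge-case checking: verify that a HYPOTHETICAL
# damage formula behaves sanely at extreme config values, e.g. a boss
# whose attack is pushed far beyond normal ranges via GM commands.

def damage(attack, defense, min_damage=1):
    """Hypothetical formula: damage floors at min_damage, never negative."""
    return max(attack - defense, min_damage)

# Edge cases a pure black-box pass would rarely hit:
assert damage(10, 999_999) == 1           # extreme defense: floor applies
assert damage(2**31 - 1, 0) == 2**31 - 1  # extreme boss attack value
assert damage(0, 0) == 1                  # zero attack still deals the floor
```

Reading the actual formula in code is what reveals these boundaries; the config table alone would not show where the floor or the subtraction can misbehave.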

3. Combining Black‑Box and White‑Box Testing

Black‑box testing is useful at the start and end of a feature cycle to catch obvious issues, while white‑box testing targets specific code branches and logic flaws. Group testing and blind testing by other QA members help surface overlooked problems.

4. Test Recording

Test cases serve as blueprints; recording results within them shows progress and coverage. Tools such as Xmind for hierarchical case design, Confluence for tables and screenshots, and Excel for data‑driven verification are recommended. The original article includes screenshots of example records.
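The Excel-style data-driven verification can be sketched like this: each row carries the inputs and the expected value, and the recorded result is simply whether the actual value matched. The CSV columns and the `get_actual_damage` stub are illustrative assumptions; in practice the actual values would come from the game (e.g., via GM commands).

```python
import csv
import io

# Sketch: spreadsheet-driven verification. Column names and the
# get_actual_damage stub are hypothetical, standing in for real
# in-game queries.

CASES = io.StringIO("""skill_id,level,expected_damage
101,1,50
101,2,65
102,1,20
""")

def get_actual_damage(skill_id, level):
    # Stub standing in for an in-game query; hardcoded for the sketch.
    return {("101", "1"): 50, ("101", "2"): 65, ("102", "1"): 20}[(skill_id, level)]

failures = []
for row in csv.DictReader(CASES):
    actual = get_actual_damage(row["skill_id"], row["level"])
    if actual != int(row["expected_damage"]):
        failures.append(row)
print(f"{len(failures)} mismatches")
```

The spreadsheet itself then doubles as the test record: expected values, actual values, and pass/fail per row.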

5. Bug Acceptance and Summarization

When filing bugs, include reproduction steps, environment details, and screenshots. After fixing, attach verification screenshots and link bugs to the original requirement tickets to streamline tracking.
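A minimal completeness check along these lines might look as follows. The field names mirror the practices in the text, but the structure is an illustrative assumption, not any particular tracker's schema.

```python
# Sketch: flag incomplete bug reports before filing.
# REQUIRED mirrors the practices in the text; the report dict
# structure itself is a hypothetical convention.

REQUIRED = ["title", "repro_steps", "environment", "screenshot",
            "linked_requirement"]

def missing_fields(report):
    """Return the required fields that are absent or empty."""
    return [f for f in REQUIRED if not report.get(f)]

draft = {
    "title": "Skill 101 damage off by one at level 2",
    "repro_steps": "1. GM set level 2; 2. cast skill 101; 3. compare damage",
    "environment": "dev server, build 1.4.2",
    "screenshot": None,  # this draft still lacks its screenshot
    "linked_requirement": "REQ-123",
}
print(missing_fields(draft))
```

Running such a check before filing keeps reports uniform, and the `linked_requirement` field is what ties the bug back to the original ticket for tracking.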

6. Evolving Test Cases

Test cases must be continuously refined as implementation details emerge. Core cases are stored in team repositories (e.g., Xmind files or test‑case platforms) and later distilled into regression suites for version‑day testing.
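The distillation step can be sketched as tagging and filtering. The "core" tag and the case structure below are illustrative conventions, not the schema of any actual test-case platform.

```python
# Sketch: distilling a full case set into a version-day regression suite.
# The "core" tag and case records are hypothetical conventions.

ALL_CASES = [
    {"id": "C1", "title": "skill 101 base damage", "tags": {"core"}},
    {"id": "C2", "title": "skill 101 tooltip wording", "tags": set()},
    {"id": "C3", "title": "boss extreme attack clamp", "tags": {"core", "edge"}},
]

def regression_suite(cases):
    """Keep only cases tagged 'core' for the version-day regression pass."""
    return [c["id"] for c in cases if "core" in c["tags"]]

print(regression_suite(ALL_CASES))  # -> ['C1', 'C3']
```

As cases are refined, retagging is cheap, so the regression suite stays in sync with the evolving full case set instead of being maintained by hand.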

7. Test Delivery

Upon completion, update the Redmine ticket, provide a concise delivery note with key risk warnings, and attach relevant records. This ensures traceability and informs PMs of potential online risks.

Conclusion

Game testing blends systematic processes, thorough documentation, and proactive risk communication. By treating testing as a disciplined, recorded activity, QA can lock in quality and reduce repetitive, error‑prone work.

Tags: QA, test case management, black-box testing, game testing, diff analysis, white-box testing
Written by

NetEase LeiHuo Testing Center

LeiHuo Testing Center provides high-quality, efficient QA services, striving to become a leading testing team in China.
