
Tetris AI and Hand‑Play Strategies in the Tencent Geek Challenge – Technical Report

In the Tencent Geek Challenge, Wang Haosheng cleared all 10,000 blocks using a custom Tetris simulator and intensive hand-play, assisted by limited AI tooling. He scored 1.179 million points and took third place, showing that manual strategy outperformed his AI despite extensive competition and technical analysis.

Tencent Cloud Developer

In the Tencent Cloud+ Community’s innovative competition "Tencent Geek Challenge | Goose Rose Block", 4,570 participants showcased their technical prowess over a ten‑day period. Among them, top player Wang Haosheng combined hard‑core hand‑play with AI techniques to secure third place.

Expert Commentary

The author notes that both AI and manual play were attempted for scoring. The highest submitted score came from hand‑play, employing a strategy of building a high stack and clearing four lines whenever possible, while occasionally clearing three lines when the piece sequence required it. A local simulator was built to provide various metrics that assisted the entire process.

Participation Path

Wang joined the competition through the largest domestic modern Tetris QQ group, recommended by the group owner (known as "farter"). The group actively researches both modern and classic Tetris AI, having produced pioneering high‑efficiency battle AIs such as misamino and zzztoj, which are still regarded as some of the strongest latency‑free modern Tetris AIs worldwide.

Solution Process

Initially, the author tried manual play but found the game's mechanics (no lock delay, strict death conditions, no hold, only a single next-piece preview) too harsh for human performance, especially given the high drop speed displayed. Consequently, the focus shifted to analyzing the web page's JavaScript code.

The JavaScript was deliberately left unobfuscated and even hinted at AI usage. However, when the author modified the submitted score, the backend rejected it. Console output revealed a long sequence of block actions recorded like a replay, confirming that the backend validates the operation sequence via a special judge: the submitted sequence must reproduce the claimed result to receive points.
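The special-judge idea described here, replaying the submitted action sequence and accepting it only if it reproduces the claimed result, can be sketched as follows. The action encoding, the `special_judge` function, and the toy engine are all illustrative assumptions; the competition backend's actual format is not known.

```python
# Sketch of a special-judge check: deterministically replay a submitted
# action sequence and compare the resulting score with the claimed one.
# Names and the action encoding here are hypothetical.

def special_judge(actions: list[str], claimed_score: int, simulate) -> bool:
    """Accept a submission only if replaying it reproduces the claimed score.

    `simulate` stands in for the backend's deterministic game engine: it
    maps an action sequence to the final score it would produce.
    """
    try:
        replayed_score = simulate(actions)
    except Exception:  # malformed or illegal sequence -> reject
        return False
    return replayed_score == claimed_score

# Toy engine: 10 points per "drop" action, everything else scores nothing.
toy_engine = lambda acts: 10 * acts.count("drop")
```

Because the replay is deterministic, a forged score with no matching action sequence is rejected, which is exactly why editing the score alone failed.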

Further code inspection uncovered the core rules: the game consists of 10,000 blocks in a fixed sequence, and the more blocks stacked on the board, the higher the reward for a line clear. Merely surviving longer is therefore insufficient; the goal is maximum efficiency within the 10,000-block limit: four-line (Tetris) clears from a tall stack whenever possible, falling back to three-line clears when the piece sequence forces it.
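The report does not give the exact reward formula, but the principle that clears are worth more on a fuller board can be sketched with a hypothetical scoring function. The base values and the fill-ratio scaling below are illustrative placeholders, not the competition's real constants.

```python
# Hypothetical scoring sketch: line-clear reward grows with the number of
# blocks already stacked on the board. The base rewards and the scaling
# rule are assumptions for illustration only.
BASE_REWARD = {1: 100, 2: 300, 3: 500, 4: 800}  # assumed base per clear size
TOTAL_CELLS = 10 * 20                            # standard 10x20 board

def clear_reward(lines_cleared: int, filled_cells: int) -> int:
    """Reward for a clear, scaled up by how full the board currently is."""
    if lines_cleared == 0:
        return 0
    # Integer arithmetic: multiply the base by (1 + fill ratio).
    return BASE_REWARD[lines_cleared] * (TOTAL_CELLS + filled_cells) // TOTAL_CELLS
```

Under any rule of this shape, a four-line clear from a tall stack dominates the same clear on a sparse board, which is why the winning strategy builds high before clearing.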

Two solution directions emerged: (1) develop an AI to automatically generate high‑scoring strategies (though outcomes are hard to predict), and (2) build a custom simulator to manually execute the solution, allowing rapid estimation of expected scores.

Building the Simulation Platform

The author already had a partially completed modern Tetris AI platform. By slightly modifying the rotation system, a functional simulator was assembled in under an hour. This platform offers highly customizable rule interfaces, enabling quick adaptation to the competition’s requirements.
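The core operation such a simulator must get right, sweeping full rows after a piece locks, is small. A minimal, rules-agnostic sketch follows; the board layout and 0/1 cell encoding are assumptions, not details from the author's platform.

```python
# Minimal line-clear sweep for a grid-based Tetris simulator.
# The board is a list of rows (top row first); 0 = empty, 1 = filled.

def clear_full_rows(board: list[list[int]]) -> tuple[list[list[int]], int]:
    """Remove full rows, pad the top with empty rows, return (board, count)."""
    width = len(board[0])
    kept = [row for row in board if not all(row)]   # rows that survive
    cleared = len(board) - len(kept)
    empty_rows = [[0] * width for _ in range(cleared)]
    return empty_rows + kept, cleared

# Example: a 4-wide, 3-row board whose bottom row is full.
board = [
    [0, 0, 0, 0],
    [1, 0, 1, 0],
    [1, 1, 1, 1],
]
```

Keeping rules like rotation and spawn behavior behind separate interfaces, as the author's platform does, is what makes adapting to a new competition's rule set a matter of minutes rather than days.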

Both AI and manual approaches were pursued simultaneously. The AI, inspired by a top internal competitor’s 1.39 million‑point algorithm, did not achieve a competitive score due to limited optimization time.

Final Solution

The AI approach remained under-performing, so the final result relied on manual play. Using the simulator, the author added features such as a rollback system, save functionality, a six-piece preview (instead of the game's single next-piece preview), and timing information for upcoming pieces. Additional statistics and death-condition checks were also incorporated. With these enhancements, the manual effort culminated in completing all 10,000 blocks and attaining a score of 1.179 million points, securing third place.

In retrospect, the author regrets that the AI could not deliver a higher score, but values the experience gained. Future plans include polishing the GeekAI platform to achieve a complete competition finish.

Author Bio

Wang Haosheng – Software Development Engineer. Achievements: 2018 ICPC Qingdao Silver Medal, CCPC Qinhuangdao Silver Medal, Zhejiang Province College Programming Contest Gold Medal.

Tags: Algorithm, Simulation, AI, Game Development, Competition, Tetris
Written by

Tencent Cloud Developer

Official Tencent Cloud community account that brings together developers, shares practical tech insights, and fosters an influential tech exchange community.
