Python‑Based Recoil Compensation for PUBG Using Image Recognition and Mouse Automation
This article explains how to build a Python tool that automatically compensates weapon recoil in PUBG by capturing the screen, recognizing equipment with OpenCV and SSIM, and moving the mouse via pynput, pyautogui, and pydirectinput based on weapon data and user input.
1. Overview
The method uses image recognition to achieve recoil compensation without injecting into the game process or reading its memory, which lowers the risk of detection compared with memory-based tools.
1.1 Effect
It works by recognizing the current weapon and its attachments, then moving the mouse to counteract recoil.
1.2 Prerequisite Knowledge
Different weapons have varying recoil patterns and fire rates; attachments also affect recoil.
A weapon's vertical (Y-axis) recoil follows a fixed per-bullet pattern, while its horizontal (X-axis) recoil is random, so the program only compensates along the Y-axis.
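To make the weapon-data idea concrete, here is a minimal sketch of a per-weapon recoil table. All numbers are invented for illustration (real values must be measured in-game, as section 3 notes), and the field names `basic`/`speed`/`k` simply mirror the attributes used by the snippets later in this article.

```python
# Hypothetical per-weapon recoil table. The numbers are illustrative only,
# NOT measured game data; field names mirror the article's later snippets.
WEAPON_DATA = {
    "m416": {
        "basic": [4, 5, 6, 7, 8, 9, 9, 10],  # per-bullet Y offset (pixels)
        "speed": 86,                          # fire interval (ms)
        "k": 1.0,                             # attachment scaling factor
    },
    "akm": {
        "basic": [6, 7, 9, 10, 12, 13, 13, 14],
        "speed": 100,
        "k": 1.0,
    },
}

def y_offset(weapon: str, bullet_index: int, hold_k: float = 1.0) -> int:
    """Vertical compensation (in pixels) for a given bullet of a weapon."""
    data = WEAPON_DATA[weapon]
    basic = data["basic"]
    i = min(bullet_index, len(basic) - 1)  # clamp past the table's end
    return int(round(basic[i] * data["k"] * hold_k))
```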
1.3 Implementation Principle
Use Python's pynput module to listen to keyboard and mouse events.
Listen for left‑mouse button press to start moving the mouse; release to stop. Listen for keyboard keys (e.g., Tab) to open the inventory and start screen capture.
Capture screenshots with pyautogui, process images with OpenCV, compare them using the SSIM algorithm to identify weapons and attachments, and move the mouse with pydirectinput.
2. Detailed Steps
2.1 pynput Keyboard Listening
<code>import pynput.keyboard as keyboard

# Listen to keyboard events on a background thread
def listen_keyboard():
    listener = keyboard.Listener(on_press=onPressed, on_release=onRelease)
    listener.start()  # non-blocking; callbacks run on the listener thread
</code>pynput listener callbacks run on the listener's own thread, but a long-running handler blocks later events, so heavy work (such as screenshot recognition) should be dispatched asynchronously.
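The article only shows the keyboard listener; a matching mouse listener, used to start and stop compensation on left-button press and release, could look like this sketch. The `firing` flag name is an assumption, and the try/except around the import lets the snippet load on machines without a desktop session, where pynput cannot initialize.

```python
import threading

try:
    import pynput.mouse as mouse  # needs a desktop session to initialize
    HAVE_PYNPUT = True
except Exception:
    HAVE_PYNPUT = False

firing = threading.Event()  # set while the left mouse button is held

def set_firing(pressed: bool) -> None:
    # Shared flag that the recoil-compensation loop polls
    if pressed:
        firing.set()
    else:
        firing.clear()

def on_click(x, y, button, pressed):
    if HAVE_PYNPUT and button == mouse.Button.left:
        set_firing(pressed)

def listen_mouse():
    listener = mouse.Listener(on_click=on_click)
    listener.start()  # non-blocking; callbacks run on the listener thread
    return listener
```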
2.2 Event Handling
A c_equipment class encapsulates weapon information. The Tab key is monitored asynchronously to detect equipment.
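A minimal sketch of that state holder and the async dispatch might look like the following; the attribute names and the `detectEquipment` placeholder are assumptions, not the repository's exact code.

```python
import threading

class c_equipment:
    # Minimal state holder; attribute names are assumptions
    switch = 0        # 1/2 = main weapons, 3 = pistol/knife/grenade (skip)
    weapon1 = 'none'  # recognized name of main weapon 1
    weapon2 = 'none'  # recognized name of main weapon 2

def detectEquipment():
    # Placeholder for the screenshot + recognition pipeline (sections 2.3-2.6)
    pass

def asyncHandle():
    # Run detection on a worker thread so the pynput callback returns quickly
    threading.Thread(target=detectEquipment, daemon=True).start()
```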
<code>def onRelease(key):
    try:
        if '1' == key.char:
            c_equipment.switch = 1  # Main weapon 1
        elif '2' == key.char:
            c_equipment.switch = 2  # Main weapon 2
        elif '3' == key.char:
            c_equipment.switch = 3  # Pistol (no recoil compensation)
        elif '4' == key.char:
            c_equipment.switch = 3  # Knife
        elif '5' == key.char:
            c_equipment.switch = 3  # Grenade
    except AttributeError:
        # Special keys have no .char attribute and raise AttributeError
        if 'tab' == key.name:  # Tab key triggers async equipment detection
            asyncHandle()
        elif 'num_lock' == key.name:  # Toggle program on/off
            changeOpen()
        elif 'shift' == key.name:
            c_contants.hold = False
</code>
2.3 pyautogui Screenshot
Capture the screen when the equipment panel is open.
pyautogui.screenshot(region=[x, y, w, h])
After capturing, convert the image to grayscale, apply adaptive binarization, and save it.
<code>import cv2
import numpy
import pyautogui

def adaptive_binarization(img):
    # Adaptive thresholding: local mean minus C over a 3x3 neighbourhood
    maxval = 255
    blockSize = 3
    C = 5
    img2 = cv2.adaptiveThreshold(img, maxval, cv2.ADAPTIVE_THRESH_MEAN_C,
                                 cv2.THRESH_BINARY, blockSize, C)
    return img2

# Screenshot
def shotCut(x, y, w, h):
    im = pyautogui.screenshot(region=[x, y, w, h])
    # pyautogui returns a PIL image in RGB channel order, so convert from RGB
    screen = cv2.cvtColor(numpy.asarray(im), cv2.COLOR_RGB2GRAY)
    temp = adaptive_binarization(screen)
    return temp

def saveScreen():
    # Equipment panel region; coordinates depend on screen resolution
    screen1 = shotCut(1780, 125, 614, 570)
    cv2.imwrite("./resource/shotcut/screen.bmp", screen1)
</code>
2.4 Asset Preparation
Prepare a collection of reference images for weapon names, stocks, grips, muzzle devices, etc., to be used for template matching.
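One plausible on-disk layout for those assets, inferred from the paths used in this article's snippets (the actual repository may organize them differently), together with a small startup check:

```python
import os

# Inferred layout (directory names are assumptions based on the article):
#   resource/
#     guns/      reference crops of weapon names (m416.bmp, akm.bmp, ...)
#     muzzles/   muzzle devices (compensator, suppressor, ...)
#     grips/     grips (vertical, angled, ...)
#     stocks/    stocks
#     shotcut/   screen.bmp written by saveScreen()
ASSET_DIRS = ["guns", "muzzles", "grips", "stocks", "shotcut"]

def check_assets(root="./resource"):
    """Return the asset subdirectories that are missing, so the tool can
    fail fast at startup instead of mis-recognizing everything."""
    return [d for d in ASSET_DIRS if not os.path.isdir(os.path.join(root, d))]
```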
2.5 Image Cropping
Crop the captured equipment panel to match the size of reference assets, e.g., extracting the weapon name region.
<code># Load the previous screenshot as grayscale
screen = cv2.imread("./resource/shotcut/screen.bmp", 0)
# Crop the weapon 1 name area (rows 0-40, columns 45-125)
screenWepon1 = screen[0:40, 45:125]
# Compare against the reference directory
w1Name = compareAndGetName(screenWepon1, "./resource/guns/")
</code>
2.6 Image Comparison
<code>import os

# Compare the cropped region with every reference image in dir and return
# the best match whose SSIM score exceeds 0.5
def compareAndGetName(screenImg, dir):
    content = os.listdir(dir)
    name = 'none'
    best = 0
    for fileName in content:
        curWepone = cv2.imread(dir + fileName, 0)
        res = calculate_ssim(numpy.asarray(screenImg), numpy.asarray(curWepone))
        if best < res and res > 0.5:
            best = res
            name = str(fileName)[:-4]  # strip the file extension, e.g. ".bmp"
    return name
</code>SSIM algorithm implementation:
<code># One possible ssim() implementation is scikit-image's structural_similarity
from skimage.metrics import structural_similarity as ssim

def calculate_ssim(img1, img2):
    if not img1.shape == img2.shape:
        raise ValueError('Input images must have the same dimensions.')
    if img1.ndim == 2:
        return ssim(img1, img2)
    elif img1.ndim == 3:
        if img1.shape[2] == 3:
            # Average SSIM over the three color channels
            ssims = []
            for i in range(3):
                ssims.append(ssim(img1[:, :, i], img2[:, :, i]))
            return numpy.array(ssims).mean()
        elif img1.shape[2] == 1:
            return ssim(numpy.squeeze(img1), numpy.squeeze(img2))
    else:
        raise ValueError('Wrong input image dimensions.')
</code>
2.7 Mouse Operation
After identifying the weapon and its attachments, listen for left‑mouse button press and move the mouse downwards according to the recoil pattern.
<code>import time
import pydirectinput

def moveMouse():
    curWepone = getCurrentWepone()
    if curWepone.name == 'none':
        return
    basic = curWepone.basic   # per-bullet Y offsets
    speed = curWepone.speed   # fire interval in ms
    startTime = round(time.perf_counter(), 3) * 1000
    for i in range(curWepone.maxBullets):
        if not canFire():
            break
        holdK = 1.0
        if c_contants.hold:
            holdK = curWepone.hold  # extra factor while holding breath
        moveSum = int(round(basic[i] * curWepone.k * holdK, 2))
        while True:
            if moveSum > 10:
                pydirectinput.move(xOffset=0, yOffset=10, relative=True)
                moveSum -= 10
            elif moveSum > 0:
                pydirectinput.move(xOffset=0, yOffset=moveSum, relative=True)
                moveSum = 0
            elapsed = round(time.perf_counter(), 3) * 1000 - startTime
            if not canFire() or elapsed > (i + 1) * speed + 10:
                break
            time.sleep(0.01)
</code>The while loop splits the required Y-axis movement into steps of at most 10 pixels to avoid jerky screen motion, and paces each bullet's compensation to the weapon's fire interval (roughly 86 ms for some rifles).
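The canFire() helper is referenced above but never shown; one plausible implementation combines the program's on/off switch, the mouse-button state, and the current weapon slot. All field names here are assumptions (c_contants keeps the repository's spelling):

```python
class c_contants:
    # Shared runtime flags; field names are assumptions
    open = True     # toggled by the num_lock handler
    firing = False  # set while the left mouse button is held
    hold = False    # breath-hold (shift) modifier

class c_equipment:
    switch = 1      # 1/2 = main weapons, 3 = no compensation

def canFire():
    """Compensate only while enabled, actively firing, and holding a main weapon."""
    return c_contants.open and c_contants.firing and c_equipment.switch in (1, 2)
```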
3. The Most Challenging Part
Each weapon has a unique recoil pattern; accurate data must be collected manually in the game's training range.
4. Conclusion
The complete code has been uploaded to Gitee; interested readers can clone the repository and experiment.
Repository: https://gitee.com/lookoutthebush/PUBG