Using Selenium and Python to Identify Who Has Blocked You on QQ Space
This article demonstrates how to use Python, Selenium, and basic web-scraping techniques to log into QQ Space, extract the authentication cookies, compute the required g_tk parameter, retrieve the friend list via QQ's API, and automatically detect and blacklist users who have blocked you.
The author walks through building a small Python crawler that discovers which QQ friends have blocked the user on QQ Space. The walkthrough begins with setting up the environment (Python 3.7.4) and installing the third-party libraries requests, lxml, threadpool, and selenium.
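Before running the script, it is worth confirming that those packages are actually importable. A minimal stdlib-only sketch (the package names are the ones listed above; nothing else is assumed):

```python
# Sketch: report which of the article's third-party packages are missing.
import importlib.util

REQUIRED = ("requests", "lxml", "threadpool", "selenium")

missing = [pkg for pkg in REQUIRED if importlib.util.find_spec(pkg) is None]
if missing:
    print("Install missing packages with: pip install " + " ".join(missing))
else:
    print("All required packages are available.")
```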
Login and Cookie Retrieval
import json
import os

def search_cookie():
    # Load the cached cookie dict, logging in first if no cache exists.
    if not os.path.exists('cookie_dict.txt'):
        get_cookie_json()
    with open('cookie_dict.txt', 'r') as f:
        cookie = json.load(f)
    return cookie
import time
from getpass import getpass
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By

def get_cookie_json():
    # Log into QQ Space with headless Chrome and save the cookies to disk.
    qq_number = input('Enter your QQ number: ')
    password = getpass('Enter your QQ password: ')
    login_url = 'https://i.qq.com/'
    chrome_options = Options()
    chrome_options.add_argument('--headless')
    driver = webdriver.Chrome(options=chrome_options)
    driver.get(login_url)
    # The login form lives inside an iframe.
    driver.switch_to.frame('login_frame')
    # Switch from QR-code login to account/password login.
    driver.find_element(By.XPATH, '//*[@id="switcher_plogin"]').click()
    time.sleep(1)
    driver.find_element(By.XPATH, '//*[@id="u"]').send_keys(qq_number)
    driver.find_element(By.XPATH, '//*[@id="p"]').send_keys(password)
    time.sleep(1)
    driver.find_element(By.XPATH, '//*[@id="login_button"]').click()
    time.sleep(1)
    cookie_list = driver.get_cookies()
    driver.quit()
    cookie_dict = {}
    for cookie in cookie_list:
        if 'name' in cookie and 'value' in cookie:
            cookie_dict[cookie['name']] = cookie['value']
    with open('cookie_dict.txt', 'w') as f:
        json.dump(cookie_dict, f)
    return True

Finding the Friend List API
By inspecting the network traffic on the QQ Space friend page (using the browser's F12 developer tools), the author locates the request URL that returns the friend list; its response contains a friend field.
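That endpoint returns JSONP, i.e. JSON wrapped in a callback call, so the wrapper has to be stripped before parsing. A small sketch of that step (the callback name and payload below are made-up stand-ins, not the real API response):

```python
import json

# Hypothetical JSONP payload shaped like the friend-list response.
raw = '_Callback({"data": {"items_list": [{"uin": 10001}, {"uin": 10002}]}})'

# Strip everything up to the first '(' and after the last ')'.
body = raw.split('(', 1)[1].rsplit(')', 1)[0]
friends = [item['uin'] for item in json.loads(body)['data']['items_list']]
print(friends)  # → [10001, 10002]
```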
Computing the g_tk Encryption Parameter
def get_g_tk():
    # Derive the g_tk token from the p_skey cookie with QQ's rolling hash.
    p_skey = cookie['p_skey']
    h = 5381
    for i in p_skey:
        h += (h << 5) + ord(i)
    return h & 2147483647

Retrieving Friends' QQ Numbers
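The hash is deterministic, so it is easy to sanity-check on a made-up key. In this sketch the hash is factored into a function that takes the key as a parameter (the input string is not a real p_skey):

```python
def g_tk_of(p_skey):
    # Same rolling hash as get_g_tk, parameterized for testing.
    h = 5381
    for c in p_skey:
        h += (h << 5) + ord(c)
    return h & 2147483647

print(g_tk_of("AB"))  # → 5862120
```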
import urllib.parse
import requests

def get_friends_uin(g_tk):
    # Query the friend_ship_manager API and collect every friend's QQ number.
    yurl = 'https://user.qzone.qq.com/proxy/domain/r.qzone.qq.com/cgi-bin/tfriend/friend_ship_manager.cgi?'
    data = {
        'uin': cookie['ptui_loginuin'],
        'do': 1,
        'g_tk': g_tk
    }
    url = yurl + urllib.parse.urlencode(data)
    res = requests.get(url, headers=headers, cookies=cookie)
    # The response is JSONP; strip the callback wrapper before parsing.
    r = res.text.split('(', 1)[1].rsplit(')', 1)[0]
    friends_list = json.loads(r)['data']['items_list']
    return [f['uin'] for f in friends_list]

Detecting Blocked Friends
from lxml import etree

def get_blacklist(friends):
    # Visit each friend's QQ Space; a permission-denied tip means we are blocked.
    access_denied = []
    yurl = 'https://user.qzone.qq.com/'
    for friend in friends:
        print("Checking: " + str(friend))
        url = yurl + str(friend)
        res = requests.get(url, headers=headers, cookies=cookie)
        tip = etree.HTML(res.text).xpath('/html/body/div/div/div[1]/p/text()')
        if len(tip) > 0:
            print(str(friend) + " has blocked me!")
            access_denied.append(friend)
    return access_denied

Automated Blacklisting
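The absolute XPath above is fragile and will break if QQ Space changes its page layout. A hedged alternative is to look for the permission-denied tip text directly in the HTML. The marker string below is an assumption, not the verified text QQ Space serves; check what your own blocked pages actually contain before relying on it:

```python
# Sketch: substring-based block detection. BLOCK_MARKER is a guess at the
# "no permission" tip text, not the confirmed string QQ Space uses.
BLOCK_MARKER = "没有权限"  # hypothetical "no permission" marker

def looks_blocked(html: str) -> bool:
    # True if the page contains the permission-denied tip.
    return BLOCK_MARKER in html

print(looks_blocked("<p>抱歉，您没有权限访问</p>"))  # → True
print(looks_blocked("<p>欢迎来到我的空间</p>"))  # → False
```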
def pull_black():
    # Blacklist every user recorded in access_denied.txt.
    global cookie
    cookie = search_cookie()
    with open('access_denied.txt', 'r') as f:
        access_denied = f.readlines()
    g_tk = get_g_tk()
    yurl = "https://user.qzone.qq.com/proxy/domain/w.qzone.qq.com/cgi-bin/right/cgi_black_action_new?"
    url = yurl + urllib.parse.urlencode({'g_tk': g_tk})
    for fake_friend in access_denied:
        fake_friend = fake_friend.strip()
        data = {
            'uin': cookie['ptui_loginuin'],
            'action': '1',
            'act_uin': fake_friend,
            'fupdate': '1',
            'qzreferrer': 'https://user.qzone.qq.com/1223411083'
        }
        res = requests.post(url, headers=headers, data=data, cookies=cookie)
        print(str(fake_friend) + " has been blacklisted")
    print("All blacklisted. What a relief!")

The article concludes with a personal note expressing frustration at being blocked and encouraging readers to run the script to blacklist the offending friends.
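One small gap in the pipeline: get_blacklist returns a list of UINs, while pull_black reads them from access_denied.txt, so something must write that file in between. A minimal sketch of that hand-off (the file name comes from the article; the helper names write_denied and read_denied are mine):

```python
def write_denied(uins, path='access_denied.txt'):
    # One QQ number per line, matching what pull_black expects to read.
    with open(path, 'w') as f:
        for uin in uins:
            f.write(str(uin) + '\n')

def read_denied(path='access_denied.txt'):
    # Read the numbers back, skipping blank lines.
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

write_denied([10001, 10002])
print(read_denied())  # → ['10001', '10002']
```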