Tag

code poisoning

0 views collected around this technical thread.

Tencent Technical Engineering
Tencent Technical Engineering
Mar 27, 2025 · Information Security

AI Programming Assistants Can Be Hijacked: Configuration File Poisoning and Security Risks

AI programming assistants such as GitHub Copilot and Cursor can be hijacked through poisoned configuration files that hide malicious prompts using invisible Unicode characters, exposing developers to risks like data leakage, DDoS, cryptomining and trojan injection, so they must avoid unknown configs, sandbox generated code, and employ static analysis and AI audits to mitigate threats.

AI securityConfiguration Filescode poisoning
0 likes · 12 min read
AI Programming Assistants Can Be Hijacked: Configuration File Poisoning and Security Risks