Cybersecurity researchers have disclosed a new supply chain attack vector dubbed "Rules File Backdoor" that affects Artificial Intelligence (AI)-powered code editors such as GitHub Copilot and Cursor, causing them to inject malicious code.
Pillar Security co-founder and CTO Ziv Karliner said in a technical report shared with The Hacker News, "This technique enables hackers to silently compromise AI-generated code by injecting hidden malicious instructions into seemingly innocent configuration files used by Cursor and GitHub Copilot."
"By exploiting hidden Unicode characters and sophisticated evasion techniques in the model-facing instruction payload, threat actors can manipulate the AI to insert malicious code that bypasses typical code reviews."
The attack vector is notable for the fact that it allows malicious code to propagate silently across projects, posing a supply chain risk.
The attack hinges on the rules files used by AI agents to guide their behavior, helping users define best coding practices and project architecture.
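For readers unfamiliar with these files: they are ordinary text or Markdown documents checked into the repository that the editor feeds to the model alongside each request. The file name and contents below are a hypothetical sketch, not taken from the report:

```
# .cursorrules (hypothetical example of a benign rules file)
- Follow the project's TypeScript style guide.
- Prefer async/await over raw Promises.
- Always add JSDoc comments to exported functions.
```

Because such files look like harmless style guidance and are rarely scrutinized in code review, they make an attractive hiding place for injected instructions.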
Specifically, it involves embedding carefully crafted prompts within seemingly benign rule files, causing the AI tool to generate code containing security vulnerabilities or backdoors. In other words, the poisoned rules nudge the AI into producing nefarious code.
This can be accomplished by using zero-width joiners, bidirectional text markers, and other invisible characters to conceal the malicious instructions, and by exploiting the AI's ability to interpret natural language so that semantic patterns in the payload trick the model into overriding its ethical and safety constraints and generating vulnerable code.
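A defensive takeaway is that these invisible characters are easy to detect mechanically even though they are invisible to a human reviewer. The following is a minimal sketch (not from the report) of a scanner that flags zero-width and bidirectional control characters in a rules file's text:

```python
# Sketch: flag invisible Unicode characters that could hide instructions
# in a rules file (e.g. a ".cursorrules"-style file) from a human reviewer.
import unicodedata

# Characters commonly abused to hide text: zero-width characters and
# bidirectional embedding/override/isolate controls.
SUSPICIOUS = {
    "\u200b",  # ZERO WIDTH SPACE
    "\u200c",  # ZERO WIDTH NON-JOINER
    "\u200d",  # ZERO WIDTH JOINER
    "\u2060",  # WORD JOINER
    "\u202a", "\u202b", "\u202c", "\u202d", "\u202e",  # bidi embed/override
    "\u2066", "\u2067", "\u2068", "\u2069",            # bidi isolates
}

def find_hidden_chars(text: str):
    """Return (line, column, character name) for each suspicious character."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for col, ch in enumerate(line, start=1):
            # Also catch any other "format" (Cf) category character.
            if ch in SUSPICIOUS or unicodedata.category(ch) == "Cf":
                findings.append((lineno, col, unicodedata.name(ch, hex(ord(ch)))))
    return findings

if __name__ == "__main__":
    sample = "Always use best practices.\u200b\u200dhidden directive"
    for lineno, col, name in find_hidden_chars(sample):
        print(f"line {lineno}, col {col}: {name}")
```

A check like this could run in CI or a pre-commit hook over any configuration files an AI editor consumes; a clean file simply produces no findings.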
Following responsible disclosure in late February and March 2024, both Cursor and GitHub have stated that users are responsible for reviewing and accepting the suggestions generated by the tools.
"'Rules File Backdoor' represents a significant risk by weaponizing the AI itself as an attack vector, effectively turning the developer's most trusted assistant into an unwitting accomplice, potentially affecting millions of end users through compromised software," Karliner said.
"Once a poisoned rule file is incorporated into a project repository, it affects all future code-generation sessions by team members. Furthermore, the malicious instructions often survive project forking, creating a vector for supply chain attacks that can affect downstream dependencies and end users."