Attackers used AI prompts to silently exfiltrate code from GitHub repositories
A critical vulnerability in GitHub Copilot Chat, dubbed “CamoLeak,” allowed attackers to silently steal source code and secrets from private repositories using a sophisticated prompt injection tec...