Prompt Injection
Attacks that manipulate AI language models through crafted inputs to bypass safety controls
4 Articles
893 Topics
Articles with this Tag
Security: Google Gemini Prompt Injection Flaw Let Malicious Calendar Invites Expose Private Meeting Data and Create Deceptive Events
Researchers showed a Gemini prompt injection via Calendar invites that bypassed privacy controls and...
Security: Reprompt attack showed how a single click could silently siphon data from Microsoft Copilot Personal sessions
Security researchers disclosed "Reprompt," a single-click prompt injection chain that could quietly ...
Security: Reprompt attack exposed a one-click path to hijack Microsoft Copilot sessions
The Reprompt attack combined URL-prefilled prompts with session persistence to hijack Microsoft Copi...
Security: ShadowLeak and ZombieAgent: Critical ChatGPT Flaws Enable Zero-Click Data Exfiltration from Gmail, Outlook, and GitHub
Security researchers have disclosed critical vulnerabilities in ChatGPT that allowed attackers to si...