DEV Community

Cover image for Reprompt: The One-Click Attack Stealing Microsoft Copilot Data
BeyondMachines for BeyondMachines

Posted on • Originally published at beyondmachines.net

Reprompt: The One-Click Attack Stealing Microsoft Copilot Data

Summary

Varonis researchers discovered "Reprompt," a one-click attack on Microsoft Copilot Personal that bypasses safety filters to steal user data through URL parameter injection. The flaw allowed attackers to steal conversation history and personal files by chaining server-side requests even after a user closed the chat.

Take Action:

Advise your team to never click links that pre-fill AI prompts or even huge links which may include code, trackers or other junk. All these can bypass security filters to steal private data. Ensure your AI tools are updated to the latest versions to close known security gaps.


Read the full article on BeyondMachines


This article was originally published on BeyondMachines

Top comments (0)