Microsoft has fixed a vulnerability in its Copilot AI assistant that allowed hackers to pluck a host of sensitive user data with a single click on a URL. The hackers in this case were white hat researchers from security firm Varonis. The net effect of their multistage attack was that they exfiltrated data, including the target's name, location, and details of specific events, from the user's Copilot chat history. The attack continued to run even after the user closed the Copilot chat, with no...