Microsoft Copilot Security and Deployment Controversies
New reporting highlighted a prompt-injection phishing risk in Microsoft Copilot after Permiso researchers found that attacker-controlled text embedded in emails could manipulate Copilot-generated summaries through cross-prompt injection attacks. The issue could cause Copilot to present deceptive security alerts or malicious instructions inside a trusted Microsoft 365 interface, increasing the likelihood that users will believe and act on attacker content. Separate coverage also noted broader security and privacy concerns around Microsoft’s AI ecosystem, including criticism of Windows Recall for capturing and storing snapshots of user activity that Copilot can analyze, even after Microsoft added stronger protections following earlier backlash.
Microsoft also faced continued scrutiny over how aggressively it is pushing Copilot into user environments. The company temporarily halted plans to automatically install the Microsoft 365 Copilot app on eligible Windows systems outside the EEA, though existing installations remain in place and administrators can still deploy it manually. Public criticism of Copilot’s quality and Microsoft’s AI strategy also spilled into the company’s Discord community, where moderation actions against users mocking “Microslop” drew further attention to dissatisfaction with rushed AI integration, privacy concerns, and the perception that Microsoft is forcing AI features into products despite unresolved trust and security issues.
Timeline
Mar 17, 2026
Permiso researchers demonstrate Copilot prompt-injection phishing risk
Researchers at Permiso showed that attacker-controlled text embedded in emails could manipulate Microsoft Copilot summaries through cross-prompt injection. Their proof of concept affected Microsoft 365 contexts including Outlook and Teams, revealing inconsistent safety behavior across interfaces.
Mar 17, 2026
Microsoft temporarily disables auto-install of Microsoft 365 Copilot app
Microsoft updated its Microsoft 365 message center and admin dashboard to say the planned December 2025 automatic installation of the Microsoft 365 Copilot app was temporarily disabled. Existing installations were left in place, but new automatic deployments were paused until further notice.
Mar 17, 2026
Microsoft adds controls to remove Copilot from managed devices
As part of broader Copilot administration changes, Microsoft introduced a policy allowing IT administrators to uninstall Copilot from managed Windows devices. This gave organizations a way to limit or reverse Copilot deployment.
Mar 16, 2026
Microsoft locks down Copilot Discord server after moderation backlash
Microsoft restricted activity on its Copilot Discord server after continued mockery and what it later described as spammers posting harmful content. The lockdown drew further criticism of the company's AI strategy and moderation response.
Mar 16, 2026
Microsoft blocks 'Microslop' term on Copilot Discord server
After users repeatedly mocked Microsoft by calling it 'Microslop' on the Copilot Discord server, the company temporarily blocked the term. Users reportedly found ways around the restriction, prolonging the disruption.
Dec 1, 2025
Microsoft begins rollout of auto-installing Microsoft 365 Copilot app
Microsoft started rolling out automatic installation of the Microsoft 365 Copilot app to Windows devices outside the European Economic Area that had Microsoft 365 desktop apps installed. The deployment was originally announced to begin in early December 2025 and was expected to complete by mid-December.
See the full picture in Mallory
Mallory subscribers get deeper analysis on every story, including:
Who’s affected and how
Deep-dive technical analysis
Actionable next steps for your team
IPs, domains, hashes, and more
Ask questions and take action on every story
Filter by topic, classification, timeframe
Get matching stories delivered automatically
Sources
Related Stories

Microsoft Copilot Security Research: Prompt-Injection Phishing Risk and Copilot Studio Audit-Logging Gaps
Security researchers reported two distinct Microsoft Copilot-related risks: (1) **cross prompt injection** against *Microsoft Copilot* email summarization surfaces that can cause attacker-supplied text in an email to be treated like instructions, shaping the summary into a convincing in-product “security alert” and creating a phishing path that does not rely on attachments or macros; and (2) **audit-logging gaps in Microsoft Copilot Studio** where certain administrative actions for Copilot Studio agents (e.g., around sharing, authentication, logging, and publication) were not consistently recorded in Microsoft 365’s Unified Audit Log, potentially reducing defenders’ ability to detect malicious or unauthorized agent changes. Permiso described how Copilot’s behavior varies across Outlook’s inline *Summarize* experience, the Outlook Copilot pane/add-in, and Teams-based summarization, with the core risk being **trust transfer**—users may treat Copilot output as system-generated even when it is attacker-influenced—and warned that retrieval across Microsoft 365 (Teams/OneDrive/SharePoint) could amplify impact if chained. Datadog Security Labs stated it reported Copilot Studio logging issues to **MSRC**, that Microsoft remediated logging for the affected events by **October 5, 2025**, and that Datadog later observed a **regression** where some events again failed to log consistently, which it also reported to Microsoft.
1 months ago
Novel Attacks Exploit Microsoft Copilot and Copilot Studio for Data Theft and OAuth Token Compromise
Security researchers have identified two distinct attack techniques targeting Microsoft's AI-powered platforms. The first, dubbed **CoPhish**, leverages Microsoft Copilot Studio agents to deliver fraudulent OAuth consent requests through legitimate Microsoft domains, enabling attackers to steal OAuth tokens. By customizing Copilot Studio chatbots and exploiting the platform's "demo website" feature, attackers can trick users into authenticating with malicious applications, potentially granting unauthorized access to sensitive resources. Microsoft has acknowledged the issue and is working on product updates to mitigate the risk, emphasizing the need for organizations to strengthen governance and consent processes. Separately, a vulnerability in Microsoft 365 Copilot was discovered that allowed attackers to use indirect prompt injection via Mermaid diagrams to exfiltrate sensitive tenant data, such as emails. By embedding malicious instructions in seemingly benign prompts, attackers could manipulate Copilot to retrieve and encode confidential information. Although Microsoft has since patched this flaw, the incident highlights the emerging risks associated with integrating AI assistants and third-party tools, as well as the challenges in securing complex, automated workflows within enterprise environments.
1 months ago
Microsoft expands Microsoft 365 Copilot data controls and cross-product data access settings
Microsoft is tightening and clarifying how **Copilot** can access and process user and organizational data across the Microsoft ecosystem. Microsoft is expanding **Purview Data Loss Prevention (DLP)** enforcement so policies that block Copilot from processing restricted/sensitivity-labeled content will apply not only to files in **SharePoint** and **OneDrive**, but also to **locally stored** Word, Excel, and PowerPoint documents. The change is planned for deployment via the *Augmentation Loop (AugLoop)* Office component between late March and late April 2026, and is expected to be automatically enabled for organizations already configured to block Copilot from processing labeled content; Microsoft says the update works by allowing the Office client/AugLoop to read sensitivity labels directly rather than relying on Microsoft Graph calls tied to SharePoint/OneDrive URLs. Separately, a Copilot “Memory” setting labeled **“Microsoft usage data”** has been reported as enabling Copilot to reference data from other Microsoft products (including **Bing, MSN, and Edge**) to personalize conversations, with an option for users to disable it if they have privacy concerns. A third, unrelated Microsoft 365 issue—an acknowledged bug in **classic Outlook** that can cause the mouse pointer to disappear—does not materially relate to Copilot data access or DLP controls and appears to be a usability defect rather than a security event.
1 months ago