Mallory

Microsoft expands Microsoft 365 Copilot data controls and cross-product data access settings

ai-platform-security · privacy-surveillance-policy · standards-framework-update
Updated March 21, 2026 at 02:19 PM · 3 sources


Microsoft is tightening and clarifying how Copilot can access and process user and organizational data across the Microsoft ecosystem. The company is expanding Microsoft Purview Data Loss Prevention (DLP) enforcement so that policies blocking Copilot from processing restricted, sensitivity-labeled content apply not only to files in SharePoint and OneDrive, but also to locally stored Word, Excel, and PowerPoint documents. The change is planned for deployment via the Augmentation Loop (AugLoop) Office component between late March and late April 2026, and is expected to be enabled automatically for organizations already configured to block Copilot from processing labeled content. Microsoft says the update works by letting the Office client (AugLoop) read sensitivity labels directly, rather than relying on Microsoft Graph calls tied to SharePoint/OneDrive URLs.

Separately, a Copilot "Memory" setting labeled "Microsoft usage data" reportedly allows Copilot to reference data from other Microsoft products (including Bing, MSN, and Edge) to personalize conversations; users with privacy concerns can disable it. A third, unrelated Microsoft 365 issue, an acknowledged bug in classic Outlook that can make the mouse pointer disappear, has no material bearing on Copilot data access or DLP controls and appears to be a usability defect rather than a security event.

Timeline

  1. Feb 24, 2026

    Microsoft announces broader DLP enforcement for Copilot across all storage locations

    Microsoft announced it is expanding Microsoft Purview Data Loss Prevention controls so Microsoft 365 Copilot can be blocked from processing sensitivity-labeled Word, Excel, and PowerPoint files regardless of whether they are stored locally, on network drives, in SharePoint Online, or in OneDrive for Business. The change uses Office clients to pass sensitivity label information directly to Copilot's augmentation loop and is set to roll out from late March through late April 2026, automatically enabled for organizations with existing relevant DLP policies.

  2. Feb 24, 2026

    Microsoft discloses Copilot Chat bug affecting protected emails

    Microsoft recently disclosed a Microsoft 365 Copilot Chat bug that for nearly a month allowed Copilot to read and summarize confidential emails in Sent Items and Drafts despite DLP policies and confidentiality labels. Microsoft said the issue only exposed summaries to users already authorized to view the emails, but acknowledged the behavior was not intended.

  3. Feb 24, 2026

    Microsoft enables Copilot Memory to use data from other Microsoft services

    Microsoft introduced a Copilot "Memory" setting that can allow Copilot to reference Microsoft usage data from products such as Bing, MSN, and Edge to personalize conversations. The setting may be enabled automatically, prompting privacy concerns and leading Microsoft to document controls for reviewing, disabling, and deleting memory items.


Related Stories

Microsoft 365 Copilot Bug Bypassed Email Sensitivity Labels and DLP to Summarize Confidential Messages


Microsoft confirmed a **Microsoft 365 Copilot Chat** bug that caused the Copilot *work tab* to incorrectly read and summarize users’ emails from **Sent Items** and **Drafts**, even when those messages had **confidential sensitivity labels** and organizations had **data loss prevention (DLP)** policies intended to prevent such processing. The issue was tracked as admin-visible incident `CW1226324`, first detected in late January, and was attributed to an unspecified **code error** that allowed labeled items to be picked up by Copilot despite protections being in place. Microsoft said it began rolling out a fix in early February and was still monitoring deployment while validating remediation with a subset of affected users; the company did not disclose how many customers were impacted or provide a firm timeline for full resolution. Separate reporting noted heightened institutional concern about built-in AI features handling sensitive correspondence (e.g., the European Parliament IT department blocking AI features on work devices), underscoring governance and data-handling risk when AI assistants can process content contrary to configured controls.

2 weeks ago
Microsoft Copilot Security and Deployment Controversies


New reporting highlighted a **prompt-injection phishing risk** in Microsoft Copilot after Permiso researchers found that attacker-controlled text embedded in emails could manipulate Copilot-generated summaries through cross-prompt injection attacks. The issue could cause Copilot to present deceptive security alerts or malicious instructions inside a trusted Microsoft 365 interface, increasing the likelihood that users will believe and act on attacker content. Separate coverage also noted broader security and privacy concerns around Microsoft’s AI ecosystem, including criticism of Windows Recall for capturing and storing snapshots of user activity that Copilot can analyze, even after Microsoft added stronger protections following earlier backlash. Microsoft also faced continued scrutiny over how aggressively it is pushing Copilot into user environments. The company temporarily halted plans to **automatically install the Microsoft 365 Copilot app** on eligible Windows systems outside the EEA, though existing installations remain in place and administrators can still deploy it manually. Public criticism of Copilot’s quality and Microsoft’s AI strategy also spilled into the company’s Discord community, where moderation actions against users mocking “Microslop” drew further attention to dissatisfaction with rushed AI integration, privacy concerns, and the perception that Microsoft is forcing AI features into products despite unresolved trust and security issues.

1 month ago
Microsoft Copilot Security Research: Prompt-Injection Phishing Risk and Copilot Studio Audit-Logging Gaps


Security researchers reported two distinct Microsoft Copilot-related risks: (1) **cross prompt injection** against *Microsoft Copilot* email summarization surfaces that can cause attacker-supplied text in an email to be treated like instructions, shaping the summary into a convincing in-product “security alert” and creating a phishing path that does not rely on attachments or macros; and (2) **audit-logging gaps in Microsoft Copilot Studio** where certain administrative actions for Copilot Studio agents (e.g., around sharing, authentication, logging, and publication) were not consistently recorded in Microsoft 365’s Unified Audit Log, potentially reducing defenders’ ability to detect malicious or unauthorized agent changes. Permiso described how Copilot’s behavior varies across Outlook’s inline *Summarize* experience, the Outlook Copilot pane/add-in, and Teams-based summarization, with the core risk being **trust transfer**—users may treat Copilot output as system-generated even when it is attacker-influenced—and warned that retrieval across Microsoft 365 (Teams/OneDrive/SharePoint) could amplify impact if chained. Datadog Security Labs stated it reported Copilot Studio logging issues to **MSRC**, that Microsoft remediated logging for the affected events by **October 5, 2025**, and that Datadog later observed a **regression** where some events again failed to log consistently, which it also reported to Microsoft.
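The audit-logging gap above is, at its core, a reconciliation problem: defenders can only notice it by diffing the admin operations they know occurred against what the Unified Audit Log actually recorded. A minimal sketch of that check, with hypothetical operation names and record fields (not the actual Microsoft 365 audit schema):

```python
# Illustrative detection of audit-log coverage gaps: compare operations
# known to have been performed against operations that were logged.
def missing_audit_events(performed_ops: list[str], audit_log: list[dict]) -> set[str]:
    """Return admin operations that were performed but never logged."""
    logged = {record["Operation"] for record in audit_log}
    return set(performed_ops) - logged

# Hypothetical sample data: three admin actions occurred, only one was logged.
performed = ["AgentShared", "AgentPublished", "AuthSettingChanged"]
log = [{"Operation": "AgentPublished", "Workload": "CopilotStudio"}]

print(sorted(missing_audit_events(performed, log)))
# ['AgentShared', 'AuthSettingChanged']
```

In practice the "performed" side would come from an out-of-band source (change tickets, deliberate test actions), which is essentially how a regression like the one Datadog reported gets caught.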

1 month ago
