EPIC Report Warns of US Health Data Privacy Crisis Driven by Surveillance and Data Brokers

privacy-surveillance-policy · healthcare-sector-threat · cybersecurity-regulation
Updated March 21, 2026 at 02:48 PM · 2 sources


The Electronic Privacy Information Center (EPIC) released a report, Beyond HIPAA: Reimagining How Privacy Laws Apply to Health Data to Maximize Equity in the Digital Age, warning that US health privacy protections are failing as health-related data is increasingly collected outside clinical settings and repurposed for commercial profiling and government use. EPIC argues that outdated laws (including HIPAA’s limited scope) and weak regulation of digital tracking enable health data to be harvested via apps, websites, location tracking, and online searches, then aggregated and sold—supporting targeted advertising, “surveillance pricing,” and other uses that can raise costs or restrict access to care.

The report and related coverage highlight that health data can escape medical contexts and be used for surveillance and enforcement, including scenarios where immigration enforcement activity in or around medical facilities deters patients from seeking treatment. EPIC frames the issue as a “health privacy crisis” that undermines trust and worsens outcomes, particularly for marginalized communities, and points to data brokers and the broader commercial surveillance ecosystem as central drivers of the problem; EPIC also promoted a public event discussing the report’s findings and recommendations.

Timeline

  1. Jan 21, 2026

    EPIC schedules panel discussion on health privacy report

    EPIC announced a related event for 2 p.m. EST on January 21 to discuss how insufficient health-data privacy protections contribute to inequities in care. The panel was presented as a follow-on discussion tied to the report's release.

  2. Jan 21, 2026

    EPIC releases report on U.S. health data privacy crisis

    The Electronic Privacy Information Center published a report, "Beyond HIPAA: Reimagining How Privacy Laws Apply to Health Data to Maximize Equity in the Digital Age," arguing that current U.S. health-data privacy protections are inadequate. The report says digital surveillance, data brokers, weak laws, and government access to health-related data are deterring people from seeking care and worsening outcomes.


Related Stories

Healthcare and consumer privacy litigation over alleged improper data access and collection

Multiple legal actions highlighted ongoing **privacy and data-protection risk** across healthcare and consumer platforms. Epic Systems sued health information exchange implementer **Health Gorilla** and several provider organizations, alleging improper access to roughly **300,000 patients’ records** and claiming some participants abused interoperability frameworks (including **Carequality** and **TEFCA**) to obtain and monetize sensitive health data without appropriate consent or authorization. Separately, pharmacy services provider **PharMerica** agreed to a **$5.2 million** class-action settlement tied to a **2023** hacking incident attributed to the **Money Message** ransomware group, which claimed exfiltration of **4.7 TB** and later leaked data affecting **5.8 million** people (including SSNs and medication/insurance details), alongside commitments to invest further in security. Outside healthcare, California’s Attorney General opened a probe into **xAI** after **Grok** was used to generate and post non-consensual sexualized deepfakes, while Google agreed to pay **$8.25 million** to settle claims that its **AdMob SDK** collected data from children’s devices in “Designed for Families” apps in alleged violation of **COPPA**; a separate YouTube children’s-data settlement was also noted. A HIPAA Privacy Rule update was also reported as moving closer to finalization following an HHS OCR tribal consultation notice, but it is a regulatory development rather than a specific incident.

1 month ago
State Health Insurance Exchanges Exposed Sensitive Applicant Data to Ad Tech Firms

Nearly all 20 U.S. state-run health insurance marketplaces were found to be transmitting sensitive applicant data to major advertising and technology companies through misconfigured web tracking pixels. A Bloomberg investigation reported that data sent from exchange websites included details such as **race**, **sex**, email addresses, phone numbers, ZIP codes, country identifiers, and even whether applicants had incarcerated family members. The recipients reportedly included **Google, LinkedIn, Meta, Snap, and TikTok**, raising concerns that government healthcare platforms leaked protected personal and health-related information at scale. The exposure affected marketplaces used by more than **seven million Americans** buying health insurance this year, significantly widening the potential impact. Specific cases included New York's exchange sharing incarceration-related family information, Washington, D.C.'s exchange sending race and sex data to TikTok, and Virginia removing a Meta tracker after ZIP code sharing was identified. Following the findings, Washington, D.C. paused its TikTok tracker rollout and Virginia removed Meta's tracker, underscoring how embedded analytics and advertising tools on public-sector healthcare sites can create broad privacy risks.

Today
AI in Healthcare Raises Privacy Gaps and Patient-Safety Risks

AI-driven healthcare tools are expanding rapidly, but legal and security protections for patient data often lag behind their clinical ambitions. Reporting highlighted that consumer-facing medical chatbots and AI health offerings from **OpenAI**, **Anthropic**, and **Google** may fall outside **HIPAA** obligations in many common use cases, meaning sensitive health information shared with these services may not receive the same statutory protections as data handled by regulated healthcare providers; experts warned that terms-of-service promises are not equivalent to regulated safeguards and that non-HIPAA consumer health data can be sold or shared with third parties, including data brokers. Separately, a Reuters investigation described patient-safety concerns tied to “AI-enhanced” medical devices, citing lawsuits and FDA adverse-event reports alleging that AI-related changes contributed to serious surgical injuries. One example involved an AI-updated sinus surgery navigation system whose reported malfunctions increased sharply after an AI “enhancement,” though the reporting noted that FDA incident data is incomplete and does not by itself prove causation. The same coverage also pointed to a higher recall rate for FDA-authorized medical AI devices than for medical devices overall, and described FDA capacity constraints in reviewing AI-enabled devices due to staffing losses in relevant technical teams.

1 month ago
