Microsoft Copilot
by Microsoft · Redmond, WA
Microsoft Copilot is an AI assistant integrated into Microsoft 365, Windows, and Bing.
Risk Score: 50/100 (Elevated) · 27+ incidents · Legal 100 · Safety 0 · Privacy 78 · Regulatory 60 · Security 0
Risk score as of Apr 27, 2026
Risk Score Breakdown
Legal Risk · Court cases & lawsuits · 100/100
Safety Risk · Incidents & harm events · 0/100
Privacy Risk · Breaches & GDPR actions · 78/100
Regulatory Risk · FTC, EU enforcement · 60/100
Security Risk · CVEs & vulnerabilities · 0/100
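The headline 50/100 follows directly from these category scores and the weights described in the FAQ below. A quick sketch in Python (the final rounding step is an assumption) reproduces it:

```python
# Category weights from the scoring FAQ below.
weights = {"legal": 0.25, "safety": 0.25, "privacy": 0.20,
           "regulatory": 0.15, "security": 0.15}
# Category scores from the breakdown above.
scores = {"legal": 100, "safety": 0, "privacy": 78,
          "regulatory": 60, "security": 0}

overall = sum(weights[c] * scores[c] for c in weights)
print(round(overall))  # 49.6 -> 50, matching the published 50/100
```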
Incident Timeline
27 total incidents · showing 5 most recent
Apr 2026
Researchers found 73 suspected malicious VS Code extensions on the Open VSX repository linked to the GlassWorm information-stealing campaign. Developers who installed them could face data theft.
Apr 2026
A trojanized version of the SumatraPDF reader is being used to target Chinese-speaking users and deploy malware that enables remote access via Microsoft VS Code tunnels. Affected users are those who install the compromised PDF reader.
Apr 2026
A threat actor (UNC6692) used Microsoft Teams to impersonate an IT help desk and trick targets into installing custom SNOW malware. Affected parties are the targeted organizations and users contacted through Teams.
Apr 2026
Anthropic delayed the public release of Project Glasswing, an AI model meant to find software vulnerabilities, and gave early access only to major tech companies including Microsoft. Limited public details are available on any specific impact.
Apr 2026
Semantic Engines LLC sued Microsoft Corporation in the U.S. District Court for the Eastern District of Texas (Case 2:26-cv-00339). Limited public details are available about the claims or who may be affected.
Frequently Asked Questions
What is Microsoft Copilot's AI risk score?
Microsoft Copilot has an AI Risk Score of 50/100 (Elevated Risk). This score is calculated from 27+ documented public incidents across legal, safety, privacy, regulatory, and security categories.
Is Microsoft Copilot safe to use?
Microsoft Copilot by Microsoft has an elevated risk profile based on public data. Organizations should review the full incident list and conduct their own due diligence. This score does not constitute legal advice.
Does Microsoft Copilot have lawsuits?
Yes. Our public records show one court case for Microsoft Copilot: Semantic Engines LLC v. Microsoft Corporation.
How is the AI Risk Score calculated?
Scores are weighted across 5 categories: Legal (25%), Safety (25%), Privacy (20%), Regulatory (15%), Security (15%). Each incident is scored by severity and type, then decayed based on age. Active lawsuits and fatal incidents do not decay.
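As a rough sketch only: the severity scale, decay curve, and per-category aggregation are not published, so the half-life, 0-100 severity range, and max-based aggregation below are assumptions. Only the overall structure (severity-scored incidents, age decay, a no-decay carve-out for active lawsuits and fatal incidents, and the five category weights) comes from the description above.

```python
from dataclasses import dataclass

@dataclass
class Incident:
    category: str     # "legal", "safety", "privacy", "regulatory", "security"
    severity: float   # 0-100 per-incident severity (assumed scale)
    age_years: float  # time since the incident occurred
    no_decay: bool    # True for active lawsuits and fatal incidents

HALF_LIFE_YEARS = 2.0  # hypothetical; the real decay curve is not published

def decayed_severity(inc: Incident) -> float:
    """Apply age decay, except for incidents that never decay."""
    if inc.no_decay:
        return inc.severity
    return inc.severity * 0.5 ** (inc.age_years / HALF_LIFE_YEARS)

def category_score(incidents: list[Incident], category: str) -> float:
    """Assumed aggregation: the worst decayed incident in the category, capped at 100."""
    vals = [decayed_severity(i) for i in incidents if i.category == category]
    return min(100.0, max(vals, default=0.0))

WEIGHTS = {"legal": 0.25, "safety": 0.25, "privacy": 0.20,
           "regulatory": 0.15, "security": 0.15}

def overall_score(incidents: list[Incident]) -> float:
    """Weighted blend of the five category scores."""
    return sum(w * category_score(incidents, c) for c, w in WEIGHTS.items())
```

Under these assumptions, a two-year-old privacy incident would count at half its original severity, while an active lawsuit keeps full weight indefinitely.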
Stay ahead of AI risk
Get alerts when Microsoft Copilot risk score changes
New lawsuits, breaches, and regulatory actions, delivered to your inbox.