Every now and again, I receive requests for information about the security, privacy & governance of Microsoft’s Copilot technologies, including how end-user conversations are protected, how customer files, emails & chats are NEVER communicated outside of the customer’s Microsoft 365 cloud instance, how Copilot respects Enterprise data classification protections & access controls, & how customer information is NOT used for AI model training.
Here are documented references to help you validate Microsoft’s enterprise-class focus on protecting our customers’ information when using Copilot:
- How Microsoft Protects the Data of its Customers using Microsoft AI Technologies, including Copilot
- Data Privacy for ALL Copilot use by customers
Customer data from any use of M365 Copilot (licensed) or Copilot Chat (no additional cost) is never shared with other parties & is stored securely & exclusively within the customer’s private Microsoft 365 cloud instance. Prompts, responses, and data accessed through Microsoft Graph aren’t used to train foundation LLMs, including those used by Microsoft 365 Copilot.
For more information: Data, Privacy, and Security for Microsoft 365 Copilot
- Privacy for Microsoft 365 Copilot (licensed)
Microsoft 365 Copilot has the unique & powerful capability to reason over & provide information from content in Microsoft 365. This information is never shared outside of the customer’s Microsoft 365 cloud instance.
- Privacy for Copilot Chat (no additional cost)
Security, privacy, and compliance controls and commitments available for Microsoft 365 Copilot also extend to Microsoft 365 Copilot Chat. Prompts and responses aren’t used to train the underlying foundation models.
For more information: Microsoft 365 Copilot Chat Privacy and Protections
- Classification & Auditing of Microsoft 365 Copilot
Copilot respects Purview classification to protect against inappropriate data leakage, & Copilot use can be audited.
For more information: How data is protected and audited in Microsoft 365 and Microsoft 365 Copilot
- Security for Microsoft 365 Copilot
This outlines Microsoft’s approach to securing Microsoft 365 Copilot and provides guidance you can use to strengthen your AI security posture.
For more information: Security for Microsoft 365 Copilot
- Responsible Use of ALL Copilot solutions
Microsoft Responsible AI FAQs are intended to help you understand how AI technology works, the choices system owners and users can make that influence system performance and behavior, and the importance of thinking about the whole system, including the technology, the people, and the environment.
For more information: FAQ about using AI responsibly in Microsoft 365 Copilot
- How Copilot meets Regulatory Compliance requirements
Microsoft continues to adapt and respond to fulfill AI regulatory requirements as they evolve, so we earn and keep the trust of customers, partners, and regulators. Microsoft 365 Copilot provides broad compliance offerings and certifications, including GDPR, ISO 27001, HIPAA, and the ISO 42001 standard for AI management systems. These help support our customers on their compliance journeys, complemented by features such as contractual readiness, built-in information and communication technology risk management, and operational resilience tooling.
- Copilot & meeting regulatory requirements: https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy#meeting-regulatory-compliance-requirements
- A note on FBI CJIS regulatory compliance: INFO: Microsoft 365 Copilot & Criminal Justice Information Systems regulatory compliance (CJIS) | Kurt Shintaku’s Blog
