Organizations that use Microsoft 365 Copilot and Azure OpenAI Service can rely on Microsoft's commitment not to train AI models with their data. The following excerpts from Microsoft's data privacy and security documentation for both products contain written statements addressing this concern.
(Note: The following statements address only “Microsoft 365 Copilot” and “Azure OpenAI Service,” and apply only to services obtained from Microsoft. They have no bearing on services obtained directly from OpenAI, Inc.)
AI Security for Microsoft 365 Copilot
https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-ai-security
- Protecting data during model training
“Microsoft 365 Copilot uses pretrained LLM models hosted by Microsoft; it doesn’t use Customer Data to train these models. In addition, prompt and grounding data isn’t used to train AI models and is never shared with OpenAI or other third parties.”
https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-ai-security#protecting-data-during-model-training
- Does Microsoft 365 Copilot use my data to train AI models?
“Prompts, responses, and Customer Data accessed through Microsoft Graph aren’t used to train foundation LLMs, including those used by Microsoft 365 Copilot. Product improvements are driven through techniques such as customer-reported incidents and synthetic prompt generation.”
https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-ai-security#does-microsoft-365-copilot-use-my-data-to-train-ai-models
Data, Privacy, and Security for Microsoft 365 Copilot
https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy
- Important:
“Prompts, responses, and data accessed through Microsoft Graph aren’t used to train foundation LLMs, including those used by Microsoft 365 Copilot.”
https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy
- How does Microsoft 365 Copilot use your proprietary organizational data?
“We may use customer feedback, which is optional, to improve Microsoft 365 Copilot, just like we use customer feedback to improve other Microsoft 365 services and Microsoft 365 productivity apps. We don’t use this feedback to train the foundation LLMs used by Microsoft 365 Copilot. Customers can manage feedback through admin controls. For more information, see Manage Microsoft feedback for your organization and Providing feedback about Microsoft Copilot with Microsoft 365 apps.”
https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy#how-does-microsoft-365-copilot-use-your-proprietary-organizational-data
- Data stored about user interactions with Microsoft 365 Copilot
“When a user interacts with Microsoft 365 Copilot (using apps such as Word, PowerPoint, Excel, OneNote, Loop, or Whiteboard), we store data about these interactions. The stored data includes the user’s prompt and Copilot’s response, including citations to any information used to ground Copilot’s response. We refer to the user’s prompt and Copilot’s response to that prompt as the “content of interactions” and the record of those interactions is the user’s Copilot activity history. For example, this stored data provides users with Copilot activity history in Microsoft 365 Copilot Chat (previously named Business Chat) and meetings in Microsoft Teams. This data is processed and stored in alignment with contractual commitments with your organization’s other content in Microsoft 365. The data is encrypted while it’s stored and isn’t used to train foundation LLMs, including those used by Microsoft 365 Copilot.”
https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy
Data, privacy, and security for Azure OpenAI Service
- Important:
“Your prompts (inputs) and completions (outputs), your embeddings, and your training data:
- are NOT available to other customers.
- are NOT available to OpenAI.
- are NOT used to improve OpenAI models.
- are NOT used to train, retrain, or improve Azure OpenAI Service foundation models.
- are NOT used to improve any Microsoft or third party products or services without your permission or instruction.
- Your fine-tuned Azure OpenAI models are available exclusively for your use.
- The Azure OpenAI Service is operated by Microsoft as an Azure service; Microsoft hosts the OpenAI models in Microsoft’s Azure environment and the Service does NOT interact with any services operated by OpenAI (e.g. ChatGPT, or the OpenAI API).”
https://learn.microsoft.com/en-us/azure/ai-foundry/responsible-ai/openai/data-privacy
- Generating completions, images or embeddings through inferencing
“The models are stateless: no prompts or generations are stored in the model. Additionally, prompts and generations are not used to train, retrain, or improve the base models.”
https://learn.microsoft.com/en-us/azure/ai-foundry/responsible-ai/openai/data-privacy?tabs=azure-portal#generating-completions-images-or-embeddings-through-inferencing
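To make the inferencing path concrete, the sketch below constructs (but does not send) a chat completions request to an Azure OpenAI deployment using only the Python standard library. The endpoint, deployment name, API key, and api-version shown are placeholders and assumptions, not values from the excerpts above; per the quoted documentation, prompts and generations sent through this inference path are not stored in the model or used to train it.

```python
# Hedged sketch: building an Azure OpenAI chat completions request.
# AZURE_ENDPOINT, DEPLOYMENT, API_VERSION, and the api key are placeholders;
# substitute your resource's actual values before sending any request.
import json
import urllib.request

AZURE_ENDPOINT = "https://YOUR-RESOURCE.openai.azure.com"  # placeholder
DEPLOYMENT = "YOUR-DEPLOYMENT"                             # placeholder
API_VERSION = "2024-06-01"                                 # example value

def build_chat_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build (without sending) a POST request for the chat completions endpoint."""
    url = (
        f"{AZURE_ENDPOINT}/openai/deployments/{DEPLOYMENT}"
        f"/chat/completions?api-version={API_VERSION}"
    )
    body = json.dumps({"messages": [{"role": "user", "content": prompt}]})
    return urllib.request.Request(
        url,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )

req = build_chat_request("Summarize our Q3 report.", api_key="<key>")
```

Sending `req` (for example with `urllib.request.urlopen`) performs a stateless inference call only; no fine-tuning or training job is involved.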
- Data storage for Azure OpenAI Service features
“Fine-tuned models are exclusively available to the customer whose data was used to create the fine-tuned model, are encrypted at rest (when not deployed for inferencing), and can be deleted by the customer at any time. Training data uploaded for fine-tuning is not used to train, retrain, or improve any Microsoft or third party base models.”
https://learn.microsoft.com/en-us/azure/ai-foundry/responsible-ai/openai/data-privacy?tabs=azure-portal#data-storage-for-azure-openai-service-features
Additionally, Microsoft has a long-standing, documented commitment that secures customers' private data for their use alone: Microsoft does not use your tenant's data or prompts to train public AI models, does not share that data with third parties without your permission or instruction, and never exposes it publicly. These protections are backed by Microsoft's contractual Data Protection Addendum (DPA) and audited compliance programs.
- Verification: Independent audit reports and certifications (SOC, ISO) are available at https://aka.ms/SCPortal.
- Tenant Isolation: Microsoft 365 Copilot runs in a private, tenant-scoped environment and respects existing permissions.
- No Public Model Training: Customer data is used only to deliver services, not for advertising or public AI training.
- Compliance & Governance: Microsoft Purview DLP and related tools help prevent unauthorized sharing.
