Blocking Bring Your Own Copilot (BYOC) on work documents

Introduction
We’ve had BYOD for years. Then came shadow IT, then shadow AI, and really they all come from the same problem: people using personal tools for work while the business either does not see it, does not understand the risk properly, or finds out too late.
That is exactly why this post is about bring your own Copilot, or BYOC.
I care a lot about data compliance, security, and governance, probably more than the average person, so I do get frustrated when something rolls out that is not really secure by default, especially when it involves corporate data. And this is one of those things.
BYOC started rolling out in March 2025, and I still come across organisations that have no idea Microsoft allows users to bring certain personal Microsoft 365 Copilot plans into work apps and use them on work documents unless that behaviour is explicitly blocked.
That is the key point. This is not something you need to turn on first. It is something you need to actively block if you do not want employees using personal Copilot entitlements on corporate content.
This post is here to explain what BYOC actually is, why it matters, what Microsoft allows by default, what the risks look like, and how to block it if it does not fit your organisation’s governance model.
I’m all for safe AI adoption and making informed choices. At the end of the day, the choice is yours. But you should know what is allowed by default before that choice is made for you.
TL;DR
Microsoft added multiple account access to Copilot so users can use certain consumer Copilot seats (such as Microsoft 365 Personal, Family, Copilot Pro, or Microsoft 365 Premium) on the current open work document in supported Microsoft 365 apps. This began rolling out in March 2025 and, for supported commercial tenants, it is effectively available unless you block it through policy.
If you do not want employees using personal Copilot plans on corporate content, block it here:
=> Intune admin centre -> Apps -> Policies for Microsoft 365 apps -> Create or edit a policy -> set "Multiple account access to Copilot for work documents" to Disabled -> Review + Create
When this policy is active with the above setting disabled, users cannot use Copilot on work documents with a Copilot licence from outside the organisation.
What BYOC actually means
BYOC means a user is signed into a Microsoft 365 app with their work account and also with a personal Microsoft account that has Copilot access.
If multiple account access is allowed, that user can use Microsoft 365 Copilot from their personal subscription on the current open work document in supported apps such as Word, Excel, PowerPoint, Outlook, and OneNote. Microsoft documents this as multiple account access to Copilot for work and school documents.

This does not mean the personal account gets direct access to company files. Microsoft says file access is still tied to the work account; the personal account only provides the Copilot entitlement. Existing SharePoint, OneDrive, and Teams permissions still apply, as do sensitivity labels, access controls, and admin policies. The personal account does not inherit special rights inside the organisation.
That entitlement is also limited in scope.
If the user only has Microsoft 365 Copilot through a personal or other external plan, they can ask Copilot questions about the current open document and make Copilot-assisted edits, but they do not get access to the organisation’s Microsoft Graph through that entitlement. Nor can they use that personal entitlement to query other files across the tenant the way a properly licensed Microsoft 365 Copilot user can.
So in simple terms:
the personal plan provides the Copilot entitlement
the work account provides the file access
Copilot works on the current open work document
broader organisational access still requires a proper Microsoft 365 Copilot licence from the employer
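To make that split concrete, here is a small illustrative sketch of the rules above. This is a toy model, not a real Microsoft API; every name in it (`Account`, `copilot_on_open_document`, and so on) is invented for illustration.

```python
from dataclasses import dataclass

# Toy model of the BYOC access rules described above.
# All names here are invented for illustration; this is not a Microsoft API.

@dataclass
class Account:
    kind: str                    # "work" or "personal"
    has_copilot: bool            # holds a Copilot entitlement
    can_open_file: bool = False  # work-account permission to the document

def copilot_on_open_document(work: Account, personal: Account,
                             multi_account_access_allowed: bool) -> bool:
    """Can Copilot be used on the currently open work document?"""
    # File access always comes from the work identity.
    if not work.can_open_file:
        return False
    # The entitlement can come from the work account...
    if work.has_copilot:
        return True
    # ...or from a personal account, but only if the tenant allows BYOC.
    return personal.has_copilot and multi_account_access_allowed

def tenant_graph_access(work: Account) -> bool:
    """Broader tenant-wide grounding needs an org-assigned licence."""
    # A personal entitlement never grants organisational Graph access.
    return work.has_copilot

work = Account(kind="work", has_copilot=False, can_open_file=True)
personal = Account(kind="personal", has_copilot=True)

print(copilot_on_open_document(work, personal, multi_account_access_allowed=True))   # True
print(copilot_on_open_document(work, personal, multi_account_access_allowed=False))  # False
print(tenant_graph_access(work))  # False: no org-assigned licence
```

The point of the sketch: flipping the tenant policy changes only the middle case, where the entitlement is personal. It never grants the personal account file access, and it never grants tenant-wide Graph access.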
Why blocking bring your own Copilot is important
The biggest issue is not just the technical design. It is the governance problem it creates.
Allowing personal Copilot plans on work documents blurs the line between personal tooling and corporate tooling. That makes it harder to maintain a clear position on what staff are allowed to use for business data. It also normalises the wider shadow AI problem.
An organisation might say it does not want employees using unapproved AI tools, but then still allow personal Copilot entitlements to be used on work content. That is a mixed message, and staff may read it as a sign that using personal AI tools for work is broadly acceptable, even when it is not.

It also creates inconsistency. One employee may have Microsoft 365 Personal, another may have Family, another may have Premium, and someone else may have nothing at all. Access to AI features on work files then depends on what a person happens to buy as a consumer, which is not a strong or consistent model for corporate data handling. Microsoft’s consumer offering has shifted over time too, including Copilot being added to Personal and Family plans in January 2025 and Microsoft 365 Premium being introduced later in 2025.
For organisations trying to build a controlled and supportable AI strategy, that is a problem.
What Microsoft says about the security model
To be fair, Microsoft does not describe this as uncontrolled access.
Microsoft says that data protection is based on the identity used to access the file. That means enterprise data protection remains tied to the work identity, regardless of which account grants Copilot access. Microsoft also says that settings such as web search follow the policy attached to the identity used to access the file.
Microsoft also reports:
the external account does not get separate rights to the file
Copilot only works with content the work identity can already access
prompts, responses, and Microsoft Graph-grounded data are not used to train foundation models for Microsoft 365 Copilot
actions on work content remain auditable
advanced organisational capabilities still require an actual Microsoft 365 Copilot licence assigned by the organisation

Reference: https://learn.microsoft.com/en-us/microsoft-365/copilot/multiple-account-access#data-protection
In my opinion, those controls matter, but they do not remove the governance risk.
Why many organisations should still block it
Even with those protections, a lot of organisations will still have good reasons to block BYOC on work documents.
It weakens the boundary between personal and corporate use
For work data, most organisations want work-approved services, work-owned licensing, and work-defined controls.
BYOC cuts across that.
It makes policy harder to explain
It is much easier to say:
For work content, use work-approved AI only.
That is clearer than trying to explain when personal Copilot is allowed, which apps it works in, what it can do, and where the limits are.
It can slow down proper AI governance
If an organisation wants to roll out AI safely, it should do that through approved licensing, proper training, and clear governance. Allowing personal entitlements into the mix can make that harder.
A safer route is still a route
Even if Microsoft’s implementation is more controlled than random consumer AI tools, it is still another route for personal technology to interact with work content.
For many security and compliance teams, that alone is enough reason to block it.
How to block BYOC on work documents
If you do not want employees using personal Copilot plans on corporate content, you can disable it through a policy in Intune.
Block BYOC in Intune
Open the Intune admin centre.
Go to Apps.
Open Policies for Microsoft 365 apps.
Open an existing policy or create a new one. Note: this policy area covers many other settings for Microsoft 365 apps, not just the BYOC setting. Because only one global policy can be assigned to all users, keep every Microsoft 365 app setting you want to manage for those users together in that single policy. One policy can therefore end up containing several different configurations for Microsoft 365 apps.

Give the policy a clear name and description.
Choose the right scope (you can target all users or specific groups).

Search for:
Multiple account access to Copilot for work documents

Open the setting.
Set it to Disabled.
Click Apply.

Click Next.
Review the policy.
Click Create.
That is all you need to do.
Microsoft reports that if this setting is enabled or not configured, users can use Copilot on work documents with a Copilot licence from outside the organisation. If it is disabled, they cannot. Microsoft also notes that the Copilot button may still appear in the app ribbon for signed-in users, but they will not be able to use Copilot capabilities on the work file.
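The three states of the setting behave like a simple allow/deny check. As a sketch of the documented behaviour (the function and state strings are my own shorthand, not anything Microsoft ships):

```python
def byoc_allowed(setting_state):
    """Toy model of 'Multiple account access to Copilot for work documents'.

    Per the documented behaviour:
    - "Enabled" or "Not configured" -> external Copilot licences work on
      work documents (BYOC allowed)
    - "Disabled" -> they do not, even though the Copilot button may still
      appear in the ribbon
    The state strings are informal shorthand for this sketch.
    """
    return setting_state in ("Enabled", "Not configured", None)

for state in ("Enabled", "Not configured", "Disabled"):
    print(state, "->", "BYOC allowed" if byoc_allowed(state) else "BYOC blocked")
```

In other words, doing nothing is equivalent to allowing it; only an explicit Disabled blocks it.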
Don’t stop at one setting
Blocking BYOC is useful, but it should not be the only thing you do.
You should also make sure that:
staff know which AI tools are approved for work
your AI policy is clear and readable
your SharePoint, OneDrive, and Teams permissions are in good shape
your sensitivity labels and DLP controls are working as expected
your broader Copilot and web access settings reflect your risk appetite
Copilot follows the permissions and controls that already exist, so weak data governance will still be a problem whether BYOC is allowed or not.
Conclusion
BYOC on work documents is something more organisations need to be aware of.
It sits right in the middle of BYOD, shadow IT, and shadow AI, and it creates a real governance and security question for any organisation handling corporate data in Microsoft 365. The main issue is not whether Microsoft has put guardrails around it. The main issue is whether you are comfortable allowing personal Copilot plans anywhere near work content.
At the end of the day, the choice is yours. But it should be a deliberate choice, not one made by a default setting you did not realise was there.
If you do not want personal Copilot entitlements used on corporate data, block it.
