It is larger than you think — and it is now your responsibility.
Your employees are using AI. Every day, for work-related tasks. With your company data, your client information, your strategic documents. The likelihood that this is fully sanctioned and documented: approximately 29%.
The remaining 71% — the Shadow AI — is invisible to your IT department, undocumented in your governance, and not assessed against the EU AI Act. It is not the exception in Dutch organisations. It is the norm.
Shadow AI existed in 2023. But in 2023 it was primarily: employees using ChatGPT to write texts or produce summaries. Undesirable from a data perspective, but manageable in impact.
In 2026, Shadow AI is fundamentally different in nature. The tools are more powerful — they can not only generate text, but also write code, analyse data, support decisions and execute autonomous actions. Integration is deeper — AI is embedded in the tools employees use daily: email clients, spreadsheet applications, project management platforms. And data exposure is greater — employees unknowingly upload sensitive information to external AI systems because the boundary between their work tool and an AI backend is not visible.
Prohibition does not work. Setting a framework does. That is a board decision, not an IT policy. Organisations that attempt to eliminate Shadow AI through restrictive policy drive usage further underground — and also lose sight of the innovation that employees themselves are generating.
Measure before judging
First, inventory what is actually being used. Not through a prohibition, but through an anonymous self-assessment among employees. You cannot steer what you cannot see.
Offer alternatives, not prohibitions
The most effective way to reduce Shadow AI is to offer sanctioned alternatives that are equally usable. Employees use unofficial AI because the official tools lag behind. Solve that.
Set data policy, not tool policy
Rather than specifying which tools may and may not be used (a list that is impossible to maintain), establish which categories of data may not enter external AI systems. That is enforceable and proportionate.
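A data-category policy of this kind can be made concrete as an automated gate on text leaving the organisation for an external AI tool. The sketch below is purely illustrative, assuming hypothetical category names and deliberately simplified regular-expression patterns (an internal client-ID format, IBANs, email addresses); a real deployment would use the organisation's own classification scheme and a proper DLP tool rather than hand-rolled patterns.

```python
import re

# Hypothetical data categories a board-level policy might bar from
# external AI systems. The patterns are illustrative, not exhaustive.
BLOCKED_PATTERNS = {
    "client_identifier": re.compile(r"\b[A-Z]{2}\d{6}\b"),        # assumed internal client-ID format
    "iban": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{10,30}\b"),      # bank account numbers
    "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # personal data
}

def blocked_categories(text: str) -> list[str]:
    """Return the policy categories detected in text bound for an external AI tool."""
    return [name for name, pattern in BLOCKED_PATTERNS.items() if pattern.search(text)]

# Example: a prompt an employee might paste into an external chatbot.
prompt = "Summarise the contract for client NL482910, contact jan@example.com."
print(blocked_categories(prompt))  # ['client_identifier', 'email_address']
```

The point of the design is that the check names data categories, not tools: the same gate applies whichever AI service the text is headed for, which is what makes the policy maintainable.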
Record this as a board decision
Policy around Shadow AI is a boardroom responsibility, not an IT policy. The board sets the framework; the organisation executes. That distinction is essential for accountability.
Not only insight, but a plan your board can execute.