Does your executive board have the answers?
In 2024, AI was an agenda item on which the supervisory board received a presentation. In 2025, board members began asking questions about it. In 2026, it has become a structural part of every quarterly review — with questions sharper than most executive boards anticipated.
A survey of 180 supervisory board members in the Netherlands and Belgium (conducted by Governance Research Group, published February 2026) shows that 78% of respondents classify AI as a "significant board-level risk" — an increase of 41% compared to 2024. At the same time, 63% say they are not satisfied with the quality of AI information they receive from the executive board.
That is a gap executive boards must close — and quickly. Five questions now come up again and again.
"Which AI risks are material to our organisation, and how are they managed?" Not: "Are we using AI responsibly?" But: a specific answer, broken down by risk type, with a description of the control measures in place for each. Board members who ask this expect an answer they can report to regulators.
"What is our compliance position under the EU AI Act?" Which systems have been inventoried, which are high-risk, and what is the state of conformity? This is no longer a legal nuance — it is a governance requirement for which supervisory board members are held accountable.
"What return is our AI investment generating?" Not: "how many pilots do we have?" But: what concrete value has AI contributed over the past quarters? Supervisory board members who monitor capital effectiveness treat AI like any other investment.
"Who in the executive board is accountable for AI?" Is there an identifiable owner? With what mandate? With what reporting line to the supervisory board? Board members who ask this want assurance that AI governance is not a collective responsibility — because collective responsibility, in practice, means no one's responsibility.
"How is the executive board preparing the organisation for the labour market impact of AI?" This is the question that surprises most executive boards — because they do not expect it from the supervisory board. But board members who take ESG reporting seriously treat this as part of the social dimension of executive leadership.
Board-level AI literacy is no longer a nice-to-have for supervisory board members. It is a minimum qualification. And executive boards that cannot answer the above questions fluently in the quarterly review will face mounting pressure on their position — not tomorrow, but this year.
The most effective way to improve the supervisory board relationship around AI is through proactivity. Do not wait for the questions to come — bring the information. A two-page quarterly AI memo, addressed to the supervisory board, with a fixed structure (portfolio status, risk overview, compliance update, financial return) communicates three things simultaneously: the executive board has it under control, the supervisory board has the information it needs, and there is a documented governance trail.
That is precisely what regulators and investors look for when they assess the quality of AI governance in your organisation.
Not just insight — but a plan your board can execute.