Black box AI is being adopted faster than the controls to govern it, says AccountsIQ

20 April 2026

The growing adoption of AI across financial reporting and month-end close is creating a new category of risk for mid-market organisations, one that industry specialists are calling ‘black box AI’.

Gavin McGahey, CTO and Co-founder of AccountsIQ, is urging finance leaders to question how AI is being embedded in their workflows before the pace of adoption outstrips the control finance teams have over their operations.

Black box AI refers to any deployment where outputs cannot be traced or governed. In a finance context this typically shows up as unverifiable recommendations or automation that bypasses normal approval workflows, making it difficult to answer essential audit questions.

"The noise around AI in finance is loud right now, but finance teams cannot afford to adopt technology simply because it is what everyone else is doing in the market," said McGahey. "Every finance leader should be asking their technology provider a straightforward question: if something changes in our reporting, can we see exactly what happened, who approved it and why? If the answer is unclear, that is a problem."

The distinction McGahey draws is between AI that assists in a controlled environment and AI that acts autonomously. Assistive AI flags issues and routes decisions through existing approval structures. Autonomous AI silently decides, leaving teams unable to satisfy even basic audit requirements.

"Trustworthy AI in financial reporting flags and recommends rather than silently deciding," said McGahey. "When it surfaces an exception or makes a coding suggestion, the finance professional should be able to see the source transaction and the rule behind the flag. Every action AI touches should leave a clear record of approvals. Anything less and you are not really in control of your numbers."

In practice, McGahey points to month-end close as the area where well-governed AI delivers the clearest return. By surfacing exceptions earlier in the process, before stakeholder packs go out, teams can close faster with greater confidence in what they are signing off.

"The practical win at month-end is not AI-generated narratives or automated reports," said McGahey. "It is catching the issues earlier, so there are fewer surprises and less rework at the point when pressure is highest. That is where finance teams actually feel the difference."

For organisations operating across multiple entities or currencies, AI can also flag unusual movements before they become problems at group level, but only when the underlying model is properly governed.

The concern extends to how dashboards and analytics tools are being marketed. In McGahey's view, a tool only earns the label 'finance-grade' when a professional can drill from a headline figure down to the underlying transaction and from there to audit evidence.

McGahey concluded: "There is a version of AI in finance that genuinely makes teams more effective and a version that just makes things look faster while quietly undermining the integrity of the numbers. Finance teams have always operated on the principle that if you cannot trace it, you cannot trust it. That has not changed. What has changed is that the technology can now either reinforce that discipline or erode it, and which one happens comes down to how it is built and what your provider is willing to be held accountable for."