You might not care very much about the prospect of the AI bubble bursting. Surely it's just something for the tech bros of Silicon Valley to worry about - or the wealthy investors who have spent billions of dollars funding development.
Author
- Akhil Bhardwaj
Associate Professor (Strategy and Organisation), School of Management, University of Bath
But as a sector, AI may have become too big to fail. And just as they did after the financial crisis of 2008, taxpayers could be picking up the tab if it collapses.
The financial crisis proved to be very expensive. In the UK, the public cost of bailing out the banks was officially put at £23 billion - roughly equivalent to £700 per taxpayer. In the US, taxpayers stumped up an estimated US$498 billion (£362 billion).
Today, the big AI firms are worth far more than the banks were, with a combined value exceeding £2 trillion. Many of these companies are interconnected (or entangled) with each other through a complex web of deals and investments worth hundreds of billions of dollars.
And despite a recent study which reports that 95% of generative AI pilots at companies are failing, the public sector is not shy about getting involved. The UK government, for example, has said it is going "all in" on AI.
It sees potential benefits in incorporating AI into education, defence and health. It wants to bring AI efficiency to court rooms and passport applications.
So AI is being widely adopted in public services, with a level of integration that makes it a critical feature of people's day-to-day lives.
And this is where it gets risky.
Because the reason for bailing out the banks was that the entire financial system would collapse otherwise. And whether or not you agree with the bailout policy, it is hard to argue that banking is not a crucial part of modern society.
Similarly, the more AI is integrated and entangled into every aspect of our lives, the more essential it becomes to everyone, like a banking system. And the companies which provide the AI capabilities become organisations that our lives depend upon.
Imagine, for example, that your healthcare, your child's education and your personal finances all rely on a fictional AI company called "Eh-Aye". That firm cannot be allowed to collapse, because too much depends on it - and taxpayers would probably find themselves on the hook if it got into financial difficulties.
Bubble trouble
For the time being though, the money flowing into AI shows little sign of slowing. Supporters insist that despite the failures, investment is critical. They argue that artificial general intelligence (AGI), the point at which AI acquires human-like cognitive capabilities, will vastly improve our lives.
Others are less optimistic. Commentators including computer scientists Gary Marcus and Richard Sutton have cast doubt on the power of AI to become truly intelligent.
In my own research, I highlight the limitations of large language models (LLMs) when it comes to reasoning. Similar conclusions have been drawn at other universities and even at tech company Apple.
So perhaps the endless expansion of the AI bubble comes down to how strongly the AI pioneers believe in its future. They've gone pretty far with it, so maybe it makes sense for them to go all in, with a pragmatic kind of faith that keeps the bubble growing.
The trouble is that one tech billionaire's act of faith could also be described as a gamble. And it's a gamble they want everyone to join, with taxpayers' money on the table.
So if the gamble fails and the bubble bursts, who would bear the costs? Would the UK government cut funding from the NHS or siphon money from a cash-strapped education sector? Would it bail out pension funds that had over-invested in AI?
One thing is certain. The future being offered by AI firms is not guaranteed. Yet governments and businesses are worried they will miss out if they don't get on board - and there are no safeguards in place to protect taxpayers from the fallout if things go wrong.
Akhil Bhardwaj does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.