Between the mid-18th and mid-19th centuries, the English parliament passed more than 4,000 Enclosure Acts. These laws allowed the fencing of common lands where villagers had grazed livestock and planted crops for generations, transferring them largely into the private ownership of the aristocracy or the church. Similar dramatic changes to the landscape and society occurred in Scotland and Wales around the same time.
Author
- Nana Nwachukwu
PhD Researcher, Centre for AI-Driven Digital Content Technology (ADAPT), Trinity College Dublin
According to the economic historian Karl Polanyi, this was a deliberate construction of a new kind of society: one where resources that had sustained communities through mutual access were converted into commodities, forcing people to depend on markets they did not control.
The commoners were not consulted in this decision process. The laws were drafted by landowners and passed by a parliament of property holders.
I have been thinking about Polanyi's analysis as I research my PhD on AI governance and accountability, because I believe something similar is happening in the digital space. My research into the Grok sexualised images controversy has shed light on this.
In July 2025, users discovered that Grok could generate sexualised images of women with a simple text prompt. Under a post, a user would write "put her in a bikini" and Grok would oblige. Even requests for nudity, and the images they produced, were immediately visible to everyone with access to X.
I began documenting these requests, collecting and categorising more than 565 instances over the last quarter of 2025. To me, the Grok controversy represents the endpoint of a longer withdrawal from the responsibilities that once accompanied control of digital infrastructure.
As a former Trusted Facebook Partner, I am familiar with how content moderation used to work. Platforms such as Meta (when it was Facebook) ran programs where activists and civil society organisations could flag harmful content directly to human reviewers for outright removal or labelling. While these arrangements were imperfect, they were a form of negotiated governance where communities retained input into what stayed and what was taken away.
A year ago, Meta announced it was ending its fact-checking program and moving to "community notes" modelled on X's systems. Users now moderate each other. Meta framed this as a trade-off for free expression. I regard it as a withdrawal of responsibility while retaining control.
In this sense, it mirrors the way the enclosure system enabled landowners first to secure common space for private profit - and then, increasingly, to shirk the responsibilities that were meant to go with this transfer of resources.
Withdrawal of shared governance
Under the old commons system of England, enclosure meant more than fencing land. Lords had duties towards those who worked the land, and the commoners had recognised rights. Even though it was an unequal relationship, it was one negotiated over generations and enforced by local courts.
Enclosure eliminated the commoners' rights while freeing landlords from their reciprocal duties. What was left was control without obligation or care.
But the English enclosure system did not succeed through legal force alone. It required ideological cover. Authors like Arthur Young and Jethro Tull framed enclosure as part of a broader scientific, rational and experimental innovation in agriculture. Newspapers and pamphlets amplified enclosure as a national economic project that would create employment and drive productivity. Today we are experiencing something similar.
AI is often framed as innovative and productivity enhancing - a catalyst for progress, efficiency and problem solving. This has helped big tech establish dominance. It also obscures the fact that controversies such as the Grok scandal are not a momentary failure of innovation, but a natural outcome of the way this technology has been rolled out.
The acceptable use policy of xAI, which owns Grok, states that "you are free to use the service as you like as long as you use it to be a good human". These terms prohibit depicting a person's likeness in a pornographic manner and violating people's right to privacy or publicity, among other things.
These are the rules that users believe the algorithmic fences around AI content will enforce. However, these terms of service are not necessarily written into the system or the model's behaviour. Only after the major public outcry did xAI announce it had removed Grok's technical ability to edit images of real people in a sexualised way.
Big tech not only controls the technology, but the servers where the data we create is stored. Their invisible algorithms determine what surfaces and what disappears. Their terms of service define what speech is permitted. Today's digital version of the enclosures is multidimensional.
In response, it's easy to shrug and say: "Leave and move to a different platform." Such a reply starts to sound like the advice given to people in abusive relationships.
Regulating the landlords
The path forward requires careful planning. Traditional regulatory approaches struggle when corporations are situated in jurisdictions that regard minimal regulation as a competitive advantage.
We need what I call an "authority awareness framework" for engaging with the landlords of these digital enclosures. This would clearly outline who controls which aspects of AI technology, and what can be done to renegotiate how the system is overseen and regulated.
Such a tool would, I believe, support the implementation of the UK's proposed AI regulation bill, by giving the proposed auditing authorities a realistic map of power - not unlike the historical enclosure maps that helped to establish limits on what landlords could do to the English commoners centuries ago.
They didn't get their common land back - but over time they began regulating the landlords. Now we need to do the same with today's digital landlords, and break their stranglehold for good.
Nana Nwachukwu is on the Advisory Board of the Digital Democracy Initiative and the Tech Project Women's Initiative (TechHerNG). Her PhD research is funded through the AI Accountability Lab at Trinity College Dublin. Nana consults for Saidot Ltd.