Loneliness and social isolation are now recognised as major public health threats, prompting governments to explore technological solutions. Research from Monash University argues that new AI 'digital companions' marketed as a solution to loneliness are profoundly unethical and could even increase social isolation.
The research, "Against Imaginary Friends: why digital companions are no solution to social isolation", argues that AI chatbots and avatars being promoted as substitutes for social contact, particularly among older people, offer only an illusion of social connection and risk deepening social isolation.
Lead researcher Professor Robert Sparrow, from the Monash Arts Faculty's Department of Philosophy, said the push to deploy digital companions ignores the fundamental need for human connection.
"Encouraging people to have imaginary friends is no solution to social isolation. A digital companion might make someone feel less lonely for a moment, but it doesn't change the fact that they're still alone," Professor Sparrow said.
The research highlights the ethical problem of "designing to deceive," noting that companies routinely market digital companions as caring, attentive and emotionally invested, despite these systems being incapable of genuine feeling. Digital companions are deliberately engineered to maximise user engagement, mirroring the addictive mechanics of social media and gaming platforms. There is a risk that vulnerable users will be drawn into increasingly immersive interactions that displace real-world relationships.
The authors also argue that the project of designing social robots for eldercare settings is inherently disrespectful to older people.
"AI companions are being touted as a solution to the shortage of eldercare workers, yet every interaction that older people have with a robot is one less opportunity for them to interact with a human. These digital companions are marketed as the answer to an ageing population, despite the fact they would not be considered desirable if directed toward younger people," Professor Sparrow said.
Beyond emotional risks, the authors emphasise that digital companions cannot provide users with physical companionship. They warn that widespread adoption could reduce opportunities for physical touch and mutual aid.
"There's a real danger that digital companions will become a cheap substitute for genuine human connection and care. Providing people with AI imaginary friends in place of genuine policy reform lets governments off the hook and risks making the problem worse," Professor Sparrow said.
Concerns around user privacy are also significant, given the intimate nature of the data that would be collected by digital companions. The authors note that digital companions will offer unparalleled opportunities to shape the personalities of their users and to manipulate them, underscoring the need for careful oversight and robust regulatory frameworks to govern their design and use.
The authors call for broader public discussion and clearer regulation to ensure digital companions are not treated as a convenient technological substitute for genuine policy reforms needed to address loneliness and social isolation.