For centuries, work with donated bodies has shaped anatomical knowledge and medical training.
Authors
- Jon Cornwall
Senior Lecturer and Education Adviser, University of Otago
- Sabine Hildebrandt
Associate Professor of Pediatrics, Harvard University
Now, digital technologies and artificial intelligence (AI) are reshaping education, and we can imagine a future where AI-generated representations of dead people - chatbots specifically developed as "thanabots" - are used to support students' learning.
The term thanabot is derived from thanatology, the study of death. Such AI replicas are already used to assist people during bereavement and could be integrated into medical education.
Thanabots based on information and data from a body donor could interact with students during dissections, providing personalised guidance drawn from medical records, linking clinical history to anatomical findings and improving factual learning.
They might even support the learner's humanistic development through an intensive first encounter with a dead body that comes "alive" through AI.
At this point, thanabots remain hypothetical in educational settings, but the technology exists to make them a reality. At first glance, this looks like an educational breakthrough - a "first patient" brought to virtual life to enhance both anatomical factual learning and the acquisition of skills such as empathy and professionalism in students.
But as we show in our new research, there are many unknown risks associated with the development of such applications that might raise the question of what it actually means to be dead, or even "not quite dead".
The evolution of thanabots
Thanabots, also called deadbots or griefbots, already exist. They are, at present, mostly used as tools to comfort the bereaved, though thanabots of famous people are also available.
Technologies such as Project December, which simulates text-based conversations with the dead, and Deep Nostalgia, which animates old photos, show how digital afterlives are increasingly represented and even normalised.
Extending these tools to anatomy education seems a logical step. An educational version of a thanabot could answer student questions, guide dissection and provide contextual clinical narratives. These interactions would likely improve clinical reasoning and potentially help students navigate emotionally challenging encounters with the dead.
Yet significant risks accompany such innovation. AI-generated content is prone to error, and misinterpreted medical records or hallucinated details could mislead students. Emotional engagement with a digitally "resurrected" donor could also overwhelm learners or engender unhealthy parasocial attachments.
The illusion of a human presence risks trivialising the body donor's physical reality and could compromise learners' authentic encounter with mortality and respect for the deceased.
Cultural norms and individual grief may also be disrupted, especially for students who are already sensitive to exposure to the dead or who come from backgrounds with strong constraints around postmortem representation.
This includes instances where death and the dead are considered sacred and further engagement with their likeness is considered taboo. In many cultures, the dead should be respectfully left to rest, not "brought back to life".
Risks of using thanabots in anatomy education
The ethical and legal frameworks covering thanabot use are underdeveloped because specific legislation and guidelines are scant or non-existent. This leaves many ethical and legal questions unanswered.
If a thanabot were created for use in anatomy education, who would own the digital donor? How would consent for AI use be obtained from families or estates? How would medical records be ethically managed, or privacy and dignity safeguarded?
Any implementation of thanabots would need to address these questions to ensure that potential educational gains don't come at the cost of psychological well-being, ethical integrity or societal unease.
Beyond these practical concerns lies a deeper philosophical issue. What does it mean to be dead in an age of AI "resurrection"?
Anatomy education has long been shaped by societal understandings of mortality and the human body. Thanabots might alter these boundaries, blurring the line between life and death and presenting something "different" that is neither one nor the other.
Thus, even with the best intentions, students could experience emotional dissonance, confusion about mortality or a distorted understanding of what it means to be human if that understanding is tied to an AI proxy rather than a real person.
We are not suggesting that AI cannot play a role in anatomy education. Carefully designed tools that respect donor dignity, support reflection and augment (not replace) human interaction can enrich learning.
But the allure of technological novelty should not override caution.
Before bringing digital "ghosts" into anatomy laboratories, educators must ensure ethical governance and critically examine what these tools truly teach students about life, death and human dignity.
The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointments.