As artificial intelligence evolves, deepfakes and misleading digitally created content have become increasingly difficult for people to distinguish from reality, impacting national security, elections and trust in media.
That is why the University of the Sunshine Coast is leading a national research project to identify those most susceptible to believing and sharing political deepfake videos.
Project lead, UniSC Associate Professor Renee Barnes, has received a $535,248 National Intelligence and Security Discovery Research Grant, funded by the Australian Government, for the three-year project aimed at protecting people most vulnerable to deception.
"We know some people can't tell the difference, and that can influence their political views and decisions," Dr Barnes said.

"Artificial Intelligence (AI) has enabled the mass creation of these videos that are at the leading edge of video-based disinformation. As we've seen in the US elections, they have the potential to impact Australian electoral outcomes and political processes if left unchecked," she said.
"To have any chance of disrupting mass spreading of political deepfake videos and protecting those most vulnerable to deception, we need to better understand how people behave and react to them," Dr Barnes said.
The research will explore individual motivations and behaviours - down to heart rates and eye movements - to identify which sectors of the community are more likely to believe and share deepfake content.
"Sometimes people know they're fake but share them anyway, and that's something we need to better understand. We also want to know what motivates those who demonstrate pro-social behaviour, such as fact-checking and reporting these videos," Dr Barnes said.

"By building a profile of the demographic, cognitive, emotional and relational factors that make certain groups more vulnerable to deepfakes, we can inform practical interventions - such as education and technological tools - to protect people from deception."
The first stage of the study will involve tracking the reported engagement of approximately 1,200 participants with a random mix of authentic and deepfake videos. Researchers will also collect self-reported data on participants' previous responses to altered content.
Phase Two will use biometric analysis to measure the physical reactions of around 100 participants to deepfake videos - including heart rate and eye tracking - in a specialist lab at Griffith University, a project partner.
The final phase will follow about 30 participants as they interact with online content in their daily lives. An app installed on their phones and desktops will monitor how they engage with digital media - what they share, whether they fact-check, and how they report suspected deepfakes.
Other members of the research project are UniSC Associate Professor Rory Mulcahy, Dr Aimee Riedel (Griffith University) and Dr Lucas Whittaker (Swinburne University).