Digital Devices Can Be Weapons in Relationships

According to the World Health Organization, intimate partner violence (IPV) is an international crisis that significantly impacts the health, wellbeing and safety of people (mostly women).

While technology has become an essential part of living a convenient, connected and empowered life, over the last decade we have seen a substantial increase in technology-facilitated abuse in relationships (known as TAR).

The technology we rely on every day can be used to facilitate intrusive and controlling behaviour, making people feel uneasy, uncomfortable and fearful.

Sadly, the weaponising of our digital media and devices is becoming increasingly common - and we're seeing its effects in the lives of victim-survivors and in emerging trends in family violence homicide and filicide.

This is an issue we must take seriously.

BECOMING AWARE

When digital media and devices are co-opted for harm, this compromises our freedom to use technology for work, education, socialising and leisure, as well as for managing our health and households.

In some cases, digital devices can become weapons in relationships.

Sound dramatic? If we look at commonly deployed TAR behaviours, we can begin to understand the extent of the harm that can occur in relationships.

These kinds of behaviours include (but are not limited to) online tracking and stalking, threats and harassing messages, unauthorised accessing and sharing of personal images and information, doxxing, impersonation, and other actions calculated to humiliate, coerce and control a partner.

Our recent study has revealed a wide range of technological tools chosen by people to restrict or harm their partner. But encouragingly, we also found there are opportunities to interrupt and prevent TAR from happening in the first place.

Although our awareness of technology-facilitated abuse in relationships is increasing, our knowledge about how or why some people choose to do this is limited.

Most current research on TAR focuses on victim-survivor and practitioner perspectives rather than perpetrators' motivations for choosing this form of abuse.

So, part of the key to preventing and addressing this harm is engaging with those who identify as using TAR in their relationships, even though that might be confronting to do.

THE CHALLENGE OF PERPETRATOR RESEARCH

Some perpetrators view their problematic behaviour as normal.

On top of this, there can be shame attached to admitting to certain acts or even criminal consequences. Unsurprisingly, this can make it hard to recruit research participants.

Despite this, it is vital that we engage with people who perpetrate TAR so we can build our understanding about how to intervene or stop harmful behaviours from occurring in the first place.

That is exactly what our project did.

UNDERSTANDING THE 'WHY' OF TAR

Our team is made up of health care professionals, technologists, data ethicists, sociologists and criminologists, and is funded by the Melbourne Social Equity Institute.

We engaged 35 people who self-identified as having used harmful technological behaviours in their relationships, asking each to complete a short story for us. This narrative approach has been used with 'hard to reach' populations (like those who use abusive behaviours) and for exploring uncomfortable or sensitive topics.

Specifically, we recruited participants who identified as having engaged in harmful behaviours that would have made their partner uneasy, uncomfortable or fearful.

Using hypothetical scenarios, our study participants wrote about how their story characters would respond to everyday situations, detailing the characters' motives and technology-facilitated actions to monitor, stalk, and non-consensually obtain and retain intimate images.

The most common emotion described in the stories as triggering someone to use TAR was anger.

This anger ranged from mild to extreme depending on where the participant took the story, and was often used to justify the use of TAR.

Other emotions and motivations frequently identified were fear, sadness and suspicion - all of which were used as provocations to legitimise behaviours like snooping - for example, "[the character] finishes checking the phone for all the different apps that [their partner] can message on".

The need for control in a relationship appeared to be another common motivation, and some participants described their characters as inherently controlling people.

This need was frequently mentioned, and justified, when a character said they were suspicious of their partner.

The most frequent TAR behaviours in the stories included snooping, social media checking, location tracking, misuse of images and other files, and the use of spyware and tracking software.

Encouragingly, our work found there are avenues to interrupt and stop these behaviours.

One participant said: "it would be better for me to hear the truth from [my partner] rather than reading the messages" and "…upon thinking more about it more thoroughly of how he values and respects [his partner's] privacy, he understands that that's not the right thing to do."

RESPONSIBILITY OF TECH DEVELOPERS

Our research suggests people can be deterred from engaging in TAR and abusive behaviours.

As many of us have experienced, technology developers often use persuasive technologies to encourage us to purchase products and change behaviours around things like exercise and eating habits.

Given this, these companies have the ability - arguably a responsibility - to play a role in protecting their end-users from other users' abusive behaviours.

We urge technology developers to consider how technology can be harnessed to detect and intervene in TAR.

Developing technologies capable of recognising 'red flags' (like escalating or repeated visits to personal profiles) and creating and presenting nudges or prompts in response has real promise.
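As a purely illustrative sketch, such a nudge mechanism might look something like the following. Everything here - the 24-hour window, the visit threshold, the class and method names, and the wording of the prompt - is a hypothetical assumption invented for this example, not a description of any existing platform's system.

```python
# Hypothetical sketch only: the event model, thresholds and nudge wording
# below are assumptions for illustration, not any real platform's API.
from collections import deque
from datetime import datetime, timedelta
from typing import Deque, Optional

WINDOW = timedelta(hours=24)  # assumed look-back window for 'repeated visits'
THRESHOLD = 10                # assumed visit count treated as a 'red flag'


class ProfileVisitMonitor:
    """Tracks one viewer's visits to one profile within a sliding time window."""

    def __init__(self) -> None:
        self.visits: Deque[datetime] = deque()

    def record_visit(self, when: datetime) -> Optional[str]:
        """Record a visit; return a nudge message if the pattern looks repeated."""
        self.visits.append(when)
        # Discard visits that have aged out of the look-back window.
        while self.visits and when - self.visits[0] > WINDOW:
            self.visits.popleft()
        if len(self.visits) >= THRESHOLD:
            return (
                f"You've viewed this profile {len(self.visits)} times "
                "in the last 24 hours. Is everything OK?"
            )
        return None


# Usage example: simulate one viewer checking a profile every 30 minutes.
if __name__ == "__main__":
    monitor = ProfileVisitMonitor()
    start = datetime(2024, 1, 1, 9, 0)
    for i in range(12):
        nudge = monitor.record_visit(start + timedelta(minutes=30 * i))
        if nudge:
            print(nudge)
            break
```

In a real deployment, decisions about what counts as an escalating pattern, and how any prompt is worded, would of course need input from family violence practitioners and careful privacy design.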

We also call on governments to embed content on TAR behaviours into education and awareness programs so the wider community can recognise and challenge TAR.

This knowledge will enable the technology industry to develop mechanisms to intervene in, interrupt and prevent these harmful behaviours, and will inform important targeted responses from the sector.

Only by understanding the motivations of people who have used TAR - and getting big tech to play a role in protecting victim-survivors - can we prevent it from happening in the first place.

If you need support or more information, please contact the 1800RESPECT national helpline on 1800 737 732 or Lifeline on 13 11 14.
