How smart home devices are being used in domestic violence situations
September 22, 2020
Brisbane-based family lawyer Kay Feeney has seen a lot when it comes to domestic violence, dealing with forms of abuse that are both visible and invisible. Clients have come into her practice, Feeney Family Law, having been physically or sexually assaulted. They also come in having had their partners watch their every move by installing cameras in their homes, placing tracking devices on their cars, or constantly checking in through GPS applications on their mobile phones to see exactly where they go.
She’s seen people who have so little control of their own finances they’ve had to stand at the check-out in the supermarket with a trolley full of groceries for their families, call their partners and ask for the exact amount required to complete the transaction, and wait for it to be transferred.
Domestic violence can occur in many ways. Its broad definition includes emotional abuse, psychological abuse, and financial abuse. The Australian Bureau of Statistics estimates that almost one in four women, and one in 20 men, have experienced emotional abuse by a current or previous partner. Meanwhile, approximately 16 per cent of women, and 8 per cent of men, have experienced financial abuse during their lives. “It’s really not that uncommon,” Feeney said.
Sometimes it’s obvious. Sometimes it’s subtle. Sometimes it’s watching and waiting until a partner is on the phone and then remotely blasting the home stereo. Sometimes it’s logging into a person’s inbox and deleting messages. Sometimes it’s accessing a smart home system and repeatedly raising the temperature to an extreme. Sometimes it’s using a smartphone to change the home alarm code.
It’s done cruelly, not just as a mere annoyance or a joke. It’s designed to assert dominance over the other person, make them doubt their judgment, question their memory, and develop anxiety. It’s commonly called gaslighting: manipulating someone into doubting their own sanity.
The obvious question, when this behaviour reaches a crisis point, is why the victim doesn’t leave. Dr Lillian Nejad, a clinical psychologist from Omnipsych in Melbourne, said it’s not that easy.
“Ongoing manipulation can cause people to stop trusting their own perceptions and interpretations and feelings. It can make people doubt their sanity and feel powerless and helpless, (making them) more dependent on their abuser,” she said. “It can have a long-term impact on a person’s mental health.”
Perhaps they fear retribution. Maybe their sense of self-worth has eroded to a point where they feel they deserve to be punished. It could be that they don’t know how to access help, they feel ashamed about their predicament, they are isolated from loved ones, or they have no financial means.
“Another aspect that keeps people there is hope,” Nejad said. “Abusers aren’t always abusive. They can be apologetic, calm, kind. This intermittent reinforcement can keep hope alive in the person being abused that the situation is improving, and a sense of obligation to stay.”
Unfortunately, technology is at the centre of many cases of modern-day abuse, particularly in cases of coercive control. It’s likely that the inventors, excited about finding new ways to connect people, never intended their devices to be used to inflict harm. It’s too easy to create a relentless pattern.
“I’ve seen a lot of sinister stuff going on, but personally I believe that few of us in the technical community intentionally create technology to cause harm,” said Lesley Nuttall, a security and policy expert at IBM. “It is possible that we are disconnected from the unintended effects of (different technological) creations.”
About a year ago, she and her UK-based team started looking into technology-facilitated abuse. They noticed the bulk of help focused on educating victims and survivors about technology, encouraging them to install two-factor authentication and take other security measures. However, she worried that apps being built to try and help people in abusive relationships could do more harm than good.
What if the abuser found an app storing evidence of abuse? It could trigger a severe escalation. What if they found evidence of messages to family? They could remove access, increasing isolation.
After an extensive research process, Nuttall’s team developed a set of five principles designed to help technologists shift the onus of safety away from the end-user, the victim or survivor, and tackle it from the get-go through thoughtful design. Published by the IBM Policy Lab, these principles include promoting diversity in development; guaranteeing privacy and choice by informing users when someone else accesses their data or location; notifying the primary user when changes are made to their account; strengthening security and data; and making technology more intuitive to empower users.
“There has always been a range of strategic behaviours used by abusers to exert control over their victim, but in the 21st century there is a range of technologies available to help. It’s just about shifting the modality,” said Hayley Foster, CEO of Women’s Safety NSW. She said technology-facilitated abuse, including constant surveillance, occurs in almost all cases of domestic violence.
“What we’re finding more is people using (smart home devices). We’ve had women who are literally watched 24/7 while they’re in the home, even while their partner is at work. They’ll get messages, a running commentary on what they should or shouldn’t be doing while they’re caring for infants, or cleaning, or cooking,” she said. “They have to ask approval for every little detail.”
Small things can make a big difference, she said. The first step in helping someone in an abusive relationship is to ensure they have access to a safe phone they can use to access help. However, she agrees with IBM’s principles that being notified when someone else has control of an account, being reminded how many devices can access messages and location, being asked to change passwords often, and being able to maintain a record of what’s going on are critical not just to a person’s wellbeing, but their future.
“They’re wanting to believe themselves, but they’re often doubting their own judgment,” she said. “They will be looking for signs to validate themselves, and technology can help with that.”
If you need assistance please call 1800 RESPECT or triple-0 (000).
This article was created in partnership with IBM.