Tech Abuse Surge: Smart Devices and AI Weaponised Against Women, Refuge Warns

Women's advocacy groups are issuing urgent calls for technology developers to prioritise women's safety in product design, as domestic abusers increasingly exploit digital tools to stalk, harass, and control their victims. The domestic abuse charity Refuge has revealed alarming statistics showing a sharp rise in technology-facilitated abuse cases, with record numbers of women referred to its specialist services in the final quarter of 2025.

Record Referrals and Complex Cases

Refuge's latest data shows a significant escalation in technology-enabled abuse, with a 62% increase in the most complex cases during the last three months of 2025, affecting 829 women in total. The charity also recorded a 24% rise in referrals involving women under thirty, indicating that younger women are particularly vulnerable to these emerging forms of digital coercion.

Emma Pickering, head of the tech-facilitated abuse team at Refuge, emphasised the severity of the situation. "Time and again, we see what happens when devices go to market without proper consideration of how they might be used to harm women and girls," she stated. "It is currently far too easy for perpetrators to access and weaponise smart accessories, and our frontline teams are seeing the devastating consequences of this abuse."

Weaponised Wearables and Smart Technology

Recent cases documented by Refuge reveal how perpetrators are exploiting various technologies to maintain control over their victims. Wearable devices such as smartwatches, Oura rings, and Fitbits are being used to track and stalk women, while smart home systems that control lighting and heating are manipulated to disrupt daily lives and create psychological distress.

One survivor, Mina, shared her harrowing experience of fleeing an abusive partner only to discover he was using her abandoned smartwatch to track her movements through linked cloud accounts. "It was deeply shocking and frightening. I felt suddenly exposed and unsafe, knowing that my location was being tracked without my consent," she recounted. "It created a constant sense of paranoia; I couldn't relax, sleep properly, or feel settled anywhere because I knew my movements weren't private."

Despite police involvement, Mina was located at her subsequent refuge accommodation by a private investigator hired by her abuser, who likely used technological tracking methods. Authorities informed her that no crime had been committed since she had "not come to any harm," highlighting significant gaps in legal protections against technological stalking.

AI Manipulation and Emerging Threats

Beyond physical tracking devices, abusers are increasingly turning to artificial intelligence tools to manipulate and control survivors. Pickering described how perpetrators use AI spoofing applications to impersonate individuals and alter videos to make survivors appear intoxicated or unstable, potentially undermining their credibility with social services or in custody disputes.

"We'll see more and more of that as these videos and applications advance," Pickering warned, noting that AI is also being used to create convincing fraudulent documents such as job offers or legal summonses. These fabricated materials can be deployed to convince survivors they owe debts or to lure them to locations where their abusers await.

Looking ahead, Pickering expressed concern about the potential misuse of medical technology, including the possibility of abusers manipulating insulin levels through diabetes tracking devices—a practice that could prove fatal.

Calls for Regulatory Action and Industry Accountability

Refuge is urging both government and technology companies to take decisive action to address this growing crisis. Pickering criticised current regulatory frameworks, stating that "Ofcom and the Online Safety Act don't go far enough" in protecting vulnerable individuals from technology-facilitated abuse.

The charity is calling for:

  1. Increased government funding to develop and train specialised digital investigation teams
  2. Stronger accountability measures for technology companies regarding safety-by-design principles
  3. Regulatory frameworks that prioritise women's safety from the earliest stages of product development

"It is unacceptable for the safety and wellbeing of women and girls to be treated as an afterthought once a technology has been developed and distributed," Pickering asserted. "Their safety must be a foundational principle shaping both the design of wearable technology and the regulatory frameworks that surround it."

A government spokesperson responded that tackling violence against women and girls in all forms, including technology-facilitated abuse, remains a top priority. They referenced the new Violence Against Women and Girls strategy and ongoing collaboration with Ofcom to address disproportionate online abuse targeting women.

As technology continues to advance at a rapid pace, Refuge's warnings underscore the urgent need for proactive safety measures to prevent everyday digital tools from becoming instruments of coercion and control in domestic abuse.