Children Exposed to Guns and Self-Harm Within Minutes on Social Media Platforms
A new study has revealed that British children can be exposed to deeply harmful content, including guns, self-harm, misogyny, and explicit sexual material, within minutes of creating social media profiles. The research, conducted by the Big Tech's Little Victims campaign, highlights how technology firms design systems that prioritise engagement over safety, using powerful, unregulated algorithms that can rapidly escalate exposure to dangerous content.
Experiment Uncovers Alarming Trends Across Major Platforms
The experiment involved creating four fictional profiles on TikTok, Instagram, Snapchat, and YouTube, based on typical 13-year-old girls and boys in the UK with common interests such as gaming, beauty, music, and sport. Researchers used each platform for up to 30 minutes daily, scrolling as a child would, over the course of a week. The findings show that these profiles were served hundreds of pieces of concerning content, including material glamourising guns and knives, explicit references to sex and pornography, extreme fitness and diet promotions, and encouragement of misogyny, isolation, self-harm, and even suicide.
On average, concerning content appeared within just three minutes of logging on, with one piece of harmful or inappropriate material shown for every minute spent scrolling. In some sessions, harmful content was the very first thing served, and algorithmic loops made it difficult or impossible to escape escalating harm. For instance, in one 30-minute session on Snapchat, researchers flagged 86 pieces of concerning content.
Gender and Platform Disparities in Harmful Exposure
The study also revealed clear differences across platforms and genders. Concerning content appeared more frequently and escalated most sharply on TikTok and Snapchat, while Instagram was mostly age-appropriate. An adult researcher using Snapchat reported having to step away due to extreme self-harm and suicidal ideation content.
Girls were disproportionately served extreme body-focused and sexualised content, such as "thinspiration" and body shaming, often alongside self-harm and suicidal ideation material. TikTok served girl profiles content on extreme health, fitness, or diet in 92% of sessions and sexualised content in 83% of sessions. Meanwhile, boys were funnelled towards violence, misogyny, and radicalisation, with repeated exposure to weapons, hostile content about women, racist and anti-immigration narratives, and figures linked to extremist views like Tommy Robinson and Andrew Tate. On TikTok and YouTube, boys' profiles encountered hate speech or racist narratives in 77% of sessions, and on TikTok alone, misogynistic content appeared in 85% of boys' sessions compared to only 13% for girls.
Calls for Government Action and Industry Response
Daniel Kebede, general secretary of the National Education Union, which leads the campaign, stated: "What this experiment shows is shocking, but not surprising. Children are being exposed to deeply harmful content on social media, even when platforms know their age. This is not accidental – it is how these systems are designed." He emphasised that at 13, children's minds are still developing, yet they are targeted by algorithms built to maximise engagement at any cost, leading to rising misogyny, worsening concentration, and exhaustion among pupils.
Campaigners are reiterating calls for the government to raise the age of social media access to 16, warning that every day of delay leaves thousands more children exposed to harm and exploitation. Actress Natalie Cassidy, an ambassador for the campaign, called the findings "every parent's worst nightmare," urging the government to prioritise children's safety over Big Tech's profits.
In response, TikTok noted it limits content for under-18s, sets age limits for features like the 'For You' feed, and uses restrictive privacy settings for younger users, adding that it is reviewing the findings. Snapchat stated it is a visual communications app that doesn't apply algorithms to unvetted content and takes action against harmful material, while Meta highlighted updates to Teen Accounts on Instagram that limit exposure to sensitive content. The Independent has approached YouTube and the UK government for comment.