
Instagram, the Meta-owned social media platform, is facing renewed scrutiny after reports that it recommends sexually suggestive and explicit content to teenage users. The allegations have sparked outrage among child safety advocates and parents, who are calling for stricter regulation of social media platforms.
What the Investigation Found
An investigation found that Instagram's recommendation algorithm frequently suggests accounts featuring adult content to users as young as 13. These recommendations appear in the platform's 'Suggested Posts' and 'Explore' sections even when teens follow only age-appropriate accounts.
Meta's Response
Meta has stated that it employs 'industry-leading tools' to protect young users and removes content that violates its policies. Critics counter that the company's algorithms prioritise engagement over safety, which leads to inappropriate recommendations.
Growing Calls for Regulation
The findings have intensified demands for stronger online safety measures, particularly as the UK's Online Safety Bill moves towards becoming law. Child protection groups are urging platforms to implement stricter age verification and content moderation systems.
What Parents Can Do
- Enable parental controls on Instagram
- Monitor your child's activity regularly
- Discuss online safety openly with teens
- Report inappropriate content immediately
As the debate continues, pressure mounts on social media companies to balance algorithmic recommendations with child protection responsibilities.