Instagram now alerts parents if their teen searches for suicide or self-harm content
Parents will be informed if their teen searches for suicide or self-harm content and offered resources.
An email chain with Instagram head Adam Mosseri indicated the company was aware of teen safety issues in DMs in 2018, but didn’t launch its unwanted nudity filter until 2024.
Meta’s CEO was questioned about the addictive design of its social media apps, including Instagram, and about other harms to teens.
Instagram introduced custom app icons for its teen accounts, but not everyone is thrilled. In replies to the company’s post on X, users criticized the feature for being age-gated, arguing that adults like personalization, too.
YouTube is introducing age-detection technology in the U.S. to identify teens on the platform and apply protections to their accounts.
The update will impact the accounts of family vloggers/creators and parents running accounts for “kidfluencers,” both of which have faced criticism for the risks associated with sharing children’s lives on social media.
72% of US teens have tried AI companions, and 52% said they use them regularly.