(WDNews) – Instagram said Thursday it will begin notifying parents if their teens repeatedly search for terms clearly associated with suicide or self-harm on the platform.
The alerts will be sent only to parents enrolled in Instagram’s parental supervision program. Notifications may be delivered by email, text message or WhatsApp, depending on the parent’s contact information, and will also appear as a notification within the parent’s Instagram account.
The social media platform, owned by Meta Platforms Inc., said it already blocks such content from appearing in teen users’ search results and instead directs those teens to appropriate helplines and support resources.
In a blog post announcing the change, Meta said the goal is to give parents additional tools to intervene when necessary.
“Our goal is to empower parents to step in if their teen’s searches suggest they may need support,” the company said. “We also want to avoid sending these notifications unnecessarily, which, if done too much, could make the notifications less useful overall.”
Meta also said it is developing similar parental notifications tied to teens’ interactions with artificial intelligence tools on its platforms. The company said parents would be alerted if a teen attempts to engage in certain types of conversations related to suicide or self-harm with its AI systems.
“This is important work and we’ll have more to share in the coming months,” the company said.
The announcement comes as Meta faces mounting legal scrutiny over the impact of its platforms on young users. The company is currently involved in two separate trials.
One trial underway in Los Angeles is examining whether Meta’s platforms are deliberately designed in ways that addict and harm minors. Another case in New Mexico is focused on whether the company failed to protect children from sexual exploitation on its platforms.
Thousands of families, along with school districts and government entities, have filed lawsuits against Meta and other social media companies. The suits allege the companies intentionally design their platforms to be addictive and fail to adequately shield children from content that can contribute to depression, eating disorders and suicide.
Meta executives, including CEO Mark Zuckerberg, have disputed claims that the company’s platforms cause addiction or direct mental health harm. During questioning in the Los Angeles trial, Zuckerberg said he stands by a previous statement that the existing body of scientific research has not proven that social media causes mental health harms.