{"id":21396,"date":"2026-02-26T12:54:46","date_gmt":"2026-02-26T12:54:46","guid":{"rendered":"https:\/\/scannn.com\/new-alerts-to-let-parents-know-if-their-teen-may-need-support\/"},"modified":"2026-02-26T12:54:46","modified_gmt":"2026-02-26T12:54:46","slug":"new-alerts-to-let-parents-know-if-their-teen-may-need-support","status":"publish","type":"post","link":"https:\/\/scannn.com\/lv\/new-alerts-to-let-parents-know-if-their-teen-may-need-support\/","title":{"rendered":"New Alerts to Let Parents Know if Their Teen May Need Support"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div>\n<p><span style=\"font-weight: 400\">In the coming weeks, Instagram will start notifying parents using supervision if their teen repeatedly tries to search for terms related to suicide or self-harm within a short period of time. This is the latest protection for <\/span><a href=\"https:\/\/about.fb.com\/news\/2024\/09\/instagram-teen-accounts\/\"><span style=\"font-weight: 400\">Teen Accounts<\/span><\/a><span style=\"font-weight: 400\"> and Instagram\u2019s parental supervision features.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">We understand how sensitive these issues are, and how distressing it could be for a parent to receive an alert like this. The vast majority of teens do not try to search for suicide and self-harm content on Instagram, and when they do, our policy is to block these searches, instead directing them to resources and helplines that can offer support. These alerts are designed to make sure parents are aware if their teen is repeatedly trying to search for this content, and to give them the resources they need to support their teen.<\/span><\/p>\n<h2>How the Alerts Will Work<\/h2>\n<p><span style=\"font-weight: 400\">Next week, parents and teens enrolled in supervision will be notified that Instagram will start sending these new alerts to parents, based on their teens\u2019 search activity. 
Attempted searches that would prompt the alert include phrases promoting suicide or self-harm, phrases that suggest a teen wants to harm themselves, and terms like \u2018suicide\u2019 or \u2018self-harm\u2019.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">The alerts will be sent to parents via email, text, or WhatsApp, depending on the contact information available, as well as through an in-app notification. Tapping on the notification will open a full-screen message explaining that their teen has repeatedly tried to search Instagram for terms associated with suicide or self-harm within a short period of time. Parents will also have the option to view expert resources designed to help them approach potentially sensitive conversations with their teen.<\/span><\/p>\n<p><span style=\"font-weight: 400\">These alerts will roll out to parents who use Instagram\u2019s parental supervision tools in the US, UK, Australia, and Canada next week, and will become available in other regions later this year.<\/span><\/p>\n<h2>Striking the Right Balance<\/h2>\n<p><span style=\"font-weight: 400\">Our goal is to empower parents to step in if their teen\u2019s searches suggest they may need support. We also want to avoid sending these notifications unnecessarily, since too many unwarranted alerts could make the notifications less useful overall.\u00a0<\/span><\/p>\n<blockquote>\n<p><span style=\"font-weight: 400\">\u201cWhen a young person searches about suicide or self-harm, empowering a parent to step in can be extremely important. 
The fact that Meta has now built this in is a meaningful step forward and is the kind of change that child safety experts have been pushing for.\u201d<\/span><\/p>\n<p><em><span style=\"font-weight: 400\">\u2013 Dr Sameer Hinduja, <\/span><span style=\"font-weight: 400\">Co-Director of the <\/span><a href=\"http:\/\/cyberbullying.org\/\"><span style=\"font-weight: 400\">Cyberbullying Research Center\u00a0<\/span><\/a><\/em><\/p>\n<\/blockquote>\n<p><span style=\"font-weight: 400\">In working to strike this important balance, we analyzed Instagram search behavior and consulted with experts from our Suicide and Self-Harm Advisory Group. We chose a threshold that requires a few searches within a short period of time, while still erring on the side of caution. While that means we may sometimes notify parents when there may not be real cause for concern, we feel \u2014 and experts agree \u2014 that this is the right starting point, and we\u2019ll continue to monitor and listen to feedback to make sure we\u2019re in the right place.\u00a0<\/span><\/p>\n<blockquote>\n<p><span style=\"font-weight: 400\">\u201cIt\u2019s vital that parents have the information they need to support their teens. This is a really important step that should help give parents greater peace of mind \u2013 if their teen is actively trying to look for this type of harmful content on Instagram, they\u2019ll know about it.\u201d<\/span><\/p>\n<p><em><span style=\"font-weight: 400\">\u2013 Vicki Shotbolt, CEO of Parent Zone<\/span><\/em><\/p>\n<\/blockquote>\n<h2>Building on Existing Protections<\/h2>\n<p><span style=\"font-weight: 400\">These alerts build on our existing work to help protect teens from potentially harmful content on Instagram. 
We have <\/span><a href=\"https:\/\/transparency.meta.com\/en-gb\/policies\/community-standards\/suicide-self-injury\/\"><span style=\"font-weight: 400\">strict policies<\/span><\/a><span style=\"font-weight: 400\"> against content that promotes or glorifies suicide or self-harm and, while we do allow people to share content about their own struggles with these issues, <\/span><a href=\"https:\/\/transparency.meta.com\/policies\/age-appropriate-content\/\"><span style=\"font-weight: 400\">we hide this content from teens<\/span><\/a><span style=\"font-weight: 400\">, even if it\u2019s shared by someone they follow.\u00a0<\/span><\/p>\n<p><a href=\"https:\/\/about.fb.com\/wp-content\/uploads\/2026\/03\/03_block-search-terms-clearly-associated-with-suicide-and-self-harm-and-direct-people-to-support-resources.jpg\"><img loading=\"lazy\" data-recalc-dims=\"1\" decoding=\"async\" class=\"alignnone size-full wp-image-47292\" src=\"https:\/\/about.fb.com\/wp-content\/uploads\/2026\/03\/03_block-search-terms-clearly-associated-with-suicide-and-self-harm-and-direct-people-to-support-resources.jpg?resize=960%2C836\" alt=\"Blocked search terms UI\" width=\"960\" height=\"836\" srcset=\"https:\/\/about.fb.com\/wp-content\/uploads\/2026\/03\/03_block-search-terms-clearly-associated-with-suicide-and-self-harm-and-direct-people-to-support-resources.jpg?w=1920 1920w, https:\/\/about.fb.com\/wp-content\/uploads\/2026\/03\/03_block-search-terms-clearly-associated-with-suicide-and-self-harm-and-direct-people-to-support-resources.jpg?w=300 300w, https:\/\/about.fb.com\/wp-content\/uploads\/2026\/03\/03_block-search-terms-clearly-associated-with-suicide-and-self-harm-and-direct-people-to-support-resources.jpg?w=768 768w, https:\/\/about.fb.com\/wp-content\/uploads\/2026\/03\/03_block-search-terms-clearly-associated-with-suicide-and-self-harm-and-direct-people-to-support-resources.jpg?w=1024 1024w, 
https:\/\/about.fb.com\/wp-content\/uploads\/2026\/03\/03_block-search-terms-clearly-associated-with-suicide-and-self-harm-and-direct-people-to-support-resources.jpg?w=1536 1536w, https:\/\/about.fb.com\/wp-content\/uploads\/2026\/03\/03_block-search-terms-clearly-associated-with-suicide-and-self-harm-and-direct-people-to-support-resources.jpg?w=1240 1240w, https:\/\/about.fb.com\/wp-content\/uploads\/2026\/03\/03_block-search-terms-clearly-associated-with-suicide-and-self-harm-and-direct-people-to-support-resources.jpg?w=689 689w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\"\/><\/a><\/p>\n<p><span style=\"font-weight: 400\">We work to block searches for terms clearly associated with suicide and self-harm, including terms that violate our suicide and self-harm policies. This means we don\u2019t show any results and instead direct people to resources and local organizations that can help. We also direct people to resources and helplines when their searches aren\u2019t clearly related to suicide and self-harm but relate to mental health more broadly. We\u2019ll continue to alert the emergency services when we become aware of anyone at imminent risk of physical harm \u2014 actions that have saved lives.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">We\u2019re launching these alerts on Instagram search first, but we know teens are increasingly turning to AI for support. While our AI is already trained to respond safely to teens and provide resources on these topics as appropriate, we\u2019re now building similar parental alerts for certain AI experiences. These will notify parents if a teen attempts to engage in certain types of conversations related to suicide or self-harm with our AI. 
This is important work and we\u2019ll have more to share in the coming months.\u00a0<\/span><\/p>\n<\/div>\n<p><a href=\"https:\/\/about.fb.com\/news\/2026\/02\/new-meta-alerts-let-parents-know-if-teen-may-need-support\/\">Source link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In the coming weeks, Instagram will start notifying parents using supervision if their teen repeatedly tries to search for terms related to suicide or self-harm within a short period of time. This is the latest protection for Teen Accounts and Instagram\u2019s parental supervision features.\u00a0 We understand how sensitive these issues are, and how distressing it [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":21397,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[123],"tags":[],"class_list":["post-21396","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-facebook"],"_links":{"self":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/posts\/21396","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/comments?post=21396"}],"version-history":[{"count":0,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/posts\/21396\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/media\/21397"}],"wp:attachment":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/media?parent=21396"}],"wp:term":[{"taxonomy":"category","embeddab
le":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/categories?post=21396"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/tags?post=21396"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}