{"id":20441,"date":"2025-06-12T11:00:32","date_gmt":"2025-06-12T11:00:32","guid":{"rendered":"https:\/\/scannn.com\/combating-nudify-apps-with-lawsuit-new-technology\/"},"modified":"2025-06-12T11:00:32","modified_gmt":"2025-06-12T11:00:32","slug":"combating-nudify-apps-with-lawsuit-new-technology","status":"publish","type":"post","link":"https:\/\/scannn.com\/lv\/combating-nudify-apps-with-lawsuit-new-technology\/","title":{"rendered":"Combating Nudify Apps with Lawsuit & New Technology"},"content":{"rendered":"<div>\n<p><span style=\"font-weight: 400\">Across the internet, we\u2019re seeing a concerning growth of so-called \u2018nudify\u2019 apps, which use AI to create fake non-consensual nude or sexually explicit images. Meta has longstanding rules against non-consensual intimate imagery, and over a year ago we <\/span><a href=\"https:\/\/transparency.meta.com\/en-gb\/policies\/community-standards\/adult-sexual-exploitation\/\"><span style=\"font-weight: 400\">updated these policies<\/span><\/a><span style=\"font-weight: 400\"> to make it even clearer that we don\u2019t allow the promotion of nudify apps or similar services. 
We remove ads, <\/span><span style=\"font-weight: 400\">Facebook Pages and Instagram accounts promoting these services when we become aware of them, block links to websites hosting them so they can\u2019t be accessed from Meta platforms, and restrict search terms like \u2018nudify\u2019, \u2018undress\u2019 and \u2018delete clothing\u2019 on Facebook and Instagram so they don\u2019t show results.<\/span><\/p>\n<p><span style=\"font-weight: 400\">Today, we\u2019re sharing updates to our approach to tackling these apps, including new steps to remove them from our platforms, help other companies do the same and hold the people behind them accountable.<\/span><\/p>\n<h2>Taking Legal Action<\/h2>\n<p><span style=\"font-weight: 400\">We\u2019re suing Joy Timeline HK Limited, the entity behind CrushAI apps, which allow people to create AI-generated nude or sexually explicit images of individuals without their consent. We\u2019ve filed a lawsuit in Hong Kong, where Joy Timeline HK Limited is based, to prevent them from advertising CrushAI apps on Meta platforms.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">This follows multiple attempts by Joy Timeline HK Limited to <\/span><a href=\"https:\/\/transparency.meta.com\/en-gb\/policies\/ad-standards\/deceptive-content\/circumventing-systems\/\"><span style=\"font-weight: 400\">circumvent<\/span><\/a><span style=\"font-weight: 400\"> Meta\u2019s ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules.<\/span><\/p>\n<p><span style=\"font-weight: 400\">This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it. 
We\u2019ll continue to take the necessary steps \u2013 which could include legal action \u2013 against those who abuse our platforms like this.\u00a0<\/span><\/p>\n<h2>Fighting Nudify Apps Across the Internet<\/h2>\n<p><span style=\"font-weight: 400\">With nudify apps being advertised across the internet \u2013 and available in App Stores themselves \u2013 removing them from one platform alone isn\u2019t enough. Now, when we remove ads, accounts or content promoting these services, we\u2019ll share information \u2013 starting with URLs to violating apps and websites \u2013 with other tech companies through the <\/span><a href=\"https:\/\/about.fb.com\/news\/2023\/11\/lantern-program-protecting-children-online\/\"><span style=\"font-weight: 400\">Tech Coalition\u2019s Lantern program<\/span><\/a><span style=\"font-weight: 400\">, so they can investigate and take action too. Since we started sharing this information at the end of March, we\u2019ve provided more than 3,800 unique URLs to participating tech companies. We already share signals about violating child safety activity, including sextortion, with other companies, and this is an important continuation of that work.<\/span><\/p>\n<h2>Strengthening Our Enforcement Against Adversarial Advertisers<\/h2>\n<p><span style=\"font-weight: 400\">Like other types of online harm, this is an adversarial space in which the people behind it \u2013 who are primarily financially motivated \u2013 continue to evolve their tactics to avoid detection. For example, some use benign imagery in their ads to avoid being caught by our nudity detection technology, while others quickly create new domain names to replace the websites we block.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">That\u2019s why we\u2019re also evolving our enforcement methods. 
For example, we\u2019ve developed new technology specifically designed to identify these types of ads \u2013 even when the ads themselves don\u2019t include nudity \u2013 and use matching technology to help us find and remove copycat ads more quickly. We\u2019ve worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect within these ads.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">We\u2019ve also applied the tactics we use to disrupt networks of coordinated inauthentic activity to find and remove networks of accounts operating these ads. Since the start of the year, our expert teams have run in-depth investigations to expose and disrupt four separate networks of accounts that were attempting to run ads promoting these services.<\/span><\/p>\n<h2>Supporting Effective Legislation<\/h2>\n<p><span style=\"font-weight: 400\">We welcome legislation that helps fight intimate image abuse across the internet, whether it\u2019s real or AI-generated, and that complements our longstanding efforts to help prevent this content from spreading online through tools like <\/span><a href=\"http:\/\/stopncii.org\"><span style=\"font-weight: 400\">StopNCII.org<\/span><\/a><span style=\"font-weight: 400\"> and <\/span><a href=\"https:\/\/takeitdown.ncmec.org\/\"><span style=\"font-weight: 400\">NCMEC\u2019s Take It Down<\/span><\/a><span style=\"font-weight: 400\">. That\u2019s why we championed and are working to implement the new U.S. 
TAKE IT DOWN Act, an important bipartisan step forward in fighting this kind of abuse across the internet and supporting those affected.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">We also continue to <\/span><a href=\"https:\/\/medium.com\/@AntigoneDavis\/parenting-in-a-digital-world-is-hard-congress-can-make-it-easier-6039e7a350b1\"><span style=\"font-weight: 400\">support legislation<\/span><\/a><span style=\"font-weight: 400\"> that empowers parents to oversee and approve their teens\u2019 app downloads. As well as sparing parents the burden of repeated approvals and age verification across the countless apps their teens use, this would also allow parents to see if their teen is attempting to download a nudify app from an App Store, and prevent them from doing so.<\/span><\/p>\n<\/div>\n<p><a href=\"https:\/\/about.fb.com\/news\/2025\/06\/taking-action-against-nudify-apps\/\">Source link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Across the internet, we\u2019re seeing a concerning growth of so-called \u2018nudify\u2019 apps, which use AI to create fake non-consensual nude or sexually explicit images. 
Meta has longstanding rules against non-consensual intimate imagery, and over a year ago we updated these policies to make it even clearer that we don\u2019t allow the promotion of nudify apps [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":20442,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[123],"tags":[],"class_list":["post-20441","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-facebook"],"_links":{"self":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/posts\/20441","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/comments?post=20441"}],"version-history":[{"count":0,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/posts\/20441\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/media\/20442"}],"wp:attachment":[{"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/media?parent=20441"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/categories?post=20441"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/scannn.com\/lv\/wp-json\/wp\/v2\/tags?post=20441"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}