Complying with the Digital Services Act


Last year the European Union enacted a new set of regulations known as the Digital Services Act (DSA), designed to harmonize content regulations across the EU and create specific processes for online content moderation. The DSA applies to many different online services – from marketplaces and app stores to online video sharing platforms and search engines.

As a result, we have adapted many of our long-standing trust and safety processes and changed the operation of some of our services to comply with the DSA’s specific requirements. We look forward to continued engagement with the European Commission and other stakeholders, including technical and policy experts, as we progress this important work.

Planning ahead of today’s regulatory landscape

First and foremost, safety is good for users and good for our business. That’s why over many years across Google, we’ve made significant investments in people, processes, policies and technologies that address the goals of the DSA. A few examples:

  • Our Priority Flagger program (originally established in 2012 as YouTube’s Trusted Flagger program) addresses the objectives of the DSA’s Trusted Flagger provision, prioritizing review of content flagged to us by experts.
  • We give YouTube creators the option to appeal video removals or restrictions where they think we’ve made a mistake. The YouTube team reviews all creator appeals and decides whether to uphold or reverse the original decision. The DSA will require all online platforms to take similar measures and establish internal complaint-handling systems.
  • In the summer of 2021, after consulting with parents, educators, and child safety and privacy experts, we decided to block personalized advertising to anyone under age 18. The DSA will require other providers to take similar approaches.
  • Since launching YouTube’s Community Guidelines Enforcement Report in 2018 to increase transparency and accountability around our responsibility efforts, we’ve continued to publicly share a range of additional metrics, such as the Violative View Rate, to give more context about our work to protect users from harmful content.
  • And there are many other examples, over many years, of how we have continually introduced the types of trust and safety processes envisioned by the DSA.

Alongside these efforts, the Google Safety Engineering Center in Dublin, focused on content responsibility, has consulted with more than a thousand experts at more than a hundred events since its founding. The center helps regulators, policymakers and civil society get a hands-on understanding of our approach to content moderation and offers us valuable opportunities to learn from and collaborate with these experts.

Tailoring transparency and content moderation to the EU’s requirements

Complying at scale is not new to us. We have invested years of effort in complying with the European Union’s General Data Protection Regulation, and have built processes and systems that have enabled us to handle requests for more than five million URLs under Europe’s Right to be Forgotten.

Now, in line with the DSA, we have made significant efforts to adapt our programs to meet the Act’s specific requirements. These include:

  • Expanding ads transparency: We will be expanding the Ads Transparency Center, a global searchable repository of advertisers across all our platforms, to meet specific DSA provisions and providing additional information on targeting for ads served in the European Union. These steps build on our many years of work to expand the transparency of online ads.
  • Expanding data access for researchers: Building on our prior efforts to help advance public understanding of our services, we will increase data access for researchers looking to understand more about how Google Search, YouTube, Google Maps, Google Play and Shopping work in practice, and to conduct research on systemic content risks in the EU.

We are also making changes to provide new kinds of visibility into our content moderation decisions and give users different ways to contact us. And we are updating our reporting and appeals processes to provide specified types of information and context about our decisions:

  • Shedding more light on our policies: We are rolling out a new Transparency Center where people can easily access information about our policies on a product-by-product basis, find our reporting and appeals tools, discover our Transparency Reports and learn more about our policy development process.
  • Expanding transparency reporting: More than a decade ago, we launched the industry’s first Transparency Report to inform discussions about the free flow of information and show citizens how government policies can impact access to information. YouTube also publishes a quarterly Transparency Report describing its Community Guidelines enforcement. In the months ahead, we will be expanding the scope of our transparency reports, adding information about how we handle content moderation across more of our services, including Google Search, Google Play, Google Maps and Shopping.
  • Analyzing risks – and helping others do so too: Whether looking at risks of illegal content dissemination, or risks to fundamental rights, public health or civic discourse, we are committed to assessing risks related to our largest online platforms and our search engine in line with DSA requirements. We will report the results of our assessments to regulators in the EU and to independent auditors, and will publish a public summary at a later date.
