On November 16, 2022, the Digital Services Act (DSA) entered into force across the European Union (EU). The DSA establishes new regulations applicable to “online intermediaries,” such as online marketplaces, social networking platforms, and internet service providers. The DSA is intended to encourage market growth and establish clear, transparent accountability for digital spaces. Although the DSA has been in force for nearly eight months, it provides for a transitional period before full application. That transitional period ends on February 17, 2024, by which date organizations must have the requisite procedures in place to address DSA requirements.

Who is covered?

By enacting the DSA, the European Parliament (“Parliament”) created new requirements for online intermediary service providers. Per Parliament’s broad definition, online intermediary service providers (“Providers”) can encompass all information society services (services normally provided for remuneration, at a distance, and at the individual request of a recipient). Specifically, the DSA covers Providers that:

  • are a “mere conduit” for transmitting information and/or providing access (e.g., telecommunications service providers);
  • store, or host, information provided by a user and/or a third party (e.g., app stores, content-sharing platforms, and online travel and accommodation platforms); or
  • store information more permanently at the user’s request (e.g., cloud services).

What is required for covered organizations? 

The DSA details requirements for Providers to enhance trust and transparency in the online ecosystem. The DSA requires Providers to:

  • publish annual transparency reports on their content moderation activities, including the measures they take to apply and enforce their terms and conditions;
  • cooperate with national authorities following orders, such as informing the relevant supervisory authority of any follow-up given to an order;
  • conspicuously publish clear terms and conditions governing their content moderation practices, and provide easily accessible information on the right to terminate the use of their services; and
  • designate a single electronic point of contact for official communication with supervisory authorities in the EU, even if the Provider is not established in the EU.

The DSA will impose additional requirements, based on a tiered approach, depending on the Provider’s size and the type of services the Provider offers. All Providers of online platforms and online search engines, except small and micro platforms (as defined in Commission Recommendation 2003/361/EC), were expected to publish their number of active users on their websites by February 17, 2023, and must repeat this disclosure at least every six months. The European Commission (“Commission”) encouraged Providers to share their numbers with the Commission to assist with the category designation process. There are four categories:

  • Intermediary service providers (a broad category that divides into several sub-categories, such as internet access providers, domain name registrars, and the three sub-categories listed below);
  • Hosting services (e.g., cloud services that store information for the user);
  • Online platforms, or Providers that bring together sellers and consumers (e.g., social media platforms); and
  • Very large online platforms (VLOPs), online platforms with more than 45 million active monthly users in the EU, and very large online search engines (VLOSEs), online search engines with more than 45 million active monthly users in the EU.

On April 25, 2023, the Commission released its first round of category designations and detailed next steps for designated VLOPs and VLOSEs. Based on category designation, the additional requirements range from Providers needing to report criminal offenses to national law enforcement or judicial authorities, to annual risk assessments for VLOPs and VLOSEs. A more comprehensive list of the additional requirements outlined in the DSA can be found here.

Although the DSA imposes requirements for Providers to combat illegal content on their platforms, the DSA does not define or provide examples of what constitutes “illegal content.” As a result, when enforcing the DSA, Member States may experience challenges in applying their own definitions of illegal content. This is because the general rule in the EU is that if content is illegal in a specific Member State, then the content “should only be removed in the territory where it is illegal.”

How does the DSA impact US businesses?

As stated above, the DSA applies to all Providers who offer their services to users residing in the EU, regardless of whether the Provider resides in the EU.

Moving forward, US businesses should be mindful of whether they qualify as a Provider under Parliament’s broad definition, as a business’s failure to address requirements under the DSA could result in fines and private enforcement actions.

Member States must appoint a Digital Services Coordinator by February 17, 2024, who will enforce DSA regulations and impose fines as deemed appropriate. Fines can be up to six percent of a Provider’s annual global turnover. Private actors, such as users of online services, can file a complaint with the Digital Services Coordinator when the private actor has reason to believe a Provider has violated the DSA. This filing can lead to an official proceeding against the Provider in a court of the Member State in which the private actor is domiciled. Successful proceedings can result in the private actor recovering compensation for damage and loss from the Provider.

If a US business qualifies as a VLOP or VLOSE, then the Commission will also enforce DSA regulations against the Provider. Under the DSA, the Commission is entitled to charge VLOPs and VLOSEs an annual fee for its supervisory services. The amount of the Commission’s fee will be based on the overall costs incurred by the Commission in exercising its supervisory tasks, as reasonably estimated beforehand. More information on the Commission’s supervisory fee can be found here.

If you have questions surrounding the DSA and DSA requirements, reach out to a member of Taft’s Privacy and Data Security Practice. For more information on data privacy and security regulations and other data privacy questions, please visit Taft’s Privacy & Data Security Insights blog and the Taft Privacy and Data Security mobile application.

Taft Summer Associate Celeste Friel contributed to the research and writing of this article. Celeste attends the University of Dayton School of Law in Dayton, Ohio.

Jennifer Brumby

Jennifer represents clients throughout Ohio in both federal and state courts in litigation matters related to employment agreements, personnel policies and workers’ compensation. With previous experience working in-house as a human resources manager and attorney in the public sector, Jennifer has direct experience working with clients on labor and employment issues such as recruitment, performance evaluations, disciplinary actions, benefits programs, collective bargaining matters and ensuring employee procedures and policies are in compliance with state and federal laws.

Zachary Heck

Zach’s practice focuses on privacy and data security. Specifically, Zach assists clients in the areas of privacy compliance, defense litigation, class action defense and guidance in the aftermath of an information security event, including data breach. Zach has experience advising clients with respect to FTC investigations, federal privacy regulations such as HIPAA, FCRA, TCPA, and GLBA, as well as state laws governing personally identifiable information. For his clients, he also provides regulatory analysis, risk management, policy development, training and audits.