Beginning Friday, August 25, users of X (formerly Twitter), TikTok and other large social media apps in the European Union saw major changes to those platforms. The largest social media apps, search engines and app stores in the EU now fall under the jurisdiction of the Digital Services Act. Friday was the deadline set by the European Union for companies designated as Very Large Online Platforms or Very Large Online Search Engines to change how their AI and advertising work.
Two major effects of the DSA can be seen right away: increased scrutiny of the ways in which misinformation can spread and a return to chronological social media feeds as opposed to automated recommendations.
The online platforms affected are Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, X (listed as Twitter), Wikipedia, YouTube, the European clothing retailer Zalando, Bing and Google Search.
What is the Digital Services Act?
The Digital Services Act is legislation from the European Commission, the executive branch of the European Union. The DSA regulates how designated Very Large Online Platforms handle privacy, protect their users (including minors) and operate transparently.
“It is a first-of-a-kind regulatory toolbox globally and sets an international benchmark for a regulatory approach to online intermediaries,” said Johannes Bahrke, coordinating spokesperson for Digital Economy, Research and Innovation in the European Commission, in an email to TechRepublic.
SEE: The world looks to the European Union’s AI draft law as a template for possible future regulation. (TechRepublic)
The obligations the largest online platforms must meet are roughly organized into four categories:
- More user empowerment: This includes letting users opt out of recommendation systems such as social media suggestion algorithms, banning ads based on protected characteristics such as race, and requiring terms and conditions to be clearly understandable.
- Strong protection of minors: This means no advertising can be targeted toward children, and platforms must make special risk assessments regarding the mental health of children.
- Measures to prevent disinformation
- Transparency and accountability: This includes content moderation, risk management and advertising. All compliance with the DSA obligations is subject to independent audit.
“The whole logic of our rules is to ensure that technology serves people and the societies that we live in — not the other way around,” said Margrethe Vestager, executive vice president of the European Commission for a Europe Fit for the Digital Age, in the announcement of which platforms would fall under the act’s jurisdiction. “The Digital Services Act will bring about meaningful transparency and accountability of platforms and search engines and give consumers more control over their online life. The designations made today are a huge step forward to making that happen.”
For CTOs, the DSA will be an experiment in whether regulation in the tech industry can foster, not stifle, innovation. Developers should also watch for regulatory changes in their geographical areas, which may affect the work they are assigned.
What does the DSA require?
The DSA requires European Union member states to create national authorities to enforce the act by February 17, 2024. The European Commission is working on creating a framework in which to carry out enforcement on social media and other digital platforms. The European Centre for Algorithmic Transparency will help enforce the DSA by assessing whether the algorithms used for social media recommendations fall within the act’s risk management requirements.
Businesses will need to adapt to the DSA by ensuring they have proper risk assessments and other compliance measures in place if they operate in the EU. Organizations that use social media may also see different behavior from their social media audiences or advertising as some audience members switch from recommended to chronological feeds.
How the DSA affects algorithmic recommendation systems and trustworthiness
The DSA defines the content that will be under increased scrutiny as “disinformation, hoaxes and manipulation during pandemics, harms to vulnerable groups and other emerging societal harms.” Very Large Online Platforms and social media sites will need to provide a yearly risk assessment of how they handle this type of content.
“Platforms must mitigate against risks such as disinformation or election manipulation … These measures must be carefully balanced against restrictions of freedom of expression, and are subject to independent audits,” the commission wrote.
How the DSA intersects with GDPR
Social media in the European Union is also subject to the General Data Protection Regulation, which went into effect in 2018. The GDPR guarantees certain data privacy and security protections to anyone who lives within the EU. Companies that operate within the EU need to be GDPR compliant, regardless of their place of origin; occasionally, companies such as Meta are fined for violating it.
The DSA was designed with GDPR compliance in mind. It should not result in any changes to how companies comply with GDPR.
What impact does the DSA have on companies outside the EU?
Because many large tech companies are based in the U.S., the DSA affects their EU operations. The European Commission obligates tech companies operating in member states to appoint a legal representative in the EU market.
The DSA may also lead other governing bodies to consider similar rules.
“We realize that we face similar challenges as other like-minded partners, most importantly the US, with which we have started a very important high-level tech dialogue [via] the [Trade] and Technology Council,” Bahrke said. The Trade and Technology Council, convened in 2021, is a political body dedicated to technology and trade between the U.S. and EU.
This is another way in which regulation can foster, not stifle, innovation. The regulation in this case goes hand-in-hand with international cooperation to encourage the economic good of both the U.S. and the EU.
Responses from social media giants
LinkedIn has been working on complying with the DSA for about a year, said Patrick Corrigan, the company’s vice president of legal – digital safety, in a blog post.
“We continue to believe that this type of transparency is important to maintain a safe, trusted, and professional platform and in connection with the DSA, we’re expanding it to include more information about our decisions (regarding the removal of inappropriate content), including whether automated systems or humans did the review,” Corrigan said.
“Meta has long advocated for a harmonised regulatory regime that effectively protects people’s rights online, while continuing to enable innovation,” wrote Meta Global Affairs President Nick Clegg in a blog post. “For this reason, we welcome the ambition for greater transparency, accountability and user empowerment that sits at the heart of regulations like the DSA, GDPR, and the ePrivacy Directive.”
Meta has rolled out chronological, non-recommendation-based feed options in the EU and published more information about its AI recommender systems in advance of the DSA deadline.