The Digital Services Act in 10 key points
“What is illegal offline should also be illegal online.” The internet has changed almost every aspect of our society in the last two decades. It has allowed people to communicate and share information with ease.
The Bodyguard.ai team
On the flip side, it has brought about new challenges, whether that be the explosion of toxic comments and “fake news” or extremist content on social networks. The European Union is seeking to put in place a legal framework to legislate and preserve users' rights.
What is the DSA?
Unlike the DMA (Digital Markets Act), which focuses on regulating the market, ensuring a higher degree of competition in Europe, and governing the activities of GAFAM (the so-called VLOPs, or Very Large Online Platforms), the DSA focuses on the services component and the regulation of platforms and social networks to better fight disinformation and online hate. The focus is on making platforms such as Facebook, Twitter, YouTube, and others accountable for the content they host.
Digital Services Act
A new piece of legislation on par with the GDPR, the DMA, and ePrivacy (privacy and electronic communications)
The text was presented in December 2020, will enter into effect in 2023, and should normally apply to all member countries by January 1, 2024
The DMA can be considered the legislative twin of the DSA
Who does it concern?
All social platforms no matter their size, with varying obligations depending on their role, size, and impact
UGC platforms (User-Generated Content): Google, Meta, and Twitter, but also smaller platforms
Digital marketplaces: eBay, Vinted, etc.
Our analysis of Thierry Breton's 10 key points from the Digital Services Act
With great power comes great responsibility
The DSA sets clear and harmonized rules for all platforms, proportional to their size, impact, and risk. For example, platforms with fewer than 45 million users will be exempt from certain DSA obligations.
A harmonized system to combat ALL forms of illegal content - from counterfeit or dangerous products to hate speech
Any national authority will be able to request the removal of illegal content, regardless of where the platform is established in Europe. The term illegal content includes: hate speech, sexual exploitation of children, scams, non-consensual sharing of private images, promotion of terrorism, sale of counterfeit or dangerous products, and copyright infringements.
Protect fundamental rights, including media pluralism, in content moderation
Users will have a say in how they receive recommendations and content. A user's account can only be suspended based on specific, predictably applied rules.
The end of the "I'm just the middle man" excuse!
More protection for consumers in digital marketplaces:
New sanctions for DSA violations
The obligation to have a contact point in Europe
More transparency on the products sold
Information on your business client
You can (finally!) speak your own language
Platforms will need to have the necessary resources for content moderation: number of moderators, specific language skills, etc.
A ban on child-targeted advertisement
More protection against targeted advertisements aimed at children - or based on sensitive data.
Opening up platforms' "black box" of algorithms
Platforms will now have the obligation to be transparent on their recommendation systems.
For example, they will have to disclose whether they use filters or automatic algorithms for content moderation.
Terms and conditions are understandable by all
Terms and conditions on platforms will now have to be clear, accessible, and understandable. Gone are the days when you had to “read” 50 pages of small-print technical jargon.
Possibility of direct supervision by the European Commission of very large platforms
The European Commission will have specialized teams on hand and will hire experts in data, algorithms, and business models. It will also rely on experts on this subject from the Member States.
A possible crisis intervention scheme
We cannot rely solely on the goodwill of platforms in the face of crises, pandemics, or wars. Europe needs a legal tool that requires the major digital players to react quickly in cases of emergency.
The European Commission has adopted its text, and the Parliament voted on a similar one on January 20th. All that is currently left to do is to agree on a single text: this stage is called the "trilogue" phase. It is difficult to say how long this will take, but it is likely to happen during France's presidency of the Council of the European Union, i.e., before the end of June 2022.
The 10 key points that the DSA will implement:
Clear rules will be set according to the size, impact, and risk of social platforms
One of the key objectives is the fight against all forms of illegal content: from counterfeit or dangerous products to hate speech
The preservation of fundamental rights, including media pluralism, will be taken into account in content moderation
Consumers on marketplaces will be better protected: obligation of a contact point in Europe, more transparency on the products sold...
Content moderation will need to be properly implemented on platforms: having an adequate number of moderators, having access to support that speaks your language
Targeted advertisement for children will be prohibited and more protection will be put in place
Platforms will have to disclose how their recommendation systems work: for example, more clarity on the technical inner workings of their platforms
Terms and conditions that are easier to read and understand
The possibility of direct supervision of very large platforms by the European Commission, with specialized teams (experts in data, algorithms, and business models, as well as experts from the Member States)
A legal tool will be implemented to ensure a rapid response from major platforms in the event of an emergency (crisis communication, pandemics, wars, etc.)