
Digital Services Act: new rules for Big Tech

Legal - September 4, 2023

Almost every aspect of our lives is online. In many respects this is a positive development: many work-related tasks, for example, are greatly simplified. When it comes to the personal sphere of each of us, however, being entirely online can create quite a few problems.

Aware of this, it is essential that the internet, too, be regulated and that users' rights be guaranteed online as well, especially when it comes to more vulnerable people, such as minors.

Regulating life online has become a necessity that can no longer be postponed.

And so the European Union has decided to put the necessary tools in place, crafting the Digital Services Act, which officially came into effect on Friday, 25 August 2023.

For the first time, there is a common set of rules defining the obligations and responsibilities of intermediaries within the single market. This will open up new opportunities for providing digital services across borders, while ensuring a high level of protection for all users, regardless of where they reside in the EU.

The Digital Services Act (DSA for short) is truly groundbreaking legislation, considering that the last time such rules were put in place was over twenty years ago, which made them difficult to apply to the current context.

The new European regulation applies uniformly throughout the European Union. Users will thus be guaranteed the same rights in every member state, without any difference. Likewise, companies will have a single regulation as their frame of reference, no longer having to deal with legislation that differs depending on where they are located. The DSA targets intermediary services, hosting services, and online platforms, with a focus on the largest ones. All online intermediaries that offer their services in the single market will be subject to the DSA, regardless of whether they are based in the EU or not.

The main innovation is that all these actors will be subject to different rules depending on their size. This means that the so-called Big Tech companies will face increasingly stringent requirements, precisely because their influence is greater than that of other actors. The companies defined as Big Tech are no longer an abstract entity or a journalistic term, but have been identified by the European Union itself. On 25 April 2023, the European Commission drew up a special list of companies, dividing them into VLOPs (very large online platforms) and VLOSEs (very large online search engines). A specific criterion was followed to draw up this list: identifying the platforms, sites, and companies large enough to have the greatest impact on European citizens.

In particular, Big Tech companies, by virtue of their large size, are more exposed to the spread of illegal content, which in turn damages European society as a whole. Specific rules are therefore provided for platforms that reach more than 10 percent of Europe's 450 million consumers, i.e. at least 45 million users. These include, for example, the social networking sites Facebook, Instagram, Snapchat, TikTok, X (formerly Twitter), LinkedIn, Pinterest, and YouTube; other sites such as Booking.com, Amazon, Zalando, Google Shopping, Alibaba, and AliExpress; the Apple App Store and Google Play; Google Maps and Wikipedia; and finally the two best-known search engines, Google and Microsoft's Bing.

With regard to illegal content posted by users, Big Tech will be obliged to have a team dedicated to handling reports from authorities and users, for which a simpler and more effective system will have to be put in place. Platforms will be required to remove content when it is reported as illegal by national authorities or by individuals. Sales sites will also have to pay more attention to the quality of the products offered for sale, removing illegal products promptly; and, should users have already purchased them, these platforms will be required to notify the buyers concerned or make such information public on their website.

In addition, platforms will have to warn users at risk of suspension and specify the details of the possible suspension decision. This is an important step: it will no longer be sufficient to simply refer to a violation of the terms and conditions; users will have the right to be informed precisely of the reasons why their content has been removed, or why its visibility or monetization has been restricted.

In addition, terms and conditions will have to be set out more clearly and simply, so that they are understandable to everyone.

According to the DSA, large platforms will have to prepare an annual report assessing the risks that the use or abuse of their services poses to fundamental rights, freedom of expression, public debate, and minors. Once these risks are identified, they will have to present solutions to mitigate their impact. Companies will also be subject to external audits.

Every six months, moreover, platforms will have to provide information that has so far remained in the shadows, such as details about the staff moderating their content: their size, their skills, and the European languages they speak. This is an important step in terms of transparency and security.

Very strict limits are also placed on anything that could threaten people's safety or health. In these cases, crisis protocols will be activated together with the Commission in order to reduce the harmful effect of such content.

Another central element of the DSA concerns the controversial issue of algorithms. Under the new regulation, it will no longer be the platforms that decide how users view content; instead, users will be able to choose whether to view content as the algorithm proposes it, in a personalized way, or in chronological order. This measure thus allows users to be less exposed to external influences.

Similarly, restrictions have been placed on advertising. Under the DSA, online advertising will not be allowed to be targeted using sensitive data such as religion, health, or sexual orientation, and, most importantly, it will be prohibited to use children's data to offer them personalized advertising.

Companies will also have to keep track of advertisers: for each advertisement, they will have to record who placed it and who paid for the sponsorship, how long it was shown, and to which group (age, gender, interests, location) it was shown.

Finally, there is also a ban on so-called dark patterns: design tricks in the content or web interface that push users toward potentially harmful behavior regarding the processing of their personal data, influencing them to provide more data than necessary.

From February 2024, the Digital Services Act will be binding not only on Big Tech, but also on all platforms with fewer than 45 million users, with penalties that may amount to 6 percent of global turnover.

The new rules contained in the Digital Services Act are intended to protect the rights of online consumers, which seems a priority at a historical moment like the one we are currently experiencing. But not only that: through this regulation, a common and clear framework of transparency and accountability is finally established for online platforms, especially the larger ones that are able to influence a huge number of people.

The new measures aim to make the web a safe environment for all European citizens, with greater democratic control and oversight of companies and platforms, so as to reduce risks to the health and safety of users and, consequently, disinformation and the manipulation of society. There has been no shortage of protests from the platforms involved, which have often considered these measures too stringent, so much so that some have lodged appeals.

What must remain a priority, however, is ensuring security, transparency, and fair information for European citizens, even if certain interests of the big digital powers have to be sacrificed to do so.