By letter dated 12 October 2023, the European Commission urged X Corp. to provide unspecified information on illegal content and disinformation.
Since the topic is topical again, you will find below the full text of the DSA and a discussion of its main articles (for those who missed my first publication).
DSA in effect since August 25, 2023
REGULATION (EU) 2022/2065 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of 19 October 2022 on a single market for digital services and amending Directive 2000/31/EC (better known as the DIGITAL SERVICES ACT or DSA)
Since 25 August 2023, the European DSA has been legally in force for all very large online platforms and search engines, hosting services and online companies (small and micro enterprises are excluded from the Regulation under Art. 29).
Via this link you can consult the Dutch version of the 67-page Regulation. It is available in a variety of languages.
It is impossible to explain every aspect of the DSA, so I will limit myself to the points most relevant to the very large online platforms such as X, FB and Instagram, and the large online search engines such as Google, Mozilla and Edge.
“Illegal content”
The EU has deliberately chosen to define the term “illegal content” broadly. The definition in the text of the Regulation itself does not reveal its actual intentions.
Article 3(h) defines the term as follows:
“illegal content”: any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or with the law of a Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law;
Article 3(k) also defines “dissemination to the public”:
making information available, at the request of the recipient of the service who provided the information, to a potentially unlimited number of third parties.
In point 12 of the Preamble, the EU says the following in this regard:
To ensure the objective of a safe, predictable and reliable online environment, for the purposes of this Regulation, the term “illegal content” should broadly reflect existing rules in the offline environment. In particular, the term “illegal content” should be broadly defined so that it also includes information related to illegal content, products, services and activities.
The EU’s actual intentions are hidden in points 83, 91 and 108 of the preamble to the Regulation:
A fourth category of risks stems from similar concerns about the design, functioning or use, including through manipulation, of very large online platforms and of very large online search engines, with an actual or foreseeable negative effect on the protection of public health and minors, serious negative consequences for a person’s physical and mental well-being, or gender-based violence. Such risks may also stem from coordinated disinformation campaigns related to public health, or from online interface design that may stimulate behavioral addictions in recipients of the service.
In times of crisis, providers of very large online platforms may urgently need to take certain specific measures, in addition to the measures they take in light of their other obligations under this Regulation. In this context, a crisis should be considered to arise in exceptional circumstances that could seriously threaten public security or public health in the Union or significant parts of it. Such crises can result from armed conflicts, emerging conflicts, terrorist acts, natural disasters such as earthquakes and hurricanes, as well as pandemics and other serious international threats to public health. On the recommendation of the European Board for Digital Services (“the Board”), the Commission should be able to require providers of very large online platforms and providers of very large search engines to initiate a crisis response as a matter of urgency. Measures that those providers can identify and possibly apply may include, for example, adapting content moderation procedures and increasing the resources devoted to content moderation, adapting the general terms and conditions, relevant algorithmic systems and advertising systems, further intensifying cooperation with trusted flaggers, taking awareness measures, promoting reliable information and adapting the design of their online interfaces. The necessary rules should be provided to ensure that such measures are taken within a very short timeframe, that the crisis response mechanism is used only when and to the extent strictly necessary, and that all measures taken under this mechanism are effective and proportionate, taking due account of the rights and legitimate interests of all concerned.
Even without or before a crisis, the EU can impose crisis protocols that must be observed:
In addition to the crisis response mechanism for very large online platforms and very large online search engines, the Commission may take the initiative to draw up voluntary crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. This may be the case, for example, when online platforms are misused for the rapid dissemination of illegal content or disinformation, or when there is a need for the rapid dissemination of reliable information. In light of the important role of very large online platforms in the dissemination of information in our societies and across borders, providers of such platforms should be encouraged to draw up and apply specific crisis protocols. Such crisis protocols should only be activated for a limited period of time, and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance.
What does this mean in practice?
It means that the EU can qualify as a crisis any matter it chooses (a pandemic, the climate, a war …) and, even without any crisis, can and will order the very large online platforms and ditto search engines to remove or make invisible any information it considers illegal (read: any information that goes against a chosen narrative) and to ban the profiles that disseminate such information, if the platforms do not do so spontaneously (as Meta, for example, has done, spontaneously applying EU rules since October 2022, even though they were not yet enforceable at the time, permanently shadowbanning millions of profiles and blocking profiles every now and then).
Shitposting and blather posts that publicly ridicule certain individuals or situations are allowed, as long as they do not touch a specific narrative. Disseminating correct information that refutes a narrative or demonstrates its untruthfulness is not, because it is labeled “illegal”.
The very large online platforms and search engines must immediately comply with an EU order and take all appropriate measures, under penalty of heavy fines:
Article 9: Orders to act against illegal content
Upon receipt of an order to act against one or more specific items of illegal content, issued by the relevant national judicial or administrative authorities on the basis of applicable Union law or national law in compliance with Union law, intermediary service providers shall promptly inform the authority issuing the order, or any other authority specified in the order, of the action taken on the order, stating whether, and if so when, the order was complied with.
What’s more, if the very large online platforms and search engines obtain knowledge of individuals who commit criminal offenses or publish illegal content, they must immediately report this to the EU so that it can take immediate action (blacklist, ban on all platforms …):
Article 18: Notification of suspected criminal offenses
- A hosting service provider that becomes aware of information giving rise to a suspicion that a criminal offense threatening the life or safety of a person or persons has been committed, is being committed or is likely to be committed shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all available relevant information.
The very large online platforms and ditto online search engines MUST ban the profiles of users for a reasonable time. That reasonable time is not defined.
Article 23: Measures and protection against abuse
- Online platform providers shall suspend, for a reasonable period of time and after prior warning, the provision of their services to recipients who frequently provide manifestly illegal content.
- Online platform providers shall suspend, for a reasonable period of time and after prior warning, the processing of reports and complaints submitted through the reporting and action mechanisms and internal complaint-handling systems referred to in Articles 16 and 20 by persons, entities or complainants who frequently submit reports or complaints that are manifestly unfounded.
The Regulation imposes no general obligation to monitor or actively investigate facts, but in order to remain exempt from liability, providers must remove illegal content as soon as they become aware of it.
In order to qualify for the liability exemption for hosting services, the provider must, upon obtaining actual knowledge or awareness of illegal activities or illegal content, act quickly to remove that content or make it inaccessible.
Control by the EU
The EU provides for very strong control over very large online platforms and ditto search engines. Given the need to ensure control by independent experts, providers of very large online platforms and of very large online search engines should be accountable for compliance with the obligations laid down in this Regulation and, where appropriate, with any additional commitments they have made under codes of conduct and crisis protocols.
To ensure that controls are carried out in a timely, effective and efficient manner, providers of very large online platforms and of very large online search engines must provide the necessary cooperation and assistance to the organizations carrying out the checks, inter alia by giving the auditor access to all relevant data and premises necessary to carry out the audit properly, including, where appropriate, data related to algorithmic systems, and by answering oral or written questions.
Auditors should also be able to use other objective sources of information, such as studies by recognized researchers.
Providers of very large online platforms and of very large online search engines should not hinder the performance of controls. Checks should be conducted according to best practices in the sector and with high professional ethics and objectivity, taking due account, where appropriate, of control standards and codes of conduct.
The EU has also provided for a complaints system
Point 118: [of the preamble, ed.]
In order to effectively enforce the obligations laid down in this Regulation, persons or representative organizations should be able to lodge a complaint about compliance with those obligations with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on the allocation of competences and without prejudice to the applicable rules for the handling of complaints in accordance with national principles of good administration.
The EU is concentrating all powers in its own hands, which has the advantage that the very large online platforms and search engines know who their point of contact is.
Therefore, the Commission should have exclusive competence to supervise and enforce the additional obligations to manage systemic risks imposed by this Regulation on providers of very large online platforms and of very large online search engines.
This Regulation means the end of freedom of expression. No one will be able to freely express an opinion on matters that the EU has classified, or will classify, as illegal or harmful to public health or the climate, or in connection with a war or any other social crisis. Even if there is no crisis. Contradictory debate is now excluded. Criticism of policy is no longer possible. The EU determines what we can and cannot say.
But … Elon Musk is fully committed to safeguarding freedom of expression.