Brussels – The European Commission is getting serious, launching an investigation into TikTok for possible violations of the Digital Services Act. The news came today (February 19) in a post on X from Internal Market Commissioner Thierry Breton, just two days after the entry into full force of the new Digital Services Act. After a months-long preliminary investigation covering child protection, advertising transparency, data access for researchers, and the management of addiction risks and harmful content, the EU executive decided to put TikTok under scrutiny for possible sanctions under the DSA.
The EU Commission announced that the proceedings will focus on several areas, primarily compliance with the Digital Services Act’s obligations on the assessment and mitigation of “actual or foreseeable negative effects stemming from the design of TikTok’s system, including algorithmic systems, that may stimulate behavioral addictions and/or create so-called ‘rabbit hole effects’” (i.e., continuous and impulsive viewing of videos and content). “Such assessment is required to counter potential risks for the exercise of the fundamental right to the person’s physical and mental well-being, the respect of the rights of the child as well as its impact on radicalisation processes,” the Commission points out, referring to age verification tools – intended to prevent minors from accessing inappropriate content – that “may not be reasonable, proportionate and effective.”
With regard to compliance with the new digital services law, the EU executive is also concerned about a possible failure to take “appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of their recommender systems.” Preliminary investigations — based on the risk assessment report sent in September last year — also found deficiencies in setting up a “searchable and reliable repository for advertisements” presented on TikTok, in the platform’s transparency, and in researchers’ access to publicly accessible data (as required by Article 40 of the DSA).
After the formal initiation of the proceeding, the Commission will continue to gather evidence – for example, by sending additional requests for information and conducting interviews or inspections – and, in the meantime, may take further enforcement steps, such as interim measures and non-compliance decisions. At the same time, TikTok can propose commitments “to remedy the matters that are the subject of the proceeding,” which the Commission services may evaluate and accept. The ongoing investigation “is without prejudice to any other proceedings” on parallel issues that may constitute violations of the Digital Services Act, including those related to obligations on countering the dissemination of illegal content (terrorist content or online child sexual abuse material).
English version by the Translation Service of Withub