
TikTok Under EU Investigation for Child Protection


The European Commission opened formal proceedings against TikTok under the Digital Services Act on Monday (19 February) due to possible breaches in several areas, including child protection.

The Digital Services Act (DSA), which became fully applicable on 17 February for all platforms operating in the EU, is a horizontal piece of legislation regulating how online actors must deal with illegal and harmful content.

The decision follows a preliminary investigation based on a risk assessment report that TikTok sent to the Commission in September last year, as well as on the Commission's formal Requests for Information on illegal content, the protection of minors, and data access.

As European Commissioner for Internal Market Thierry Breton also pointed out in a post on X, the investigation will focus on possible breaches of the transparency and minor-protection obligations, and within that, on addictive design, screen time limits, the rabbit hole effect, age verification, and default privacy settings.

“TikTok has pioneered features and settings to protect teens and keep under 13s off the platform, issues the whole industry is grappling with,” a TikTok spokesperson told Euractiv.

“We’ll continue to work with experts and industry to keep young people on TikTok safe, and look forward to now having the opportunity to explain this work in detail to the Commission,” the spokesperson added.

The Commission noted that compliance with the DSA includes assessing and mitigating systemic risks; however, existing mitigation measures, like TikTok’s age verification tools, may not be deemed reasonable, proportionate, or fully effective in addressing these concerns.

The Commission also points out that the DSA requirements involve implementing suitable and balanced measures to guarantee minors’ privacy, safety, and security, including default privacy settings tailored for minors within the design and functioning of their recommender systems. Ensuring adherence to the regulation’s requirements also entails establishing a searchable and dependable repository for advertisements featured on TikTok.

TikTok’s efforts to enhance platform transparency are also under scrutiny. The inquiry focuses on potential deficiencies in granting researchers access to the platform’s publicly available data, which is also required under the DSA.

Addictive design

Social media platforms have long been criticised for being addictive by design, built to keep people on them for as long as possible. One such feature is the rabbit hole effect, now mentioned by Breton: algorithms show users more of a specific type of content the more they interact with it, for example by liking it, or simply by spending more time looking at one type of content than another.

On 12 December last year, the European Parliament adopted with a broad majority the initiative to make digital platforms less addictive at its plenary session in Strasbourg.

Child protection

Under the DSA's rules, online platforms with more than 45 million monthly users in the EU are considered to pose a 'systemic risk' for society; hence, they must follow a specific content moderation regime, including transparency and risk management obligations.

The EU executive announced the first batch of very large online platforms (VLOPs) and very large online search engines (VLOSEs) in April last year, a list that was updated at the end of December.

The list includes social media platforms like Meta’s Instagram and Facebook, TikTok and X.

Already in the December update of the list, when three pornography platforms, XVideos, Pornhub, and Stripchat, were added, both Breton and Executive Vice-President in charge of competition Margrethe Vestager emphasised the protection of minors as one of the priorities under the DSA.

Now, according to a document seen by Reuters, Breton said, “The protection of minors is a top enforcement priority for the DSA.”

“As a platform that reaches millions of children and teenagers, TikTok must fully comply with the DSA and has a particular role to play in the protection of minors online,” the Commissioner added.

Next steps

The Commission will now carry out an in-depth investigation by continuing to gather evidence that could prove that TikTok committed infringements under the DSA.

By opening a formal investigation, the Commission gains further enforcement powers, such as imposing interim measures and adopting non-compliance decisions, or accepting remedies if TikTok offers them.

The length of such investigations depends on several factors, and the Commission has set no deadline to wrap up the proceedings.

Meanwhile, on 7 February, Meta and TikTok also confirmed they are suing the European Commission over an annual supervisory fee that companies listed under the DSA must pay.
