Italy Fines TikTok €10 Million for “French Scar” Challenge and Algorithmic Safety Concerns
Italy’s competition and consumer authority, the AGCM, has imposed a €10 million fine on TikTok following an investigation into algorithmic safety concerns and the spread of harmful content on the platform. Announcing the penalty, the authority said:
“The company has failed to implement appropriate mechanisms to monitor content published on the platform, particularly those that may threaten the safety of minors and vulnerable individuals.”
“Moreover, this content is systematically re-proposed to users as a result of their algorithmic profiling, stimulating an ever-increasing use of the social network.”
The probe was initiated last year after reports of a challenge called the “French scar”, in which users shared videos of marks on their faces made by pinching their skin. The AGCM found that three companies in the ByteDance group, Ireland-based TikTok Technology Limited, TikTok Information Technologies UK Limited, and TikTok Italy Srl, had engaged in an “unfair commercial practice”.
“The investigation confirmed TikTok’s responsibility in disseminating potentially dangerous content, especially to minors and vulnerable individuals.”
“This content, such as videos related to the ‘French scar’ challenge, is not adequately monitored and measures to prevent its spread are insufficient.”
The AGCM also criticized TikTok for failing to comply with its own platform guidelines, noting that the guidelines are applied without considering the specific vulnerability of adolescents. It highlighted the potential negative impact on young people, whose brains are still developing and who may be influenced by peer pressure and the desire to fit in. The authority singled out TikTok’s recommendation system, which uses algorithmic profiling to determine what content is shown to users.
TikTok Disputes Fine
Responding to the penalty, TikTok released a statement downplaying the algorithmic risks posed to minors and vulnerable individuals:
“The so-called ‘French Scar’ content averaged just 100 daily searches in Italy before the AGCM’s announcement last year. We have since restricted its visibility to users under 18 and made it ineligible for the ‘For You’ feed.”
While the enforcement action was limited to Italy, the European Commission is responsible for overseeing TikTok’s compliance with the algorithmic accountability and transparency provisions of the pan-European Digital Services Act (DSA), where non-compliance can result in penalties of up to 6% of global annual turnover. TikTok, designated a “very large online platform” under the DSA, has been required to comply with these rules since late last summer.
One change TikTok has made as a result of the DSA is the introduction of feeds that are not based on profiling. However, these alternative feeds are not the default setting, meaning users remain subject to AI-based tracking and profiling unless they actively opt out.
Last month, the EU opened a formal investigation into TikTok over concerns about addictive design, harmful content, and the protection of minors. That investigation is ongoing. The platform has said it looks forward to explaining its approach to safeguarding minors to the Commission.
Previous Troubles for TikTok
TikTok has faced previous child safety concerns from regional enforcers. Last year, the Italian data protection authority issued a child safeguarding intervention, and the platform was separately fined €345 million by Ireland’s Data Protection Commission for data protection failures related to minors. Consumer protection groups have also raised concerns about the potential dangers of profiling for minors and vulnerable individuals.
In addition, agencies at the Member State level can impose rules under the EU’s Audiovisual Media Services Directive. For example, Ireland’s Coimisiún na Meán is considering rules that would require algorithmic recommender systems to be turned off by default on video-sharing platforms like TikTok.
The situation is also tense for TikTok in the United States, where lawmakers have proposed a bill to ban the platform unless it cuts ties with its Chinese parent company, ByteDance. National security concerns have been raised, along with fears of potential manipulation of Americans through the platform’s tracking and profiling capabilities.