Italian Authorities Fine TikTok Following Investigation into Consumer Safety of ‘French Scar’ Trend


The Italian competition and consumer authority, the AGCM, has imposed a fine of €10 million (almost $11 million) on TikTok following an investigation into algorithmic safety concerns and the spread of harmful content on the platform. In its decision, the AGCM wrote:

“The company has failed to implement appropriate mechanisms to monitor content published on the platform, particularly those that may threaten the safety of minors and vulnerable individuals.”

“Moreover, this content is systematically re-proposed to users as a result of their algorithmic profiling, stimulating an ever-increasing use of the social network.”

The probe was initiated last year after reports of a challenge called the “French scar”, in which users shared videos of marks on their faces made by pinching their skin. The AGCM said three regional companies in the ByteDance group (Ireland-based TikTok Technology Limited, TikTok Information Technologies UK Limited, and TikTok Italy Srl) were found to have engaged in “unfair commercial practice”.

“The investigation confirmed TikTok’s responsibility in disseminating potentially dangerous content, especially to minors and vulnerable individuals.”

“This content, such as videos related to the ‘French scar’ challenge, is not adequately monitored and measures to prevent its spread are insufficient.”

The AGCM also criticized TikTok for not complying with its own platform guidelines, noting that they are applied without considering the specific vulnerability of adolescents. It highlighted the potential negative impact on young people, whose brains are still developing, and who may be influenced by peer pressure and the desire to fit in. The authority specifically called out TikTok’s recommendation system, which uses algorithmic profiling to determine what content is shown to users.

TikTok Disputes Fine

Responding to the penalty, TikTok released a statement downplaying the algorithmic risks posed to minors and vulnerable individuals:

“The so-called ‘French Scar’ content averaged just 100 daily searches in Italy before the AGCM’s announcement last year. We have since restricted its visibility to users under 18 and made it ineligible for the ‘For You’ feed.”

While the enforcement action was limited to Italy, the European Commission is responsible for overseeing TikTok’s compliance with the algorithmic accountability and transparency provisions of the pan-European Digital Services Act (DSA). Non-compliance with the DSA can result in penalties of up to 6% of global annual turnover. TikTok, designated a “very large online platform” under the regulation, has been required to comply with these provisions since late summer last year.

One change TikTok has already made as a result of the DSA is the introduction of feeds that are not based on profiling. However, these alternative feeds are not the default setting, meaning users remain subject to AI-based tracking and profiling unless they actively opt out.

Last month, the EU opened a formal investigation into TikTok over concerns about addictive design, harmful content, and the protection of minors. That investigation is ongoing. TikTok has said it looks forward to explaining its approach to safeguarding minors to the Commission.

Previous Troubles for TikTok

TikTok has faced concerns about child safety from regional enforcers before. Italy’s data protection authority has previously intervened over child safeguarding on the platform, and last year TikTok was fined €345 million by Ireland’s Data Protection Commission over failures in how it handled minors’ data. Consumer protection groups have also voiced concerns about the potential dangers of profiling for minors and vulnerable individuals.

In addition, Member State level agencies may impose regulations under the EU’s Audiovisual Media Services Directive. For example, Ireland’s Coimisiún na Meán is considering rules that would require algorithmic recommender systems to be turned off by default on video sharing platforms like TikTok.

The situation is also tense for TikTok in the United States, where lawmakers have proposed a bill to ban the platform unless it cuts ties with its Chinese parent company, ByteDance. National security concerns have been raised, along with fears of potential manipulation of Americans through the platform’s tracking and profiling capabilities.
