In a report published Tuesday, the French branch of the NGO Amnesty International denounced the "spiral" effect of TikTok's algorithm, which it accuses of amplifying users' exposure to content related to suicide and self-harm, and announced that it was referring the matter to Arcom, France's audiovisual and digital regulator.
"Amnesty International France has decided to refer the matter to Arcom to file a complaint under the DSA (Digital Services Act, the European regulation on digital services, editor's note) against TikTok for failing to meet its obligations," Katia Roux, advocacy officer for Amnesty International France, told the press.
According to the NGO's findings, adolescents "showing an interest in content relating to sadness or psychological distress" are directed towards "depressive content" within an hour.
After publishing a report on the social network's algorithm in 2023, Amnesty conducted new experiments in France.
This report "provides new evidence of how TikTok exposes young people on its platform to content that can be harmful, that can normalize, trivialize, or even idealize depression, self-harm or suicide," Roux said.
Contacted by AFP, the social network said that "without taking into account how real people use TikTok, this 'experiment' was designed to achieve a predetermined result."
– "Suicidal thoughts" –
The NGO created three fake profiles of 13-year-olds on TikTok and had them scroll through the personalized feed, dubbed "For You," to watch several hours of content evoking "sadness or mental health issues."
"Within 15 to 20 minutes of the experiment beginning, all three feeds contained almost exclusively mental health videos, with up to half featuring sad and depressive content. On two of the three accounts, videos expressing suicidal thoughts appeared within 45 minutes," the report said.
Twelve automated accounts were then created in partnership with the Algorithmic Transparency Institute, seeded with the viewing history of the first three accounts.
The NGO noted an increase in mental health content on these accounts as well, though less pronounced than on the manually operated ones.

"TikTok has not taken adequate measures to identify and prevent the risks to which the platform exposes young people," Roux said, pointing to a failure to comply with the obligations the DSA has imposed since August 2023.
"We truly hope that (...) this new evidence will be examined and taken into account to advance the investigation opened by the European Commission," she added.
In February 2024, the European Commission opened an investigation into TikTok for alleged failures to protect minors.
TikTok said it is "proactively providing a safe and age-appropriate experience for teens." "Nine out of ten videos that violate our rules are removed before they are even viewed," the platform insisted.