A chilling new report has revealed that YouTube’s recommendation algorithm may be driving young girls toward harmful content glorifying weight loss and eating disorders. The platform, which boasts over 2 billion monthly users, is under scrutiny for promoting videos that perpetuate unhealthy body image ideals and dangerous behaviors.
Anna’s Story: A Disturbing Trend
In 2020, Anna Mockel was a 14-year-old spending countless hours on YouTube. At first she explored all kinds of content, but the algorithm soon began flooding her feed with videos of extremely thin girls discussing weight loss. She became fixated on these videos, many of which glorified eating only 500 calories a day, far below the roughly 2,200 calories recommended for adolescent girls.
Anna’s exposure to this content led her to adopt restrictive diets, eventually resulting in a diagnosis of anorexia. “I didn’t realize it was a problem,” Anna said. “So many others were doing it. Social media, especially YouTube, made it seem normal.” She blames the platform for normalizing eating disorders and leaving her vulnerable to their pull.
The Report’s Alarming Findings
To test this, researchers at the Center for Countering Digital Hate (CCDH) created a profile simulating a 13-year-old girl, watched weight loss and diet-related content from it, and then analyzed over 1,000 videos YouTube recommended in response. The results were staggering:
- **638 videos encouraged weight loss and eating disorders.**
- **344 videos contained harmful content.**
- **50 videos referenced self-harm or suicide.**
These videos also reach unusually large audiences, averaging 344,000 views each, nearly 60 times the platform-wide average. Shockingly, many are monetized, carrying ads from prominent brands like Grammarly, Nike, and T-Mobile, raising questions about advertisers' awareness and accountability.
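For readers who want the figures as proportions, here is a minimal arithmetic sketch in Python. It is ours, not CCDH's: it assumes a round 1,000-video sample, uses illustrative variable names, and simply restates the counts above as percentages while backing out the platform-wide average view count implied by the "nearly 60 times" comparison. The three categories appear to overlap, so the shares need not sum to 100%.

```python
# Illustrative arithmetic only, not CCDH's methodology.
# Assumes all three counts come from the same (round) 1,000-video sample.

SAMPLE_SIZE = 1_000  # recommendations analyzed (assumed round figure)

categories = {
    "weight loss / eating disorder": 638,
    "harmful content": 344,
    "self-harm / suicide references": 50,
}

# Each category as a share of the sample; categories may overlap.
for label, count in categories.items():
    print(f"{label}: {count}/{SAMPLE_SIZE} = {count / SAMPLE_SIZE:.1%}")

avg_views_flagged = 344_000   # average views on the flagged videos
multiple_vs_platform = 60     # "nearly 60 times" the platform average

# Backs out the platform-wide average the article's comparison implies.
print(f"implied platform average: ~{avg_views_flagged / multiple_vs_platform:,.0f} views")
```

On these assumptions, weight loss and eating disorder videos account for roughly two in three recommendations served to the simulated teen account, and the implied platform-wide average of about 5,700 views underscores how unusually viral the flagged videos are.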
Violating Policies, Ignoring Consequences
Despite Google’s policies against promoting harmful content, YouTube’s algorithms continue to recommend videos that glorify disordered eating and self-harm. Lawsuits against the platform highlight its role in spreading content detrimental to mental health.
When questioned, YouTube declined to comment directly but stated that it collaborates with mental health experts to refine its recommendations for teens. Although some harmful keywords have been blocked, researchers found that content promoting eating disorders still appears in YouTube’s “Up Next” panel.
A Double-Edged Sword
While YouTube has faced criticism for its role in amplifying harmful content, the platform also holds potential as a positive resource. By curating credible health advice, expert-led discussions, and mental health support, YouTube could become a tool for spreading awareness and combating misinformation.
However, this depends on users’ ability to critically evaluate sources and avoid algorithm-driven traps. Platforms like YouTube must prioritize ethical algorithm design to ensure vulnerable users are protected from harmful content.
The Call for Action
The findings underscore the urgent need for stricter regulation of social media algorithms, greater transparency from tech companies, and accountability from advertisers funding harmful content. For YouTube, the challenge lies in balancing user engagement with responsible content promotion.
Until that balance is struck, young users and their families must remain vigilant, educating themselves about the risks of algorithm-driven recommendations and fostering healthier, more informed online habits.