YouTube will now limit repeated recommendations of videos that idealize certain body standards to teens, who the platform said are susceptible to forming negative beliefs about themselves after repeated viewing.
The company’s Youth and Families Advisory Committee, a team of independent experts in child development, digital learning, children’s media and more, helped YouTube identify categories of content that may be harmless as a single video but could be problematic for some teens if viewed repeatedly.
That includes content that compares physical features and idealizes some types over others, idealizes specific fitness levels or body weights, or displays social aggression in the form of non-contact fights and intimidation.
Community Guidelines will continue to be enforced to remove content and prevent minors from seeing videos that violate policies on child safety, eating disorders, hate speech and harassment, according to a company announcement.
“A higher frequency of content that idealizes unhealthy standards or behaviors can emphasize potentially problematic messages,” Allison Briscoe-Smith, a clinician, researcher and member of the Youth and Families Advisory Committee, said in a press release. “Those messages can impact how some teens see themselves. Guardrails can help teens maintain healthy patterns as they naturally compare themselves to others and size up how they want to show up in the world.”
Big tech companies, including YouTube, have come under fire in recent years for permitting the proliferation of content that a growing body of evidence links to worse teen mental health. President Biden has urged Congress to hold social platforms accountable. Some public schools are suing TikTok, Instagram, Facebook, Snapchat and YouTube, accusing them of increasing students’ anxiety and depression, and hundreds of similar lawsuits have been filed around the country alleging the same.
In its latest announcement, YouTube is also updating a few existing products to make them more relevant for teens. Its Take a Break and Bedtime reminders, which have existed since 2018, will be more visually prominent and appear more frequently; the reminders are on by default for minors’ accounts.
Crisis resource panels will also be expanded into a new full-page experience to help viewers explore help topics when they search YouTube for subjects related to suicide, self-harm and eating disorders. Viewers of all ages will more prominently see resources for third-party crisis hotlines, along with suggested prompts that steer search queries toward topics like “self-compassion” or “grounding exercises.”
YouTube is also working with the World Health Organization and Common Sense Networks to develop public, industry-wide resources related to teens and online well-being, including new educational materials for parents and teens about content creation online. Separately, YouTube is supporting the WHO and the British Medical Journal on a report intended to help creators produce age-appropriate mental health resources and offer tips on enriching mental health content. The report is expected to be published in early 2024.