Instagram is showing more 'eating disorder-adjacent' content to vulnerable teens, internal Meta research shows

* Teens who reported feeling bad about their bodies saw more 'eating disorder-adjacent' content
* Posts featured breasts, buttocks or thighs and "explicit judgments" about body types
* Meta said the research demonstrates its commitment to understanding its products

By Jeff Horwitz

NEW YORK, Oct 20 (Reuters) – Meta researchers found that teens who reported frequently feeling bad about their bodies saw substantially more Instagram "eating disorder-adjacent content" than those who did not, according to an internal document reviewed by Reuters.

The posts shown to those users featured "prominent display" of breasts, buttocks or thighs, "explicit judgments" about body types and "content related to disordered eating and/or negative body image." While such material is not banned on Instagram, the researchers noted that parents, teenagers and outside experts told Meta they believe it is potentially harmful to young users.

Over the course of the 2023-2024 academic year, Meta surveyed 1,149 teenagers about whether and how often they felt bad about their bodies after using Instagram, then manually sampled the content those users saw on the platform over a three-month period.

The study showed that for the 223 teenagers who said they often felt bad about their bodies after viewing Instagram, "eating disorder-related content" made up 10.5% of what they saw on the platform. Among the other teens in the study, such content made up only 3.3% of what they saw.

"Teens who reported frequent body dissatisfaction after viewing posts on Instagram … saw approximately three times more body-focused/ED-adjacent content than other teens," the authors wrote, in reference to eating disorders, according to a summary of the research reviewed exclusively by Reuters.
In addition to seeing more eating disorder-adjacent content, researchers found, the teens who reported the most negative feelings about themselves saw more provocative content broadly, which Meta classifies as "mature themes," "risk-taking behavior," "harm and cruelty" and "suffering." Cumulatively, such content accounted for 27% of what those teens saw on the platform, compared to 13.6% among peers who did not report negative feelings.

META RESEARCHERS EXPRESSED CONCERN

The researchers emphasized that their findings did not prove that Instagram makes users feel worse about their bodies. "It is not possible to determine the causal direction of these findings," they wrote, noting that teenagers who feel bad about themselves can actively seek out such material.

The Instagram research, a redacted replica of which is published here and has not been previously reported, showed that Instagram exposed teens "likely to report experiencing body dissatisfaction" to high doses of content that Meta's own advisors "expressed support for limiting," the write-up said.

In a statement, Meta spokesperson Andy Stone said the document reviewed by Reuters demonstrates Meta's commitment to understanding and improving its products. "This research is further evidence that we are committed to understanding young people's experiences and using those insights to build safer, more supportive platforms for teens," Stone said, noting the company's recent announcement that it will seek to show content to minors in accordance with PG-13 film standards.

In the study, Meta said its existing screening tools – designed to detect violations of platform rules – were unable to detect 98.5% of the "sensitive" content that the company believed might not be suitable for teenagers. The finding was "not necessarily surprising," the researchers wrote, because Meta had only recently begun building an algorithm to detect the potentially harmful content they investigated.
'CONNECTION' BETWEEN CONTENT AND 'FEELING WORSE ABOUT ONE'S BODY'

The study, which is labeled "Do not distribute internally or externally without permission," is the latest internal research to demonstrate an "association" between viewing fashion, beauty and fitness content and "reporting feeling worse about one's body," the document says.

The research was conducted as part of Meta's efforts to understand interactions between users and its products, which rely on algorithms to determine what content to show users. In the United States, the company has faced state and federal investigations into Instagram's effects on children, as well as civil lawsuits by school districts alleging misleading marketing of its platforms as safe for teens. Those suits prominently cited previously leaked internal research from Meta in which researchers expressed their belief that the platform's content recommendations may be harmful to youth with existing body image issues.

Since July of this year, Stone said, the company has cut in half the amount of age-restricted content shown to teenage Instagram users.

CONTENT WARRANTED A 'SENSITIVE' WARNING

Jenny Radesky, an associate professor of pediatrics at the University of Michigan who reviewed Meta's unreleased research at Reuters' request, called the study's methodology robust and its findings disturbing. "This supports the idea that teenagers with psychological vulnerabilities are being profiled by Instagram and fed more harmful content," Radesky said. "We know that a lot of what people consume on social media comes from the feed, not from search."

Previous internal research at Meta "showed a link between the frequency of reporting feeling worse about one's body" and the consumption of Instagram fitness and beauty content, the write-up reads.
Meta's researchers wrote that teens, parents, pediatricians, outside advisors and Meta's own Eating Disorders and Body Image Advisory Board urged Instagram to limit how much of that content it shows to teens, warning that it "could be detrimental to teen well-being, specifically by causing or exacerbating feelings of body dissatisfaction." Such content stops short of violating Meta's rules, which ban material that openly promotes eating disorders.

The report included examples of problematic content, including images of thin women in underwear and bikinis, fight videos and a drawing of a crying figure scrawled with phrases including "how can I ever compare" and "let it all end." One of the sample posts showed a close-up of a woman's lacerated neck. Although the content did not violate Meta's rules, the researchers found it disturbing enough that they issued a "sensitive content" warning to colleagues reading their work.

(Reporting by Jeff Horwitz in Oakland, Calif. Editing by Kenneth Li and Michael Learmonth)