October 5, 2021

Social Media and Mental Health


KEY TAKEAWAYS

  • Social media use can lead to severe mental health problems, including sleep disorders, depression, and suicide.
  • Internal research conducted by Facebook shows that its Instagram product may be particularly harmful, especially for young people.
  • Senators from both parties have expressed concern with the harmful effects of social media on mental health, and the Commerce Committee is investigating Facebook’s research.

Social media has transformed many people’s lives in profound ways. Some of its effects have been positive, such as allowing friends and family to stay connected, increasing access to information, and enabling people to exercise their free-speech rights and participate in public discourse. But there are dark sides to social media as well. The damage that social media can do to people’s mental health, particularly teenagers, is increasingly a concern for parents and policymakers. Research has shown that using social media can lead to sleep disorders, depression, and suicidal thoughts and actions.

An addiction machine

Social media companies use artificial intelligence to determine people’s interests and desires, and then they feed users content that fulfills those desires. Experts say this can be particularly problematic for adolescents, who may lack the self-discipline and maturity needed to stop watching the content.

Research has suggested that some people experience addiction to social media in ways that are similar to addiction to drugs and other substances. According to one British study, these include “neglect of personal life, mental preoccupation, escapism, mood modifying experiences, tolerance, and concealing the addictive behavior.” People who stop using social media can also appear to suffer psychological symptoms of withdrawal, a common occurrence for drug and other addictions.

One study found that excessive use of social media, particularly features like “likes” and “comments,” can trigger the release of dopamine, sometimes called the “pleasure chemical,” in a manner similar to opioids or cocaine. Studies have also found that scrolling through a Facebook feed can produce reactions similar to those experienced through cocaine use or gambling. This addiction can have severe consequences. A decade-long study found that as the time teen girls spend on social media goes up, so does their long-term risk of suicide.

One former Facebook executive, who quit the company and doesn’t allow his children to use social media, has said, “the short-term, dopamine-driven feedback loops that we have created are destroying how society works.”

Instagram in the spotlight

Internal company documents reported by the Wall Street Journal show Facebook has been aware since at least 2018 of the harmful effects of some of its products, including the photo and video sharing app Instagram.

[Infographic: Instagram by the Numbers]

Facebook’s research found that the nature and design of Instagram can be especially harmful to mental health. Users tend to share only the best moments of their lives, so a teenager watching other people’s content can get a false sense that everyone else’s life is perfect. One teen in a Facebook focus group reportedly told the company, “After looking through photos on Instagram, I feel like I am too big and not pretty enough. It makes me feel insecure about my body even though I know I am skinny.” If a teen uses the site to search for workouts, she may be bombarded ever after with photos of what her body should look like. This can lead young people to develop eating disorders and depression.

Senator Richard Blumenthal’s staff set up an Instagram account that posed as a 13-year-old girl to test this proposition. They found that after following “easily findable” accounts associated with eating disorders, “within a day” Instagram’s algorithm began to serve up content promoting eating disorders and self-harm. As Facebook’s research stated, “Aspects of Instagram exacerbate each other to create a perfect storm.”


Facebook’s internal research also found that teens often were aware of the negative effects of Instagram, and wanted to spend less time on the platform, but may have been unable to do so. A psychology professor quoted in the Wall Street Journal likened the harm to “clinical-level depression.” The documents also quoted a researcher who reported, “Teens told us that they don’t like the amount of time they spend on the app but feel like they have to be present. They often feel ‘addicted’ and know that what they’re seeing is bad for their mental health but feel unable to stop themselves.”


Some have likened Facebook’s failure to disclose what it knew about the harmful effect of its platforms on mental health for teens to Big Tobacco hiding what it knew about the harmful and addictive nature of cigarettes. As one researcher quoted in the Journal put it, “If you believe that R.J. Reynolds should have been more truthful about the link between smoking and lung cancer, then you should probably believe that Facebook should be more upfront about links to depression among teen girls.”

These issues intersect with the Federal Trade Commission’s antitrust lawsuit against Facebook. In antitrust law, consumer welfare is an important factor in determining whether a monopoly is illegal. A benign monopoly does not violate antitrust law. But Facebook’s defense of its market share could be compromised if courts found that consumers are not benefitting from, or are even being harmed by, what Facebook does with its power.

Facebook has published multiple blog posts in response to the Journal’s reporting. The first, entitled “What the Wall Street Journal Got Wrong,” claims the stories contain “deliberate mischaracterizations” and “conferred egregiously false motives to Facebook’s leadership and employees.” Another post focuses on steps Facebook has taken to improve safety and security on its platforms, including efforts to combat misinformation and tools to help users track and manage how much time they spend on Facebook. A third post contends that, “contrary to The Wall Street Journal’s characterization, Instagram’s research shows that on 11 of 12 well-being issues, teenage girls who said they struggled with those difficult issues also said that Instagram made them better rather than worse.”

Facebook has also published slide decks that much of the Journal’s reporting was based on, along with “annotations” to each slide. The Journal has also posted additional slide decks, including one entitled “Teen Girls Body Image and Social Comparison on Instagram.” The deck shows that Facebook’s internal study found “social comparison is worse on Instagram” than on other social media platforms like TikTok or Snapchat, and “66% of teen girls on Instagram experience negative social comparison.”

On September 27, Facebook announced it was “pausing” its plans to build “Instagram Kids” in order to “work with parents, experts, policymakers, and regulators, to listen to their concerns, and to demonstrate the value and importance of this project for younger teens online today.”

Congressional Action

Senators Blumenthal and Marsha Blackburn, the chairman and ranking member of the Commerce Committee’s Subcommittee on Consumer Protection, Product Safety, and Data Security, have launched an investigation into Facebook’s research. They wrote: “It is clear that Facebook is incapable of holding itself accountable. … When given the opportunity to come clean to us about their knowledge of Instagram’s impact on young users, Facebook provided evasive answers that were misleading and covered up clear evidence of significant harm.”

The subcommittee held a hearing on September 30 entitled “Protecting Kids Online: Facebook, Instagram, and Mental Health Harms.” Senators from both sides of the aisle were highly critical of Facebook. Senator Blackburn asked Facebook to disclose all of its internal research. The company refused, claiming that some of the research was not relevant to the committee’s investigation.

The subcommittee held a follow-up hearing on October 5 with testimony from Frances Haugen, the whistleblower who provided the Journal with the internal Facebook research. Ms. Haugen is a former Facebook product manager who worked on the civic integrity team in the runup to the 2020 election. She advocated for outside researchers to be able to examine Facebook’s internal research and for the company to move from “engagement based rankings” to a simpler, less algorithm-driven system to display content, such as chronologically. She also urged Congress to create an oversight body for social media companies.      

At a Judiciary subcommittee hearing on September 21, Senator Josh Hawley questioned Facebook’s vice president of privacy and public policy about the safety of platforms like Instagram for teenage users. The company declined to release its internal research to Congress and, when asked if the platforms were safe for teenagers, said, “we’re working really hard to make that the case.” At the hearing, Senator Mike Lee criticized the company, saying its behavior “displays a reckless disregard for its consumers.”

A House subcommittee held a hearing on September 28 examining outside researchers’ ability to access and analyze data from social media companies. Last year, citing privacy concerns, Facebook revoked access to its platform from researchers at New York University. One of the NYU researchers testified at the hearing and urged Congress “to ensure that researchers, journalists, and the public have access to the data we need to study online misinformation and build real solutions.”

Issue Tags: Technology, Health Care