Most of us at Northwestern use Meta’s products every day: Facebook, Instagram, WhatsApp and perhaps even virtual reality through the company’s Oculus headsets. Alarmingly, we know very little about what happens inside this massive conglomerate.

Whistleblower Frances Haugen, a product manager (the person responsible for identifying customer needs for a product) who spent two years at Meta, recently spoke out against the company and tried to show what went on at Facebook behind closed doors. For months, she copied tens of thousands of pages of Facebook’s internal research after noticing unethical behavior at the company. In her 60 Minutes interview, Haugen said, “[morality] was substantially worse at Facebook than anything I’d seen before.”

After leaving Facebook/Meta in May 2021, Haugen found an avenue to release the evidence she had collected in The Wall Street Journal. Journalist Jeff Horwitz published the findings in an investigative exposé, the Facebook Files. She then filed a whistleblower claim with the Securities and Exchange Commission and testified before Congress on Oct. 5. The documents revealed a wide range of issues that Facebook had deliberately ignored, from teen mental health problems to the company’s intentional amplification of political polarization in an unethical bid to increase profits.

The daunting stack of tens of thousands of documents revealed how many problems Facebook knew its apps were causing. As a massive company, Facebook has many research teams that test its products in different ways to better understand them. The Facebook Files showed the extent of that research and how little the company did with the information.

The leaks revealed the shocking extent to which Facebook understood the harm Instagram caused teen girls. A presentation posted to Facebook’s internal message board stated that “Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.” Another document from the leak shows that more than one in five adolescent girls said Instagram typically made them feel worse about themselves.

This confirms that Facebook understands what public research has been saying about social media for years. A 2018 paper published by The Lancet found a significant correlation between daily social media use and rates of self-image issues among adolescent girls in the UK.

Despite all this data showing the depths of harm done to young women, Meta CEO Mark Zuckerberg was still planning to make an Instagram-like app for children under 13 before the leaks surfaced. He even testified to Congress that “there is clearly a large number of people under the age of 13 who would want to use a service like Instagram,” while knowing the harm the app causes.

The story is the same with Facebook’s increasingly polarized content. In 2018, the service changed its algorithm to promote content that drives user interaction, comments in particular. The Facebook algorithm is kept behind closed doors, meaning that no one could truly know what the changes meant or how they would manifest. When the changes were first announced, the Guardian reported that the change “would wreak havoc on the media ecosystem” worldwide. And it was correct.

The exposé shows that several European countries had written to Facebook about the 2018 algorithm change’s negative impact on public discourse. Poland sent a letter claiming that “the changes made political debate on the platform nastier.” Facebook researchers themselves noticed that one Spanish political party shifted its messaging from 50% negative to 80% negative to keep up with the changed algorithm. The U.S., Taiwan, Spain and India were all found to have had significant political shifts stemming from the 2018 algorithm change.

In another internal Facebook study, the company created a fake account named Carol. The fake account was a self-described “conservative mom” who liked several posts from Ivanka Trump and other mainstream Republican figures. Within just five days of the account’s creation, ‘Carol’ was being recommended extremist content, from QAnon stories to the white nationalist conspiracy theory of “white genocide,” among others.

This internal memo, titled “Carol’s Journey into QAnon,” was written in the summer of 2019, a year and a half before the QAnon movement culminated in the Jan. 6 riot at the U.S. Capitol.

Why is this harm ignored?

Short answer: MSI. Long answer: meaningful social interactions. MSI is a metric that measures how many interactions a user has with a given app in a given time frame. MSI gains points from Facebook’s preferred activities, such as ‘liking,’ ‘commenting’ and ‘sharing.’ The metric launched in 2018 alongside the aforementioned algorithm changes. Increased MSI translates directly into increased profit because of Facebook’s advertising system, which rewards longer user visits.

With an understanding of Meta’s profit system, we can begin to see why Facebook would not rush to stop harming teen girls or to curb polarization. “What’s super tragic is that Facebook’s own research says as these young women begin to consume this ‘eating disorder content,’ they get more and more depressed and actually makes them use the app more,” Haugen told 60 Minutes. Meta has no incentive to improve the mental health of its teenage users; in fact, the opposite is true. It needs as many depressed users as possible to drive up MSI and profits.

According to the Facebook Files, when told that Facebook should reduce the amount of content concerning fashion, beauty and relationships to improve users’ mental health, one employee responded, “Getting a peek at ‘the (very photogenic) life of the top 0.1%’? Isn’t that the reason why teens are on the platform?” All of this reveals a drastic divide between users’ intention to use the platform to interact with their peers and Meta’s vision of the platform as a window into the photogenic life of the top 0.1%.

Similarly, as political polarization and anger increase in a population, so too does MSI. In a 2012 paper, researchers Jonah Berger and Katherine L. Milkman found that anger is the emotion most effective at eliciting virality, while low-arousal emotions such as sadness are among the least likely to make content spread.

This quirk of human behavior is what drives Facebook’s incentive to amplify anger and polarization on its platform. As explained in CGP Grey’s prophetic 2015 YouTube video, “This Video Will Make You Angry,” when people get angrier, more articles are shared, fueling more anger and creating more charged content in a dangerous positive feedback loop.

As we spend less time interacting in person and more of our social time online, we grow lonelier. This is consistent with the findings of many studies, including “Less in-person social interaction with peers” by Jean M. Twenge and others, which found a strong correlation between the rise of the internet and social media and our collective loneliness. As psychologist Susan Pinker said, “Face-to-face contact releases a whole cascade of neurotransmitters and, like a vaccine, they protect you now, in the present, and well into the future.”

Moving forward

Zuckerberg responded to the exposé on a Facebook earnings call: “What we are seeing is a coordinated effort to selectively use leaked documents to paint a false picture of our company.” Still, the data in the leaked documents paints a clear picture of a company that purposefully amplified anger on its platform in order to drive up MSI and profits.

Fortunately for us, regulating this massive tech company is one of the few things Republicans and Democrats seem to agree on. Sen. Richard Blumenthal (D-CT) said in a press conference, “Every part of the country has the harms that are inflicted by Facebook and Instagram.”

With the increased transparency the Facebook Files have brought, we are better able to see the problems facing us, and, hopefully, lawmakers will know which areas require protective legislation.

Article thumbnail: “Facebook logo” by Simon Steinberger, licensed for use via Pixabay.