
Jaw-dropping moments in WSJ’s bombshell Facebook investigation


By Allison Morrow, CNN Business

This week the Wall Street Journal released a series of scathing articles about Facebook, citing leaked internal documents that detail in remarkably frank terms how the company is not only well aware of its platforms’ negative effects on users but has also repeatedly failed to address them.

There’s a lot to unpack from the Journal’s investigation. But one thing that stands out is just how blatantly Facebook’s problems are documented, using the kind of simple, observational prose not often found in internal communications at multinational corporations.

Here are some of the more jaw-dropping moments from the Journal’s series.

‘We make body image issues worse…’

In its report on Instagram’s impact on teens, the Journal cites a slide deck from Facebook’s own researchers stating that the app harms mental health.

“We make body image issues worse for one in three teen girls,” said one slide from 2019, according to the WSJ.

Another reads: “Teens blame Instagram for increases in the rate of anxiety and depression … This reaction was unprompted and consistent across all groups.”

Those slides are particularly notable because Facebook has often referenced external studies, rather than its own researchers’ findings, in arguing that there’s little correlation between social media use and depression.

Karina Newton, head of public policy at Instagram, addressed the WSJ story Tuesday, saying that while Instagram can be a place where users have “negative experiences,” the app also gives a voice to marginalized people and helps friends and family stay connected. Newton said that Facebook’s internal research demonstrated the company’s commitment to “understanding complex and difficult issues young people may struggle with, and informs all the work we do to help those experiencing these issues.”

‘We are not actually doing what we say we do publicly’

Facebook CEO Mark Zuckerberg has repeatedly, publicly maintained that Facebook is a neutral platform that puts its billions of users on equal footing. But in another report on the company’s “whitelisting” practice — a policy that allows politicians, celebrities and other public figures to flout the platform’s rules — the WSJ found a 2019 internal review that called Facebook out for misrepresenting itself in public.

“We are not actually doing what we say we do publicly,” the review said, according to the paper. “Unlike the rest of our community, these people” — those on the whitelist — “can violate our standards without any consequences.”

Facebook spokesman Andy Stone told the Journal that criticism of the practice was fair, but that it “was designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.”

‘Misinformation, toxicity and violent content’

In 2018, Zuckerberg said a change in Facebook’s algorithm was intended to improve interactions among friends and family and reduce the amount of professionally produced content in users’ feeds. But according to the documents published by the Journal, staffers warned the change was having the opposite effect: Facebook was becoming an angrier place.

A team of data scientists put it bluntly: “Misinformation, toxicity and violent content are inordinately prevalent among reshares,” they said, according to the Journal’s report.

“Our approach has had unhealthy side effects on important slices of public content, such as politics and news,” the scientists wrote. “This is an increasing liability,” one of them wrote in a later memo cited by WSJ.

The following year, the problem persisted. One Facebook data scientist, according to the WSJ, wrote in an internal memo in 2019: “While the FB platform offers people the opportunity to connect, share and engage, an unfortunate side effect is that harmful and misinformative content can go viral, often before we can catch it and mitigate its effects.”

Lars Backstrom, a Facebook vice president of engineering, told the Journal in an interview that “like any optimization, there’s going to be some ways that it gets exploited or taken advantage of … That’s why we have an integrity team that is trying to track those down and figure out how to mitigate them as efficiently as possible.”

