Be careful who you trust, gentle internet user.
If Facebook’s Wednesday report on “Widely Viewed Content” is to be believed, the most-shared links appearing in people’s News Feeds during the second quarter of 2021 covered topics like football, hemp, charitable donations, and recipes. You may want to hold off on placing too much trust in Facebook, though.
The social media company reportedly buried a similar report that it prepared for 2021’s first three months, according to a Friday story from the New York Times. You can probably guess at the reason: Facebook wasn’t thrilled with the results!
The earlier report, which the Times reviewed in full alongside internal executive emails that were shared with the paper, found that the most-viewed link of Q1 was an article whose headline ascribed the death of a Florida doctor to a COVID-19 vaccine. (All of the COVID vaccines currently approved for emergency use in the U.S. are “safe and effective,” the Centers for Disease Control has stated.)
The same report also revealed that the Facebook page for a far-right-aligned website that has trafficked in conspiracy theories and misinformation was the 19th most popular page on the platform during the opening months of 2021. It’s enough to make a reasonable person think that maybe, just maybe, President Joe Biden had a point back in July.
The report was headed toward public release until executives stepped in, worried it would create a public relations nightmare. That group included Facebook chief marketing officer Alex Schultz; Facebook told the paper that Schultz initially supported the report’s release but eventually came around to holding it back, which is ultimately what happened.
Facebook spokesperson Andy Stone had a whole entire quiet part to say out loud to the Times in response: “We considered making the report public earlier, but since we knew the attention it would garner, exactly as we saw this week, there were fixes to the system we wanted to make.”
There’s no further explanation in the story of Facebook’s thinking here. Mashable reached out with follow-up questions, seeking clarification on exactly what “fixes” were needed, and what the company’s leaders think the decision to hold back an apparently unfavorable report says about Facebook’s stated efforts to be more transparent. The company hasn’t yet replied.
The Q2 report, meanwhile, has been widely panned for presenting what critics consider to be an inaccurate portrayal of what people are seeing on the platform. A report from The Washington Post characterized the Q2 report as being “part of a broader push by Facebook to block or discredit independent research about harmful content on its platform, offering its own carefully selected data and statistics instead.”
We saw one such situation unfold recently when Facebook moved to pull the plug on NYU’s Ad Observatory project, an effort that was conceived to monitor and analyze how politicians spend ad money on the platform. The NYU project employed a browser extension that the program’s participants, all of whom had to voluntarily opt in, could install to help gather data on ad placement.
Facebook threatened to shut the project down in the weeks ahead of the 2020 election in the U.S., and it eventually acted on those threats roughly halfway into 2021. The problem, as many critics pointed out, is Facebook leaned on flawed and easily undermined arguments to justify its decision.
“It’s like ExxonMobil releasing their own study on climate change,” a former Facebook employee told the Post in its story about the Q2 report. “It’s something to counter the independent research and media coverage that tells a different story.”
Facebook does indeed think very highly of this report, and how it reflects the company’s efforts to provide the world with an honest picture of what people are seeing most on the platform. As Guy Rosen, vice president of integrity, said in a statement provided for that Post story: “This is another step on a long journey we’ve undertaken to be, by far, the most transparent platform on the internet.”
That is a staggering sentence to read in the context of this new Times report, which makes clear that Facebook moved to hide an earlier, more damning report from public view and then adjust its processes, in ways that aren’t currently known, to ensure that future reports didn’t end up in the same spot.
In one sense, it’s hardly shocking or unexpected for a company, public or private, to place its own interests ahead of anyone else’s. But at the same time, it’s difficult to swallow Facebook’s flowery statements about transparency and serving the public good when, as we’ve now learned, it’s very obviously doing exactly the opposite.