It’s safe to say that Facebook’s latest effort to counter the idea that its platform helps to amplify misinformation and extremist ideas is not going exactly as planned.

As a recap – over the past year or so, New York Times journalist Kevin Roose has maintained this Twitter account, which lists the top ten best-performing Facebook posts, based on total engagement (Likes, comments and shares), each day.

The data is sourced via CrowdTangle, Facebook’s own analytics platform, and as exemplified here, the daily listing is regularly dominated by right-wing commentators and partisan news outlets, which gives weight to the notion that Facebook plays a significant role in amplifying such content. Add to this the fact that some 70% of Americans now get news content from The Social Network, and it paints the picture that Facebook is a key source of biased misinformation, and likely societal division based on the same.

Facebook, of course, is not happy about this characterization, and last July, it sought to dispel the idea by publishing its own counter report, which matched the data from Roose’s top 10 listing alongside its own insights on the links that saw the most reach within the same time period.

Facebook posts by reach

Facebook’s argument is that while Roose’s list may indicate that the people who engage with specific topics are passionate, and are therefore more likely to comment on and Like a post, it’s not representative of the most popular content on the platform, which it says is better indicated by the content that’s seen by the widest breadth of its users.

As you can see in this example, the most seen posts listing – which includes all content that appears in someone’s News Feed, whether they engage with it or not – is more balanced, with light-hearted content and general interest stories.

The debate over the relative impact and influence of each metric has raged ever since, with Facebook struggling internally to mitigate the negative perceptions stemming from Roose’s daily report, which has reportedly led to the company making big changes to its CrowdTangle platform as it seeks to re-frame the data the app provides.

And then, last week, Facebook released another counter-report, which once again focuses on the ‘Most Viewed’ content, and which Facebook says it will now update quarterly to provide more transparency as to what, exactly, is gaining traction across The Social Network.

Facebook most viewed links Q2

The report is confusing, for various reasons. For one, several of these links in the ‘Most Viewed’ listing for Q2 2021 (above) are essentially spam, which probably highlights another negative element for the platform. But the decision to share this data quarterly also dilutes the impact of news stories – which gain traction over a day, as opposed to three months – while Facebook’s additional data on the most widely viewed domains is also fairly vague.

Facebook most viewed domains Q2

So lots of referral traffic to Twitter – but to what tweets? Lots of YouTube links, but no insights into the actual content shared. Essentially, the framing of the report seems designed to re-shape the idea around what gets shared on Facebook, but it doesn’t provide enough definitive insight to show that these more highly seen posts are in fact more influential.

But then, late on Saturday, another element was added to the story. In response to another report from the New York Times, which revealed that Facebook had actually canned an earlier version of its new ‘Most Viewed’ report because the data looked bad for the company, Facebook’s Andy Stone shared the scrapped Q1 Most Viewed report, which shows, among other things, that an article which fueled anti-vaccination theories was the most viewed link in the first three months of the year.

The article in question, from The Chicago Tribune, is a report on how a doctor died just two weeks after getting the COVID-19 vaccine.

Chicago Tribune example

The first line of the updated report clarifies that there is no evidence linking the doctor’s death to the COVID vaccine. But still, you can imagine this headline would have helped fuel anti-vaxxers across the platform, and at 54 million views, that’s a significant amount of vaccine hesitancy potentially fueled by The Social Network.

Which is why Facebook chose not to publish this initial Q1 update back in April, and instead waited until now to publish the more favorable Q2 Most Viewed report.

As explained by Stone:

“On the question of the unreleased report from earlier this year and why we held it. We ended up holding it because there were key fixes to the system we wanted to make. When you consider the slight differences between the original Q1 report we didn’t release versus the Q2 report we did release earlier this week – we are beginning to make some progress. Hopefully, everyone will see even more progress in Q3.”

Whether that’s actual ‘progress’ or a re-angling of the data to make it even more favorable to Facebook is hard to say, but it seems increasingly clear that this is a dedicated PR effort with a defined goal in mind, as opposed to a raw data document designed to increase transparency, and help people better understand what’s actually generating interest on The Social Network.

Among the various issues with Facebook’s ‘Most Viewed’ data approach:

  • As noted, by reporting the most viewed posts by quarter, Facebook is diluting the potential impact of news reports, which are more likely to gain traction over a day or a week.
  • Of course, Facebook could counter that by noting that it’s also showing the most popular domains – so if content from, say, Breitbart was consistently generating traffic, it would show up here. This is true, but the lack of insight into which specific URLs are being shared in the domain report dilutes this claim. It’s also worth noting that the majority of the top domains in the Q1 report are news outlets (9/20), including Fox News, but that’s changed significantly in the Q2 report (5/10). Whether that’s a result of Facebook’s updated methodology, we don’t know.
  • Is a post more impactful, and influential, if a user sees it, or if they feel compelled to comment, Like or share it with their connections? I would argue the latter is a stronger indicator of engagement, and that would likely have a bigger influence over how people think. For example, if your friend shares an article and includes his/her own commentary about the vaccine causing dangerous side effects, that personal endorsement, based on your established relationship, is likely more impactful than you seeing that same post, without that friend’s comments, in your feed. In this sense, is ‘Most Viewed’ really a viable counter to actual engagement? 
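
The distinction at the heart of this dispute – ranking content by views versus ranking it by engagement – is easy to illustrate with a small sketch. The post names and numbers below are entirely made up for illustration, not real CrowdTangle data, but they show how the same set of posts can produce two very different "top" lists depending on the metric chosen:

```python
# Hypothetical post stats (illustrative only, not real Facebook data).
# "Engagement" here follows Roose's method: Likes + comments + shares.
posts = {
    "partisan_commentary": {"views": 5_000_000,  "likes": 400_000, "comments": 120_000, "shares": 250_000},
    "recipe_video":        {"views": 40_000_000, "likes": 90_000,  "comments": 8_000,   "shares": 30_000},
    "feel_good_story":     {"views": 25_000_000, "likes": 150_000, "comments": 20_000,  "shares": 40_000},
}

def engagement(stats):
    """Total engagement: Likes + comments + shares."""
    return stats["likes"] + stats["comments"] + stats["shares"]

# Rank the same posts two ways: by engagement (Roose's list)
# and by views (Facebook's 'Most Viewed' report).
by_engagement = sorted(posts, key=lambda p: engagement(posts[p]), reverse=True)
by_views = sorted(posts, key=lambda p: posts[p]["views"], reverse=True)

print(by_engagement[0])  # the partisan post tops the engagement ranking
print(by_views[0])       # the recipe video tops the views ranking
```

The point of the sketch is that neither list is "wrong" – they simply answer different questions, which is why each side can point to its preferred metric and claim the other is misleading.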

Basically, Facebook’s Most Viewed report raises more questions than it answers, and the insight value of the data is so clouded that it’s difficult to take much from it.

But, if you were to take Facebook’s Most Viewed insights as a real indicator of what’s popular on the platform, and what’s generating the most interest, here’s what it shows:

  • UNICEF posts appear in many Facebook user feeds, with 6 UNICEF posts listed in Facebook’s top 20 Most Viewed links listing for Q1 and 2 UNICEF posts shown in Facebook’s Q2 most viewed links report. But they don’t show up at all in Roose’s daily top 10 most engaged list. Why is that? Because Roose excludes UNICEF posts, as their numbers are artificially inflated by their inclusion in Facebook’s COVID-19 info panel. So lots and lots of people are inadvertently shown UNICEF posts because of the COVID info panel, but that doesn’t mean that anyone is actually clicking on them as a result.
  • It seems likely that several other of the most viewed links had inflated view counts due to the COVID info panel, with reports on school closures in India, Médecins Sans Frontières and online learning in the Philippines also seeing high exposure counts. If such posts are seeing increased views through Facebook’s info panels, which are essentially internal promotion surfaces, they should be excluded from Facebook’s most viewed listing.
  • Recipes are popular, with recipe sites taking up two of the most viewed link spots in Q1 and one in Q2
  • Both of the Most Viewed reports include one news article each on a missing child being found
  • Both reports include inspirational memes (2 in Q1, 1 in Q2)
  • ABC News and Yahoo.com appear to be key news sources, with their home pages appearing in both most viewed link reports
  • The Q2 Most Viewed links report includes a link to a hemp products store, one to a Christian street clothing store, a link to a website where you can buy Vietnam Veterans flags, a link to a Green Bay Packers alumni speakers bureau, and a link to an info page on the London Edge fashion show. Either these are incredibly popular on Facebook or people are spamming these links very heavily (evidence suggests the latter).
  • Right-wing news outlet The Epoch Times sees a lot of exposure on Facebook 

As you can see, in terms of content trends that might inform your Facebook approach, there’s not a lot to go on here, while the scope of links really just points to a lot of spam, which is not indicative of influence.

I mean, I struggle to believe that 37 million people on Facebook have been happy to see a link to this page appear in their feeds.

Facebook example page

But Facebook’s trying to say that these ‘most viewed’ pages show that it’s not all conspiracy theories and misinformation links – that the things people actually see in the app are more benign pages like this.

So nothing to see here, no concerns – Facebook’s not amplifying dangerous movements.

Based on the examples provided, I don’t feel like Facebook has done much to dispel the contention of Roose’s listings – though maybe, going by reports, the back and forth here could end up being enough for Facebook to reformat the data it provides via CrowdTangle, which, if anything, would just reduce transparency, and ensure more questions remain.

Will Facebook go that way in the end? Who knows, but right now, its efforts to counter the pervading narrative are not having the desired effect.  
