During the US election, posts containing disinformation received about six times more interactions than factual news stories. Facebook disagrees.
The forthcoming study, conducted by researchers from New York University and France's Université Grenoble Alpes, was made available to The Washington Post for review. It has already been peer-reviewed.
The researchers compared posts published between August 2020 and January 2021 by trusted sources, such as CNN and the WHO, with posts from a list of 2,551 pages belonging to organizations known for regularly spreading misinformation.
The latter received six times more likes, shares, and other interactions on Facebook than posts from the trusted sources. The study thus partly confirms earlier claims and research suggesting that posts containing disinformation are often "rewarded" with increased visibility on Facebook.
The researchers do note that Facebook does not appear to favor either political side: both left- and right-leaning posts containing disinformation received more engagement than factual news.
Right-wing pages on Facebook do, however, publish misleading information more often than the rest of the political spectrum, they note. This, too, has been established in earlier research, including studies on the 2018 midterm elections.
Facebook itself tells The Washington Post that it actively fights fake news, including by working with fact-checkers worldwide, though posts are often labelled as disinformation only days or weeks after publication.
The platform also points out that likes and comments do not reflect how many people actually read a post. "If you look at the content that gets the most reach on Facebook, it is not at all like what this study suggests," a company spokesperson told the newspaper.
It should be noted, however, that Facebook does not limit disinformation from politicians, and that the company releases very little data about how many people click through on posts, heavily filtering what little it does publish.
For example, the company recently released a transparency report, but it emerged that it had shelved an earlier report at the last minute. That earlier report, released later, includes a top twenty of most-viewed links. Because it covers a full three months, however, it says nothing about posts that are very popular for only a short time, and the top twenty accounts for just 0.057 percent of all content on Facebook, a tiny fraction of the total.
Facebook itself is also reluctant to share data with researchers. It even blocked scientists who were researching ads on Facebook, as it had done earlier with a European project. So on the one hand, Facebook says the study's claims do not correspond to reality, while on the other, it does everything it can to give researchers as little access to research data as possible.