A new working paper from Stanford and NYU researchers suggests that Facebook's efforts to keep misinformation and fake news out of its users' feeds appear to be working. The overall magnitude of the fake news problem may have lessened, at least temporarily, and Facebook's efforts after the 2016 election to control the "diffusion of misinformation may have had a meaningful response," says the paper.
Facebook having more success than Twitter in preventing misinformation
The researchers compared the spread of stories from legitimate sites with the spread of stories from fake news sites and found that engagements with legitimate sites have remained comparatively stable over time. This suggests that the decline is specific to fake news sites rather than a platform-wide drop. On Twitter, meanwhile, engagements with fake news sites have continued to increase.
The researchers wrote that from the beginning of 2015 until the 2016 election, fake news interactions rose steadily on both platforms. Following the election, Facebook engagements fell sharply, by almost 60%, while fake news shares on Twitter continued to increase.
For the study, the authors assembled a list of around 570 sites that have been identified as publishers of false stories. Alongside these fake news sites, they measured engagements for several comparison groups of publishers: business and culture sites, small mainstream news sites, and major mainstream news sites.
They wrote: "Both platforms show a modest upward trend for major news and small news sites, and a modest downward trend for business and culture sites." As for trends in the ratio of Facebook engagements to Twitter shares, the ratios have been comparatively stable for small news, major news, and business and culture sites, but for fake news sites the ratio has declined sharply, from over "45:1 during the election to around 15:1 two years later."
Quantity of fake news interactions still large on both Facebook and Twitter
The working paper found that the absolute quantity of fake news interactions on both social networks is still significant. Facebook in particular, simply because it is so much larger, has played the greater role in the diffusion of misinformation. The study stated that Facebook engagements declined from roughly 200 million per month at the end of 2016 to around 70 million per month at the end of the study's sample period. This is an encouraging result and suggests that the tech giant, contrary to what many believe, is actively working to limit misinformation and fake news.
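To put those reported figures in perspective, a quick back-of-the-envelope calculation (using the approximate numbers cited above, not the paper's underlying data) shows the size of the decline:

```python
# Illustrative arithmetic based on the approximate figures reported in this
# article; the values are round numbers, not the paper's exact data.

def pct_decline(before: float, after: float) -> float:
    """Percentage drop from `before` to `after`."""
    return (before - after) / before * 100

# Facebook fake news engagements, in millions per month
fb_before, fb_after = 200, 70
print(f"Facebook engagement decline: {pct_decline(fb_before, fb_after):.0f}%")  # 65%

# Ratio of Facebook engagements to Twitter shares for fake news sites
ratio_election, ratio_later = 45, 15
print(f"Ratio fell by a factor of {ratio_election / ratio_later:.0f}")  # 3
```

By these rough numbers, monthly fake news engagement on Facebook fell about 65%, and Facebook's relative weight versus Twitter in spreading fake news shrank roughly threefold.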
Since the end of 2016, fake news shares on Twitter have been in the range of 4-6 million per month, compared with roughly 20 million per month for major news sites.
In sum, Facebook's efforts to reduce fake news shares appear to be paying off, but a long road still remains ahead.