Will WhatsApp’s Misinfo Cure Work for Facebook Messenger?

On Thursday morning, Facebook announced several new policies to wrangle misinformation on its platforms ahead of the November election. Among them: limiting the number of people or groups you can forward a message to at one time on Messenger. For a glimpse of whether that might work—and how well—you need look no further than another Facebook-owned company: WhatsApp.

Restricting Messenger forwards is just one of several tools that Facebook has rolled out to combat misinfo, and it barely made an appearance in the company’s press release. But it’s also one of the only measures with an established track record, albeit an opaque one. More important, it’s one of the few steps Facebook can take without sparking accusations of political bias from either side.

In 2018, misinformation ran rampant on WhatsApp, and it was linked to deadly consequences in countries like India, where the messaging app is the de facto means of online communication. Because WhatsApp is end-to-end encrypted by default, the platform can’t see the contents of messages as they propagate through its ecosystem. But it could at least slow the spread. That July, WhatsApp reduced the number of chats you could forward a message to at once, from 256 to 20 for most people. In January 2019, it trimmed that number again, to 5.
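Notably, the cap operates on metadata rather than message content: the app only needs to count recipients, so the limit can be enforced without weakening end-to-end encryption. As a purely illustrative sketch (the names and the `chat.send` call below are hypothetical, not WhatsApp’s or Messenger’s actual code), a client-side check might look something like this:

```python
# Hypothetical sketch of a client-side forward cap; not real WhatsApp
# or Messenger code. Only the recipient count is inspected, never the
# message contents, so the check is compatible with end-to-end encryption.

FORWARD_LIMIT = 5  # WhatsApp's cap since January 2019; Messenger's new cap matches


def forward_message(message: bytes, chats: list) -> None:
    """Refuse to forward a message to more chats than the cap allows."""
    if len(chats) > FORWARD_LIMIT:
        raise ValueError(
            f"Messages can be forwarded to at most {FORWARD_LIMIT} chats at a time."
        )
    for chat in chats:
        chat.send(message)  # stand-in for the real (encrypted) send path
```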

That’s the playbook Facebook is emulating with Messenger, lopping the maximum number of forward recipients from 150 down to 5. “We've already implemented this in WhatsApp during sensitive periods,” Mark Zuckerberg wrote in a Facebook post outlining Thursday's changes, “and have found it to be an effective method …”
