In the lead-up to the 2024 elections, social media giants like Meta, YouTube, and TikTok have prioritized measures to protect election integrity. However, concerns remain about WhatsApp, a messaging app whose global reach rivals that of public social media platforms.
Meta, which acquired WhatsApp in 2014, has predominantly focused its election-related safety efforts on Facebook. A Mozilla analysis found that Facebook has made 95 policy announcements concerning elections since 2016, while WhatsApp has made only 14. By comparison, Google and YouTube made 35 and 27 announcements, respectively.
Mozilla researchers urge Meta to make substantial changes to WhatsApp’s functionality during election periods. They propose adding disinformation labels to viral content, limiting broadcast and Communities features, and encouraging users to pause and reflect before forwarding messages.
In India, the platform’s susceptibility to misinformation has been underscored by a series of lynchings sparked by viral content. WhatsApp has introduced measures to curb the spread, including limits on forwarding and labels on forwarded messages. However, researchers argue that these measures may backfire, leading users to perceive highly forwarded content as more credible.
Mozilla’s demands stem from research conducted in Brazil, India, and Liberia, where political parties used WhatsApp’s broadcast feature heavily for targeted propaganda and, in some cases, hate speech. The platform’s end-to-end encryption makes it difficult to monitor how content circulates, although some researchers have managed to gather data from select WhatsApp groups.
Encryption, Mozilla argues, should not serve as an excuse for evading accountability. The organization stresses that the primary concern is the ease with which a small number of actors can influence much larger groups, potentially disrupting the electoral process.