WhatsApp Channels, Used by Half a Billion, Has No Clear Election Rules

A new platform launched by tech giant Meta and used by half a billion people has been running for more than eight months worldwide without any explicit election-disinformation policies — a lapse that lawmakers, disinformation experts and former Meta employees warn could pose grave risks in a year when nearly half the globe is casting major votes.
The platform, called WhatsApp Channels, turns WhatsApp from a private messaging and group chat service into a broadcast platform as well, letting users follow posts from public accounts. Although Meta has clear guardrails against voter interference, threats of violence and election misinformation on its other platforms, Facebook, Instagram and Threads, it has not imposed the same rules on WhatsApp Channels, relying instead on broader community guidelines.
Meta’s lack of clear election disinformation policies on the new public platform stands in contrast to the broadcast messaging services it launched on Facebook, Messenger and Instagram last year, which do have such rules in place. Critics say Meta isn’t applying its election integrity commitments evenly as nearly 2 billion people vote in more than 60 countries this year.
“Meta’s continued failure to implement explicit election misinformation policies on its public WhatsApp Channels threatens the integrity of democratic processes in the United States and across the globe,” said Rep. Adam Schiff (D-Calif.), the former chair of the House Intelligence Committee, adding, “It is absolutely imperative that Meta extends the same policies from its other platforms to WhatsApp Channels to prevent the spread of election-related falsehoods.”
A WhatsApp spokesperson said Channels’ community guidelines, which prohibit illegal, violent, fraudulent or deceitful content, would apply to posts promoting voter suppression. However, the guidelines don’t specifically mention voting or elections.
The spokesperson said the company doesn’t consider the WhatsApp Channels platform to be a social media service, like Facebook or Instagram.
“This is a one-to-many broadcasting service where you can privately follow and get updates from who you want to follow. There’s less of a social element than other platforms since followers can’t respond or comment, no one can see what Channels you follow or who else is following a certain Channel,” the spokesperson said.
Former Meta employees disputed the argument.
“For me, the demarcation is the ability to spread a message quickly to a large group of people,” said Katie Harbath, who led a global team managing Meta’s election policies until 2021 and is currently the chief global trust officer for data technology firm Duco. “And the fact that it's not encrypted and is more public in nature — that brings with it different risks.”
Researchers at Mozilla and University College London contacted by POLITICO in late May said they hadn’t found election misinformation on political candidates’ Channels, but warned that without clear guidelines, users could claim ignorance and misinformation could run rampant.
With few misinformation laws in place, and no federal regulations in the U.S., users rely on social media platforms’ enforcement of their policies to protect them from election falsehoods. Enforcement standards can vary across platforms, and even within a single company — as evident with WhatsApp.
More than 2 billion people in 180 countries use WhatsApp messaging, with the most users in Brazil and India. It has been a vector for the spread of misinformation in previous elections, particularly among non-English speakers. Fact checkers and journalists observed a wave of misinformation on it ahead of Zimbabwe’s 2023 elections. Spanish-language misinformation spread on WhatsApp messaging ahead of the 2020 U.S. elections.
The Channels feature rolled out in 150 countries last September, with users able to follow the posts of a wide range of public accounts. While WhatsApp messaging is encrypted, the new Channels feature is not. Anyone can create a Channels account to broadcast publicly, with no limit on the number of followers.
This is similar to Dubai-based messaging service Telegram, which has been a key spreader of disinformation around elections in the U.S. and Germany, and in conflicts like Russia’s invasion of Ukraine and the Israel-Hamas war.
WhatsApp Channels has already attracted leading politicians. Indian Prime Minister Narendra Modi's WhatsApp Channel has some 13 million followers. His BJP party reportedly spread fake news via private WhatsApp groups ahead of previous elections. Divij Joshi, a researcher focused on technology, politics and law at University College London, told POLITICO he had not found election misinformation on Modi’s Channel during the Indian leader’s successful recent campaign for reelection.
WhatsApp says it addresses misinformation by letting users contact 50 fact-checking organizations for election information and by limiting message forwarding to five chats at a time. The company said it can remove content and revoke accounts that violate its community guidelines.
But WhatsApp Channels’ lack of election guardrails contrasts with Meta’s other platforms: Facebook, Instagram and Threads have dedicated pages detailing policies against interference in elections, harm to election workers, threats of violence toward election workers or polling places, and election misinformation.
Tim Harper, an elections analyst at the Center for Democracy and Technology and a former Meta employee focused on election and political ad policies, said the lack of specific election protections on WhatsApp Channels “is a loophole that bad actors can and likely will exploit.”
“It is one of the largest online platforms in its own right,” Harper added. “Its policies should mirror Meta’s broader community standards, which have explicit policies to prevent election interference.”
Rep. Bennie Thompson (D-Miss.), the top Democrat on the House Homeland Security Committee, raised concerns about foreign influence operations spreading on WhatsApp Channels.
“I hope that Meta will ensure WhatsApp joins its other social media platforms in publishing and enforcing community standards that protect democratic institutions, especially the right to vote and have confidence in election results,” he told POLITICO in a statement.

Rep. Eric Swalwell (D-Calif.), the top Democrat on the House Homeland Security Committee’s cybersecurity panel, echoed those concerns in a statement: “Ideally, these protections will be projected upon WhatsApp.”
House Homeland Security Committee Chair Mark Green (R-Tenn.) referred POLITICO to House Energy and Commerce Committee Chair Cathy McMorris Rodgers (R-Wash.), who declined to comment on WhatsApp Channels. Rep. Andrew Garbarino (R-N.Y.), chair of the House Homeland Security subcommittee on cybersecurity, declined to comment.
Harbath, the former Meta employee, said it appears the company made a “conscious decision” not to extend its election misinformation policies to WhatsApp — which could impact upcoming U.S. elections given the reported double-digit growth of domestic WhatsApp users.
“They clearly made this choice, and there's going to be risks that come with not applying these policies, especially with WhatsApp becoming more popular in the United States,” she said.
Meta’s social media competitors — TikTok, X (formerly known as Twitter), and Google’s YouTube — all have policies prohibiting election misinformation and voter suppression.
Discord, a U.S.-based messaging platform with public channels that serves 170 million users, also has specific policies prohibiting election misinformation and voter suppression.
By contrast, some larger, overseas-based messaging services with broadcasting channels don’t have election misinformation policies. WeChat, the China-based messaging app with over 1 billion users, has no specific election rules but does have policies against fraud, illegal activities and misinformation that leads to harm. Telegram, with 500 million users, has no specific election or misinformation guidelines beyond banning illegal content, though it has been working to comply with the Digital Services Act in Europe.