Elections 2024: Navigating communication, transparency, trust in era of online propaganda

False news articles, virtually identical to authentic content, rapidly circulate, fostering mistrust

By Khawaja Burhan Uddin

More than half of the world's population will have the chance to vote in 2024, a year that also marks the arrival of one of the most disruptive technologies yet: generative artificial intelligence (AI). If it becomes a tool of manipulation in the hands of unscrupulous media and social media actors, the very foundation of free and fair elections will be shaken forever.

Over the coming year, more than 75 countries and blocs are expected to hold 83 national elections, including Pakistan, India, Bangladesh, the European Union, and the United States, according to the US-based think tank Atlantic Council.

With so many elections, it will be imperative that not only politicians but also social media platforms ensure that the integrity of the polls remains intact, so that disputes over results do not lead to post-election violence or protests.

With the stakes higher than ever, experts' fears are justified: as a major part of the world heads to the polls, it will be hard for the global community to oversee the entire process, and hard for the platforms to guarantee that there is no manipulation.

In a panel discussion hosted by the Atlantic Council, titled “Elections everywhere all at once”, activists and experts emphasised that access to data and the communication channels between civil society, political representatives, and the platforms must remain open and robust.

Phumzile Van Damme, Technology and Human Rights Fellow at the Carr Center for Human Rights Policy, voiced fears that with so many elections taking place at once, attention will be scattered and the kind of advocacy and activism the global community usually provides will be missing.

2024 is a culmination of trends that are designed to erode trust in institutions — Spencer

Brazilian journalist Patrícia Campos Mello, a reporter at large and columnist at Folha de São Paulo, said that electoral denialism would spread, just as it did during the January 2023 riots in Brazil and the US Capitol riots in 2021.

Kay Spencer, program director of elections at the Washington-based National Democratic Institute (NDI), said 2024 is a big year for social media platforms and that it will be important to monitor how they manage all the elections that are coming up.

“2024 is a culmination of trends that are designed to erode trust in institutions and I think what’s key here is to keep an eye on the smaller elections as well as the big ones. What we have seen in the past is that we know authoritarians use these smaller elections as a testing ground for tactics to use in bigger elections later on,” she noted.

New battlegrounds

Mello said the techniques of electioneering are changing, and so is the landscape.

Giving the example of Brazil, she said in the 2018 polls, there were “lots” of WhatsApp groups spreading misinformation. Then in the 2022 presidential elections, TikTok and Telegram became “big actors”.

However, she said it was important to note that it was not only the battlegrounds that were shifting from one platform to another; the companies' content moderation policies were also being upgraded over time. Those policies, she said, were not implemented across the board.

“In Brazil, we can notice that they are not as strict as the US,” she said, emphasising the need for enforcement of regulations in other nations as well.

“Because many countries, like Brazil, don’t have regulation laws, so it’s very dangerous to rely on courts. It’s very dangerous what one judge is going to decide and just try to remove content,” she said.

Civil society’s role

In light of the difficulties that political actors and ordinary people face during these elections due to misinformation, Spencer said civil society needs timely and affordable access to platforms' APIs, especially during polls, to be able to monitor the information environment.

“For decades, civil society has been able to build trust in institutions and credible elections by conducting observation missions that help them accurately assess the process of an election,” she explained.

Spencer maintained that civil society can independently verify the process and official results of an election, and in doing so, help instil confidence in election management bodies.

“These practices have been going on for decades, they can help detect manipulation and deter it, and they can help build confidence in elections by posting credible process and verification results,” the NDI official said.

Spencer also noted that observation missions have the potential to reduce post-election violence by providing independent evidence that supports or refutes the claims that official results have been manipulated.

She shared that in Bulgaria's first post-Communist elections in 1990, the incumbent party won despite the opposition being sure of victory, which ultimately led to protests on the streets.

I would really encourage the platforms to have an open dialogue with political candidates  — Van Damme

However, independent observers confirmed that the opposition party had indeed lost and had not been cheated.

“With that news, the protesters returned. I think it’s important for platforms to understand that it is a way to protect election integrity,” she said, noting that the increased surveillance of election observers is also causing problems.

Van Damme said that, as an activist, she found the relationship between civil society, policymakers, and the platforms to be very acrimonious.

“It’s one where there’s distrust on both sides and where there isn’t a conversation about what the challenges are for platforms and the political candidates. So I would really encourage the platforms to have an open dialogue with political candidates about what their challenges are.”

Online attacks

The panellists agreed that online attacks push women in particular to the sidelines and lead them to decide against joining politics. They also noted that the younger generation does not want to continue the rhetoric and old ways of politics.

Van Damme, a former lawmaker and a researcher, said she was first elected to the South African parliament in 2014 when “social media was big, but not as big”.

“In 2019, I experienced how difficult it is to be a candidate online, particularly as a woman,” the former parliamentarian said, noting that politicians have to contend with a barrage of hate, rape threats, and deepfakes.

Many women are not participating in politics because of online attacks — Mello

“I think beyond just the mental health consequences it has for candidates, I think it’s kind of dissuading a lot of young people, women, from wanting to be in politics, and we need more young people, more young women,” she said, calling on the platforms to do more.

Mello said X, formerly Twitter, “totally excludes and intimidates several voices online” as was witnessed in the US, Brazil, India, and the Philippines.

“...they were particularly bad [...] in terms of targeting women politicians, journalists, and activists,” she said, terming it one of the “most toxic environments in an election”.

The journalist said although there were some hotlines set up, they weren’t enough to stop the attacks against women.

“Many, many women are not participating in civic life because of this.”

AI — the new playground

The panellists stressed that people should not “demonise” AI, noting that the underlying problems remain the same: it is the same political actors using the tools at their disposal to spread misinformation.

“We shouldn’t demonise it [AI] like it’s the only problem. We need regulations to hold social media platforms accountable,” Van Damme said.

She added that it is more about acknowledging that the old problems still exist, and about letting people know that AI-generated content is also a form of disinformation.

“AI-generated audio would be notoriously difficult [to debunk]. On social media, you can see images and point out [that they are fake], but when it comes to audio it’s difficult,” the ex-parliamentarian said.

Mello agreed that there’s a tendency to demonise AI.

“If you think about it, the main issue might be political actors who are weaponising social media,” she said. “If they aren’t using generative AI for this, [they will use something else and] it’s still the same problem.”

What can platforms do?

Van Damme, moving the conversation forward, said that many civil society organisations struggle to communicate the problem because of the inaccessibility of data. She added that the Elon Musk-headed X has become more restrictive, leaving researchers without the accurate information needed to quantify the issue.

“That’s certainly something that needs to be addressed,” she said.

Spencer noted that platforms should make sure their trusted partner networks are clearly established, so that the people they work with really know how to reach them under any circumstances.

Mello said one of the things that might be important is to keep the communication channels between electoral authorities and platforms clear.

“I mean this is not trying to exert pressure or anything,” she clarified.

“It’s like: we have an emergency, there’s this deep fake video going viral or something. We need to have a communication channel so we have a rapid response,” she said.


— Khawaja Burhan Uddin is a staffer at Geo.tv