
South Africa moves to counter digital disinformation in African elections


In the lead-up to South Africa’s national and provincial elections in May 2024, the country’s Electoral Commission has partnered with technology giants including TikTok, Google and Meta to tackle misinformation and disinformation ahead of the polls. Africa, along with most of the world, has begun to actively combat disinformation in the political sphere. Although the problem of disinformation seems endemic to digital platforms, people are now so deeply embedded in the digital commons that simply retreating from it is not a viable solution.

A disinformation problem

The sentiment shared in 2017 by Kenyan opposition leader Raila Odinga sums up the disinformation problem, as well as the dominant view that it is eroding democratic functions. Odinga said: “Hawa viongozi ni vifaranga vya kuku, vifaranga vya kompyuta, kompyuta ndio iliwataga, kompyuta iliangulia, kompyuta ikawatoa … vifaranga vya kompyuta”. This translates roughly as: “These leaders are chicks, chicks of computers; computers laid them as eggs, computers incubated them, computers hatched them … chicks of computers, computer-generated leaders”.

A 2018 study titled ‘How Africa Tweets’ showed that during the Kenyan elections held the previous year, bots accounted for a quarter of influential voices. Bots made up the highest share of influencers in both the initial presidential election in August and the re-run in October, accounting for 26% of all influential users taking part in Twitter conversations – 25% for the August election and 28% for the October polls. By comparison, politicians and political parties made up less than 10% of the accounts driving online electoral conversations, while accounts run by journalists and news organisations made up a third of the influential accounts tweeting at the time.

In Nigeria’s 2019 elections, the similarly widespread use of social media saw the proliferation of misinformation campaigns, inflammatory content and fake news circulating on platforms such as X (formerly known as Twitter), Facebook and WhatsApp. High- and low-profile examples of disinformation bloomed during this electoral period, notably including a widely circulated story that Nigeria’s President Muhammadu Buhari had died while undergoing medical treatment outside the country and been replaced by a Sudanese clone. Although this stands out as one of the more grandiose examples, a survey of over 1,000 people in Abuja, Oyo and Kano during the election showed that many citizens were exposed to and often shared fabricated communications – especially on WhatsApp.

From South Africa’s African National Congress (ANC) running a R50 million “black ops room” to push out disinformation during the 2016 local elections, to Zimbabwe’s own struggles with disinformation and misinformation about the ruling ZANU-PF party and the opposition Citizens Coalition for Change (CCC), the problem recurs across the continent. Misinformation has become endemic on social media platforms, and its proliferation has harmed the everyday functioning of democratic processes. It has begun to erode trust just as Africa is beginning to embrace the digital anthropocene.

Africa’s disinformation problem is largely driven by automated social media accounts, or bots. Bots mimic human behaviour on social platforms, automatically producing messages, posting and interacting with other (human and automated) accounts through likes, comments and follows. The fabrication of information by human agents is another driver of disinformation, with such content mostly shared through direct messaging platforms like WhatsApp. Bots and other fake news mechanisms serve primarily to agitate. They push negative narratives about divisive issues such as race, migration and ethnicity, about election candidates, and about alleged electoral irregularities. This influences political behaviour by misleading citizens and distorting the information on which they ought to base rational and well-informed electoral choices.

Pathways to addressing political disinformation

Inquiries into the state of disinformation in Africa have questioned how disinformation has been characterised on the continent and what countries can do to create a healthier and more transparent information landscape. I argue that in a world so deeply embedded in socio-technological platforms, governments cannot work in isolation. First, they need to involve tech companies in content regulation. Second, they need to strengthen their partnerships with civil society on the margins to bolster digital literacy.

South Africa’s approach of combatting disinformation alongside social media companies in the run-up to the elections offers several opportunities for meaningful gains. Although the interests of tech companies and governments do not always converge, in the wake of the Facebook-related Cambridge Analytica scandal social media companies have become more incentivised by public pressure to improve transparency and data protection. A country like South Africa, with almost 26 million social media users, does well to tap into this incentive to advance its own needs. The partnership with Meta through the Voluntary Framework of Cooperation to Address Disinformation in the 2024 National and Provincial Elections in South Africa, for example, will see the establishment of a South Africa-specific Elections Operations Center. This centre will be dedicated to identifying potential threats in real time and will work with local independent fact-checking organisations to review and rate content, including assessing whether content has been created or edited by tools like generative artificial intelligence. Debunked information will carry warning labels to guide readers, and its distribution will be reduced by the algorithms that curate most of what users see on their feeds.

The strength of this approach is that, despite its limitations, it bypasses the biggest psychological barrier to spotting false information: when human brains encounter new information, they tend to focus on understanding it rather than evaluating its accuracy. Labelling therefore attacks disinformation on its own ground by pre-evaluating content for accuracy, and it gives social media users more context about the political information they are interacting with.

However, this approach also places too much agency and responsibility in the hands of social media platforms and the tech giants that run and regulate them. Further, if one considers that for-profit social platforms may not always want to deter inflammatory discourse, since it keeps users engaged, then the approach also invokes the old adage that the master’s tools cannot dismantle the master’s house; at best they yield temporary concessions. It follows that those invested in democracy and its proper functioning also need to invest in mass digital literacy initiatives. The internet, and social media platforms in particular, are a deep and expansive ocean that requires distinct diving tools. Africa’s 384 million social media users need critical digital literacy to evaluate political context, not just content with a label on it. Ultimately, three core competencies need to be achieved: knowledge of the practical use of the internet, knowledge of how the information users come across is generated and distributed, and an understanding of the internet’s potential for civil society.

Partnerships with civil society on the margins can bolster digital literacy in communities that may be more vulnerable to disinformation campaigns. South African projects like the Lesedi Learning Bot, developed for WhatsApp to offer basic digital literacy micro-lessons to youth, and the Kitso Learning Bot, which teaches the fundamentals of online safety, as well as the KALAAN initiative in the Ivory Coast and my own experience teaching digital literacy in the context of digital democracy and policy advocacy, have shown the significant impact digital literacy can have on efforts to combat disinformation. Governments can partner with existing initiatives to give them a greater pool of resources and expand their reach. When strategically positioned to reach those who may not encounter digital education through conventional institutions, such partnerships can achieve grassroots sensitisation about the complex environment of the digital commons and empower citizens to actively evaluate political content.

As generative artificial intelligence tools become more sophisticated and as local and international contests for power continue, both approaches need to be wielded to address the proliferation of political disinformation and the breakdown of trust that follows. With at least 19 countries heading to the polls in 2024, partnerships with social media companies and investment in digital literacy programmes may offer redress for our political disinformation pain.

