Social media giant Meta has been accused of stoking racial tensions in the lead-up to the ongoing civil conflict in the Ethiopian region of Tigray.
Ethiopia, like many other Global South nations, has few institutional safeguards for online safety, and this, combined with a culture of lax journalism, has led to an environment rife with misinformation.
In the absence of established media structures, Meta platforms such as Facebook readily assert themselves as a primary source of information for citizens.
Speaking to a United States Senate subcommittee last year, former Meta data scientist Frances Haugen warned that engagement-driven algorithms can have a severely harmful impact on news media landscapes.
Haugen said that Meta was ‘literally fanning ethnic violence’ in Ethiopia.
‘My fear is that without action, divisive and extremist behaviours we see today are only the beginning,’ Haugen said.
‘What we saw in Myanmar and are now seeing in Ethiopia are only the beginning chapters of a story so terrifying no one wants to read the end of it.’
Speaking to NPR last year, Ethiopian journalist Zecharias Zelalem said that the documentation of events in Tigray was nearly impossible, with almost all information online being unreliable.
‘Just looking at the instances of documented evidence over the course of the past three years in which prominent Facebook posters would post unverified, often inflammatory posts or rhetoric that would then go on to incite mob violence, ethnic clashes, crackdowns on independent press or outspoken voices,’ Zelalem said to NPR.
Facebook’s ranking algorithm prioritises engagement, meaning posts with more comments and interactions are promoted in other users’ news feeds. Engagement, however, does not equate to accuracy or credibility.
Will Oremus wrote in the Washington Post that companies and organisations are actively competing for Facebook engagement.
‘The top post on a Facebook user’s news feed, shown as the biggest box, is a prized position based on thousands of data points related to the user and post itself, such as the poster, reactions and comments,’ Oremus wrote.
‘The details of its design determine what sorts of content thrive on the world’s largest social network, and what types languish — which in turn shapes the posts we all create, and the ways we interact on its platform.’
‘The algorithm is precisely tailored to each user but also reflects Facebook’s strategy to favour certain content or behaviour.’
Since 2018, Meta’s algorithm has prioritised posts that encourage interaction, such as those popular with friends. The change was intended to benefit posts from friends, family, and viral memes, but divisive interactions end up being promoted as well.
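The dynamic described above can be illustrated with a toy sketch. To be clear, the post fields, weights, and figures below are hypothetical and bear no relation to Meta’s actual ranking model; the sketch only shows why a feed sorted purely on interaction counts can surface an inflammatory rumour above a less-discussed report.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    reactions: int
    comments: int
    shares: int

# Hypothetical weights: comments and reshares count for more than
# passive reactions, so posts that provoke argument rise in the feed.
WEIGHTS = {"reactions": 1.0, "comments": 4.0, "shares": 8.0}

def engagement_score(post: Post) -> float:
    """Score a post purely by how much interaction it attracts."""
    return (post.reactions * WEIGHTS["reactions"]
            + post.comments * WEIGHTS["comments"]
            + post.shares * WEIGHTS["shares"])

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order a news feed by engagement alone; accuracy plays no part."""
    return sorted(posts, key=engagement_score, reverse=True)

# Invented example numbers: a contested rumour with many comments and
# shares outranks a widely read but little-discussed factual report.
posts = [
    Post("verified outlet", reactions=120, comments=5, shares=2),
    Post("inflammatory rumour", reactions=40, comments=90, shares=30),
]
top = rank_feed(posts)[0]
```

In this sketch the rumour scores 640 to the factual report’s 156, despite attracting fewer reactions, because argument in the comments and reshares dominate the score.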
In Ethiopia, and increasingly in other Global South nations such as Myanmar, this engagement-driven algorithm has led to the mass propagation of conspiracy theories and racist vitriol.
A 2021 study by the Harvard Kennedy School found two predominant campaigns attempting to shape the narrative of the Tigray conflict. The study noted that the case in Ethiopia is complex, intersecting with hate speech, trauma, and misinformation, as well as manipulation and propaganda.
One example of this war of narrative is the Facebook page Ethiopia Current Issues Fact Check, an account that does not promote independent fact-checking but rather publishes pro-government content.
Government loyalists also attacked a 2021 Amnesty International report detailing the killing of Tigrayan civilians in Axum. The report alleged that Eritrean soldiers had entered the country in 2020 to carry out the killings.
After initially denying the report, the Ethiopian government confirmed Eritrean troops had entered the country and carried out the attack.
The conflict in Tigray continues to cause civilian casualties today, although it has now largely become a guerrilla war, the long-term consequences of which remain to be seen.
Support for the war persists in Ethiopia largely because so little competing information reaches the public.