Can Facebook Be Held Responsible for Violence in Africa? Kenya’s Courts Say Yes

Ethiopian plaintiffs take on Meta in a landmark Kenyan lawsuit that could redefine Big Tech's accountability for fueling violence in Africa.

Meta logo displayed on a laptop screen and Facebook logo displayed on a phone screen

Photo by Jakub Porzycki/NurPhoto via Getty Images

On November 3, 2021, Professor Meareg Amare was killed outside his home in northern Ethiopia. In the weeks leading up to his murder, Facebook posts shared his photo, name, address, and false claims linking him to a rebel group. The content stayed online for weeks. His son, Abrham Meareg, believes the delay cost his father his life. "I knew it would be a death sentence for my father," Abrham said.

Now, Abrham is one of the lead petitioners in a groundbreaking lawsuit against Meta, the parent company of Facebook. Filed in Kenya's High Court, the case accuses Meta of contributing to ethnic violence by allowing and even amplifying hate-filled content during Ethiopia's conflict from November 2020 to November 2022.

The court recently ruled that it has the authority to hear the case. This decision could have global implications for how people in some of the world's most vulnerable regions hold Big Tech companies accountable for harm caused by their platforms.

For the first time, an African court has asserted its power to judge a multinational tech firm over human rights violations. The case challenges the idea that Africa is just a market for Big Tech, with no real power to demand accountability.


"[Facebook] is designed to amplify content which people will engage with a lot, and it doesn't matter what that content is," Alia Al Ghussain, researcher and adviser on Big Tech accountability at Amnesty International, tells OkayAfrica. "Quite often, that content will be harmful, potentially hateful, and potentially discriminatory. In conflict-affected settings like Ethiopia, there's a much higher risk that that content will go viral."

At the center of the lawsuit is Facebook's algorithm. The system prioritizes content that drives engagement, no matter what it is. That attention fuels Meta's advertising revenue. The petitioners argue that this design helped spread hate speech targeting the Tigrayan community during the war in northern Ethiopia.

The case seeks not only a ruling on Meta's role in fueling the violence but also calls for real change, including reforms to Facebook's algorithm and the creation of a victims' fund of 250 billion Kenyan shillings (about $2 billion).

Despite multiple warnings about the risks in Ethiopia, Meta failed to act. Internal company documents dating back to 2012 show Meta knew its platform could lead to real-world violence. A 2016 internal report admitted that "our recommendation systems grow the problem" of extremism.

Alongside Abrham, Amnesty International's legal adviser Fisseha Tekle is also a petitioner. Tekle, a human rights defender, became a target of online hate on Facebook for his work documenting abuses in Ethiopia. Now living in Kenya, Tekle says he cannot return to Ethiopia for fear of his safety.

"In Ethiopia, the people rely on social media for news and information. Because of the hate and disinformation on Facebook, human rights defenders have also become targets of threats and vitriol. I saw first-hand how the dynamics on Facebook harmed my own human rights work and hope this case will redress the imbalance," Tekle says.


Alongside Meareg and Tekle, the Katiba Institute is also a party to the case; the institute did not respond to several requests for comment from OkayAfrica.


Facebook's failures in Ethiopia mirror those in Myanmar, where Amnesty International found that Facebook played a significant role in inciting violence against the Rohingya. Al Ghussain says the company is repeating the same mistakes because its fundamental business model has not changed.

"When you prioritize engagement above everything else and your algorithms are designed this way, it's difficult to see how this won't repeat itself. This kind of risk is baked into the business model."

The case also highlights deep disparities in how Meta responds to crises depending on the location. While the company has deployed safety protocols in North America and Europe, such as the measures it put in place around the January 6 riots in the US, it has not done the same in Africa.

Amnesty also raised concerns about whether Meta acted on alerts from its moderators covering Ethiopia. "Ethiopian moderators were raising concerns internally about what they saw on the platform," Al Ghussain says. "The company did not respond meaningfully to those warnings."

The legal action takes on even greater significance because it is happening in Kenya, where Facebook's content moderation for East Africa is based. With the court's ruling, the case will now be assigned to a panel of judges by Kenya's Chief Justice. Meta has asked for permission to appeal and continues to seek dismissal.

For advocates like Al Ghussain, the case signals something even more significant: "It could make justice much more accessible, not just for the litigants in this case, but potentially for other communities impacted by Meta's operations and potentially other companies."
