Most users in the United States use Facebook to post pictures of friends and family, promote parties and events, and sell secondhand items.
But in other parts of the world, the platform is fueling hate and violence — according to a new lawsuit filed in Kenya on Wednesday.
The suit comes amid growing criticism that Facebook and other social media giants are not doing enough to stop hate speech and inciting language from spreading online across Africa.
At the center of this recent lawsuit is Abrham Meareg, an Ethiopian researcher who said his father was killed last year after being targeted on social media during the Tigray conflict.
“I hold Facebook directly responsible for my father’s murder,” Meareg told The World from the United States, where he is currently seeking asylum.
His father, Professor Meareg Amare Abrha, was an accomplished academic who worked at Bahir Dar University in northern Ethiopia.
The professor had dedicated his life to research in chemistry and mentored generations of Ethiopian students.
“He was one of the most respected persons in the region, in the country,” said Meareg, who added that his father was the type of person who never failed to visit his own parents back in their village, or go out of his way to take care of sick relatives.
“He was a family man,” Meareg added. As an elder, the professor kept busy by writing articles and papers and occasionally browsing social media.
But his son Meareg knew of the power — and dangers — of the platforms.
Over the nearly two-year war in Ethiopia, he had seen Facebook become a channel for graphic photos of war victims, ethnic hate and genocidal language.
“I almost reported nearly 45 to 50 posts,” he said, but he recalled only one or two of them being removed.
Then in October 2021, Meareg saw Facebook posts that targeted his father — spreading ethnically charged content and inciting lies about the professor, an ethnic Tigrayan who was living in the Amhara region of the country.
Another post identified the neighborhood where the professor lived, and people called for his murder in the comments.
Despite Meareg’s numerous efforts to report the posts to Facebook, they remained online.
Within weeks, he said his father was attacked by a group of men and gunned down outside of his home in northern Ethiopia.
The killers left him to die in the street and, according to Meareg’s affidavit, taunted him with language similar to that in the Facebook posts.
A week after his father’s murder, Meareg said he received a notice from Facebook that one of the posts violated the company’s community standards and would be taken down.
But he said, as of early December — more than a year later — one of the posts remained online.
“The Facebook posts … it was considered a death sentence for our father,” he said.
“That’s why I’m taking Facebook to court.”
In the lawsuit, Meareg is joined by another Ethiopian petitioner, Fisseha Tekle of Amnesty International, who has been the target of online vitriol and hate speech that he said constitute a threat to his life.
“Both of my clients cannot go back to their country as it is right now. All arising from Facebook posts,” said Mercy Mutemi, their lawyer.
“What kind of a world are we living in when Facebook becomes the reason why we cannot enjoy our rights? That is the premise of the case we have made,” she continued, adding that Kenya is a suitable place for the lawsuit because it is the location of Facebook’s content moderation for the region.
At the heart of the problem, Mutemi says, is Facebook’s algorithm, which the lawsuit argues promotes and amplifies inciting and hateful content.
They also criticize Facebook’s investment in content moderation in the region as insufficient.
“The algorithm has to be fixed and Facebook has to seriously invest in content moderation,” Mutemi said. “We are not second-rate users. Africans deserve better.”
By choosing not to invest more in content moderation, Mutemi argues, Facebook is discriminating against African users.
The lawsuit, which is also joined by the Katiba Institute in Kenya, is calling for Facebook to make fundamental changes, and is asking for more than $1.6 billion to create a fund for victims.
“If you have been a victim of human rights violations leading [to] or stemming from either the algorithmic choices or the content moderation choices, you can come to court, prove damage and then you can be awarded damages from that fund,” Mutemi said.
In a statement emailed to The World, Facebook said it does “invest heavily in teams and technology to help us find and remove this content.”
This includes hiring “staff with local knowledge and expertise.”
However, the company did not confirm how many content moderators it employs in the region, or respond to specific questions about the case.
“I feel like Facebook doesn’t care about our lives,” Meareg said.
To date, his father’s body remains buried in an unmarked grave, as he and his remaining family members are unable to safely return home.
Correction: The original radio version of this story incorrectly stated that Facebook has a regional office in Nairobi, Kenya. While there is no physical office, at the time of airing, the platform had a presence in the city through its outsourced content moderators.