Kenya: Meta sued for 1.5 billion euros for fueling ethnic violence in Ethiopia

Meta must reform its business practices to ensure that Facebook’s algorithms stop amplifying hate and fueling ethnic strife, Amnesty International said on December 14, 2022, as an unprecedented legal action was filed against Meta in the High Court of Kenya.

The distribution of dangerous content on Facebook is central to Meta’s quest for profit

Flavia Mwangovya, Deputy Director for East Africa, the Horn of Africa and the Great Lakes at Amnesty International

The plaintiffs claim that Meta promoted speech that led to ethnic violence and killings in Ethiopia through an algorithm that prioritizes and recommends hateful and violent content on Facebook. The plaintiffs are asking that Facebook’s algorithms stop recommending this type of content to users, and that Meta create a victims’ fund of up to 1.5 billion euros. The individual plaintiffs are represented by Mercy Mutemi of Nzili and Sumbi Advocates, with support from Foxglove, a nonprofit specializing in tech justice. Amnesty International joins six other legal and human rights organizations as interested parties in the case.

“The distribution of dangerous content on Facebook is central to Meta’s quest for profit, as its systems are designed to keep people on the platform. This legal action is an important step towards holding Meta accountable for its harmful business model,” said Flavia Mwangovya, Deputy Director for East Africa, the Horn of Africa and the Great Lakes at Amnesty International.

One of Amnesty International’s staff members in the region was targeted in a series of posts on the social media platform.

“In Ethiopia, people rely on social media for news and information. Due to the hate and misinformation on Facebook, human rights defenders have also become targets of threats and hate campaigns. I have seen first-hand how the dynamics on Facebook have harmed my human rights work, and I hope this case helps to redress the balance,” said Fisseha Tekle, Legal Adviser at Amnesty International.

Fisseha Tekle is one of the claimants taking the case to court, after facing a barrage of hate messages on Facebook for his work exposing human rights abuses in Ethiopia. An Ethiopian national, he now lives in Kenya, fearing for his life and not daring to return to Ethiopia to see his family because of the avalanche of vitriolic messages targeting him on Facebook.

Due to the hate and misinformation on Facebook, human rights defenders have also become targets of threats and hate campaigns.

Fisseha Tekle, petitioner

Deadly posts

This lawsuit is also brought by Abraham Meareg, the son of Meareg Amare, a professor at Bahir Dar University in northern Ethiopia, who was hunted down and killed in November 2021, weeks after posts inciting hatred and violence against him spread on Facebook. Facebook reportedly only removed the hateful posts eight days after Professor Meareg Amare’s murder, even though his family had alerted the company more than three weeks earlier.

The court was told that Abraham Meareg fears for his safety and is seeking asylum in the United States. His mother, who witnessed her husband’s murder and fled to Addis Ababa, is severely traumatized and screams in her sleep every night. The family’s home in Bahir Dar was seized by regional police.

The harmful posts targeting Meareg Amare and Fisseha Tekle were not isolated cases. The lawsuit asserts that Facebook is awash with hateful, dangerous and inciting posts in the context of the Ethiopian conflict.

Meta uses engagement-based algorithmic systems to power Facebook’s news feed, ranking, recommendations and groups features, shaping what users see on the platform. Meta makes money when Facebook users stay on the platform as long as possible, by selling highly targeted advertising.

Posting inflammatory content, especially content that advocates hatred and incites violence, hostility and discrimination, is an effective way to keep people on the platform longer. The promotion and amplification of this type of content is therefore central to Facebook’s surveillance-based business model.

Internal studies dating back to 2012 show that Meta knew its algorithms could cause serious harm in the real world. In 2016, Meta’s own research clearly acknowledged that “our recommendation systems grow the problem” of extremism.

In September 2022, Amnesty International analyzed how Meta’s algorithms proactively amplified and promoted content inciting violence, hatred and discrimination against the Rohingya people in Myanmar, substantially increasing the risk of an outbreak of mass violence.

“From Ethiopia to Myanmar, Meta knew or should have known that its algorithmic systems were fueling the spread of hateful content causing serious harm in the real world,” said Flavia Mwangovya.

“Meta has proved unable to act to stem this tsunami of hate. Governments need to step up and implement effective legislation to rein in the surveillance-based business models of tech giants.”

A deadly double standard

According to the lawsuit, there is a glaring disparity between Meta’s approach to crisis situations in Africa and its approach in other regions of the world, particularly North America. The company is able to apply special adjustments to its algorithms to quickly remove inflammatory content during a crisis. Yet, according to the petitioners, none of these adjustments, although deployed elsewhere in the world, were put in place during the conflict in Ethiopia, allowing hateful content to continue to proliferate.

Meta has failed to invest in content moderation in the Global South

Flavia Mwangovya

Internal Meta documents leaked by whistleblower Frances Haugen, known as the ‘Facebook Papers’, showed that the US$300 billion company did not have enough content moderators fluent in local languages. A report from Meta’s Oversight Board also raised concerns that the company had not invested sufficient resources in moderating content in languages other than English.

“Meta has failed to invest in content moderation in the Global South, and therefore the spread of hate, violence and discrimination disproportionately affects the most marginalized and oppressed communities around the world, and particularly in the countries of the Global South.”
