Ethiopia: Facebook Algorithms Contributed to Human Rights Abuses Against Tigrayans During Conflict – New Report

Facebook owner Meta has contributed to serious human rights abuses against Ethiopia’s Tigrayan community, Amnesty International said in a new report published today (31 October).

The 64-page report, “A death sentence for my father: Meta’s contribution to human rights abuses in northern Ethiopia”, shows how the tech giant failed to curb the spread of content advocating hatred and violence targeting Tigrayans in northern Ethiopia during the two-year armed conflict.

Amnesty has previously highlighted how Meta contributed to human rights violations against the Rohingya in Myanmar, warning of the risk of such abuses being replicated elsewhere if Meta’s business model and content-shaping algorithms were not fundamentally reformed.

Facebook is a major source of information for many Ethiopians and is widely regarded as a trustworthy news source. However, during the armed conflict Facebook’s algorithms fuelled devastating human rights violations by amplifying harmful content targeting the Tigrayan community.

Amnesty’s research has established that Facebook’s algorithmic systems supercharged the spread of harmful rhetoric targeting the Tigrayan community, while the platform’s content moderation systems failed to detect and respond appropriately to such content.

These failures ultimately contributed to the killing of Tigrayan university chemistry professor Meareg Amare. Professor Amare was killed by a group of men after messages targeting him were posted on Facebook on 3 November 2021. The posts contained his name, photo, place of work and house address, and claimed he was a supporter of the Tigrayan People’s Liberation Front, while also accusing him of stealing large sums of money. These allegations were denied by his family. His son, Abrham Meareg, believes that these hostile Facebook posts contributed to his father’s death, saying: “I knew it would be a death sentence for my father”.

Meta’s content-shaping algorithms are designed to maximise user engagement for the purpose of serving targeted ads, with the result that they boost inflammatory, harmful and divisive content, which tends to attract the most attention from users. In 2018, Meta reconfigured its Facebook news feed algorithm around a new metric called “MSI” or “Meaningful Social Interactions”, in a supposed attempt to “fix Facebook”. However, Amnesty’s analysis of evidence from the Facebook Papers – the internal Meta documents disclosed by whistleblower Frances Haugen in 2021 – shows that the algorithms remained hard-wired for maximum engagement, therefore disproportionately favouring inflammatory content, including advocacy of hatred.

In a document from the Facebook Papers, there is evidence suggesting that Meta’s CEO Mark Zuckerberg personally intervened to stop mitigation measures being applied in high-risk countries like Ethiopia, because the measures may have interfered with the MSI metric. As the fundamentals of Meta’s engagement-centric business model have not changed, the company continues to present a significant and ongoing danger to human rights, particularly in conflict-affected settings.

Meta has disputed Amnesty’s findings, and the company’s response is reflected in the report.

Agnès Callamard, Amnesty International’s Secretary General, said:

“Three years after its staggering failures in Myanmar, Meta has once again – through its content-shaping algorithms and data-hungry business model – contributed to serious human rights abuses.

“Even before the outbreak of the conflict in northern Ethiopia, civil society organisations and human rights experts repeatedly warned that Meta risked contributing to violence in the country and pleaded with the company to take meaningful action.

“However, Meta ignored these warnings and did not take appropriate measures – even after the conflict had broken out. As a result, Meta has again contributed to serious human rights abuses, this time perpetrated against the Tigrayan community.

“The mass dissemination of certain posts incited violence and discrimination targeting the Tigrayan community, pouring fuel on what was already an inflamed situation with significant ethnic tensions.”

Meta: knowingly failing its users

Internal Meta documents reviewed by Amnesty reveal that Meta knew of the inadequacies of its mitigation measures in Ethiopia and the risks this presented in a country that the company itself considered to be at a high risk of violence. One internal document from 2020 warned that “current mitigation strategies are not enough” to stop the spread of harmful content on the Facebook platform in Ethiopia.

Alongside amplifying harmful content, Meta’s slow response times and refusal to take down reported content led multiple people interviewed by Amnesty to conclude that there was no point in reporting content to the company.

Meta received multiple warnings both before and during the conflict from civil society organisations, human rights experts and its own Facebook Oversight Board – which in 2021 recommended that Meta undertake an independent human rights impact assessment on Ethiopia.

*Gelila, a member of Ethiopian civil society, was part of Meta’s “Trusted Partner” programme – an initiative that aims to provide selected civil society groups with a designated channel to alert Meta to harmful content.

She explained that Facebook’s failure to act on alerts made the human rights situation in the country worse:

“As someone who has been in Ethiopia for a long time, I can say that Facebook is making communities more vulnerable to conflict with each other. They are extremely slow in reacting to things. They are not sensitive to what is said – I think they have standards which are very far from what is happening on the ground.”

Calls on Meta