Ethiopians suing Meta for failing to adequately moderate content that amplified the violence that left more than half a million people dead during the Tigray war have been given the go-ahead to serve the social media giant outside Kenya. Theirs is the latest case seeking to force Facebook to stop amplifying violent, hateful and inflammatory posts.
A Kenyan court on Thursday cleared the petitioners to serve Meta in California, US, after they failed to trace the social media giant’s local office. It turned out that while Meta does have business operations in Kenya, it does not have a physical office, as its local employees work remotely.
The decision sets the stage for the hearing of a lawsuit filed in December last year by the Kenyan human rights group Katiba Institute and Ethiopian researchers Fisseha Tekle and Abrham Meareg. Meareg’s father, Professor Meareg Amare Abrha, was killed during the Tigray war after Facebook posts attacked him and called for violence against him, the lawsuit alleges.
The petitioners seek to force Meta to stop the viral spread of hate on Facebook, increase content moderation at its review hub in Kenya, and create a $1.6 billion compensation fund.
The petitioners allege that Facebook’s algorithm amplified hate and incitement posts which drew more interactions and kept users engaged longer.
They claim Facebook “did not invest enough” in human content review at its moderation hub in Kenya, risking lives by ignoring, rejecting, or being slow to remove posts that also violated its own community standards.
Meareg said his family has firsthand experience of how flawed content moderation can endanger lives and divide families.
He claims his father was killed after Meta failed to comply with repeated requests to remove posts targeting him and other Tigrayans, as calls for massacres of the ethnic group spread online and offline. The two-year Tigray war broke out in November 2020 after the Ethiopian army clashed with Tigrayan forces, leaving some 600,000 people dead.
“My father was killed because Facebook posts identified him, falsely accused him, leaked the address of where he lived and called for his death,” said Meareg, a former doctoral student, adding that he was forced to flee the country and seek asylum in the United States after his father’s death.
“My father’s case is not isolated. At the time of the posts and his death, Facebook was saturated with hateful, inflammatory, and dangerous posts…many other tragedies like ours have happened,” he said.
Meta declined to comment.
Meareg says he reported the posts he found, but his reports were rejected or ignored. He claims to have reported several posts in 2021, including one containing images of dead bodies, and says some of those posts were still live on the platform when he went to court last December.
He criticized Facebook’s content review, saying the hub in Kenya had only 25 moderators responsible for Amharic, Tigrinya and Oromo content, leaving 82 other languages without any moderation staff.
Meta previously told TechCrunch that it employed staff and technology to help it take down hate speech and incitement, and that it had partners and staff with local knowledge to help it develop methods for detecting infringing content.
“A flaw within Facebook has been allowed to grow, weaponizing it to spread hate, violence and even genocide,” said Martha Dark, director of Foxglove, a tech justice NGO supporting the case. “Meta could take real steps today to turn off the hate spread on Facebook.”
This is not the first time Meta has been accused of fomenting violence in Ethiopia. Whistleblower Frances Haugen previously accused the company of “literally fanning ethnic violence” in Ethiopia, and a Global Witness investigation also found that Facebook failed to detect hate speech in Ethiopia’s primary language.
Social media platforms, including Facebook, have been blocked in Ethiopia since early February, after state plans to split the Ethiopian Orthodox Tewahedo Church sparked anti-government protests.
Adding to Meta’s woes in Kenya
Meta faces three lawsuits in Kenya.
The company and its sub-Saharan African content review partner, Sama, were sued in Kenya last May for exploitation and union-busting by Daniel Motaung, a former content moderator.
Motaung claimed he was fired by Sama for organizing a 2019 strike that sought to unionize Sama employees. He sued Meta and Sama for forced labor, exploitation, human trafficking, unfair labor relations, union-busting, and failure to provide “proper” mental health and psychosocial support.
Meta sought to have its name removed from the suit, saying Motaung was not its employee and the Kenyan court had no jurisdiction over it. However, it failed to stop the lawsuit after the court ruled that it had a case to answer, since some aspects of how the company operates in the country make it responsible. The social media giant has appealed the court’s decision.
Earlier this month, Meta was sued along with Sama and another content review partner, Majorel, by 183 content moderators who claimed they were illegally fired and blacklisted. The moderators alleged that Sama illegally dismissed them after it wound down its content review arm, and that Meta instructed its new Luxembourg-based partner, Majorel, to blacklist ex-Sama content moderators.
Meta also sought to be struck out of this case, but last week a Kenyan court said it had jurisdiction over employer-employee disputes and “issues of alleged illegal and unfair termination of the employment contract due to dismissal,” and that it had the power “to enforce the alleged violation of human rights and fundamental freedoms” by Meta, Sama and Majorel.