Human Rights Groups Call on Tech Platforms To Reform Practices in Crisis Zones

Justin Hendrix / Apr 14, 2022
Irpin, Ukraine - 5 March 2022: A Ukrainian soldier stands at a checkpoint to the city of Irpin, near Kyiv, during the evacuation of local people under shelling by Russian troops. Kutsenko Volodymyr / Shutterstock

Long before Russia escalated its violent invasion of Ukraine in February of this year, social media platforms played host to Russian disinformation while regularly deleting evidence of human rights abuses in Crimea, Donetsk and Luhansk. Now, human rights groups are coming together to demand more from the technology companies, both in Ukraine and more broadly.

In a letter issued on Tuesday, 13 Ukrainian civil society and human rights groups wrote to Meta, the company that operates Facebook, Instagram and WhatsApp, to express concern "about the inadequate application of the Meta community standards to the posts containing evidence of the ongoing violations of international criminal law, international humanitarian law and international human rights law in the context of Russian aggression against Ukraine and the respective consequences of such violations."

"First and foremost, platforms have to clarify their policies concerning international armed conflicts (IAC) or, at least, content depicting alleged international crimes and gross human rights violations," said Maksym Dvorovyi, Legal Counsel at Digital Security Lab Ukraine, which hosted the letter. "Afterwards, platforms should establish dedicated teams of people specifically attributed to dealing with requests on IAC content."

The Ukrainian groups, which include organizations such as Digital Security Lab Ukraine as well as the Center for Civil Liberties and the Ukrainian Helsinki Human Rights Union, are concerned about Meta's willingness to permit imagery and posts glorifying Russian aggression on its platforms, while it removes "posts containing evidence of international crimes and mass-scale human rights violations." The documentation of potential war crimes in Ukraine is a major undertaking for many of the signatories.

"Photos and videos of deliberate attacks by Russian troops on civilian objects and civilians contain important evidence of war crimes," said Oleksandra Matviychuk, a Kyiv-based human rights lawyer and head of the Center for Civil Liberties, a signatory of the letter. "Moreover, this information refutes the narratives of Russian propaganda and serves an important public interest. Amendments to Meta's community standards are needed."

The authors seek a range of policy and community standards reforms, as well as "enhanced cooperation with the local civil society and media organizations possessing necessary expertise," more fidelity to international human rights law and standards, and the preservation of information that may be valuable in the prosecution of war crimes and other human rights violations. Dvorovyi says he hopes to see reforms that will bring more "proper contextual analysis" to the content moderation process, with transparency measures to "boost trust in the platforms locally."

Likewise, on Wednesday an international group of more than 30 tech and human rights groups, drawing a line from events in Syria and Myanmar to the current war in Ukraine, joined together to issue a letter to the platforms to "call for long term investment in human rights, accountability, and a transparent, equal and consistent application of policies to uphold the rights of users online."

"The letter is addressed to Meta, Telegram, TikTok, Google, and Twitter because each of them has made important decisions about content moderation in crisis situations, in Ukraine and beyond," said Dia Kayyali, one of the organizers of the letter and an associate director for advocacy at Mnemonic, the organization that operates the Syrian Archive, Yemeni Archive and Sudanese Archive of war crimes evidence, and which has provided assistance to groups documenting the war in Ukraine.

According to Kayyali, the platforms have demonstrated very different levels of engagement with the issues, and Meta may not be the worst offender.

"Despite there being a lot of room for improvement, Meta and Twitter have made significant efforts in these areas," said Kayyali. "Conversely, Google has deeply underinvested in civil society engagement and due diligence, especially considering what a huge company it is, and Telegram is basically unreachable. And as a newer company TikTok is in the process of figuring out its engagement with civil society right now, so there's lots of room to push for it to do better."

The international groups, which include such organizations as the Electronic Frontier Foundation, Access Now, the Center for Democracy & Technology, WITNESS, Zašto ne, 7amleh and Derechos Digitales, seek reforms in seven areas, including "real human rights due diligence," "equitable investment," "meaningful engagement," "linguistic equity in content moderation," "increased transparency," "clarity about so-called 'Terrorist and Violent Extremist Content' (TVEC)," and "multistakeholder debriefs" to "take stock" and compile lessons learned after events such as those unfolding in Ukraine or those seen in the past in Syria, Yemen, India, Palestine and other places.

Some of these requests boil down to more active, human engagement from the tech platforms, which despite their profitability have typically preferred to invest in the minimum viable solution, such as outsourcing content moderation to contract firms that hire people at low wages. Investing more in building "meaningful relationships with civil society globally that are based not on extraction of information to improve products, but also provide civil society with meaningful opportunities to shape platform tools and policies" is a more expensive proposition for the platforms, which must satisfy investors who seek outsized rewards from tech investments.

Whether the tech platforms will, belatedly, prioritize human rights considerations over profits remains to be seen. Maksym Dvorovyi says the platforms can make gains in this area when they are willing.

"Platforms applied some of the measures they were previously reluctant to apply in response to the Russian aggression," said Dvorovyi. "We observed geoblocking of certain pages, enhanced privacy protection for Ukrainian users, and more desire to seek for and cooperate with the local NGOs. I wish it occurred before the war's re-escalation. Ukrainian civil society highlighted inadequate moderation on all platforms since 2014, but to no avail."


Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...