Facebook Just Blinded War Crimes Investigators

When human rights crises break, Amnesty International aims to deploy staff to the scene to witness first-hand what is happening and expose violations. But sometimes security, diplomatic or administrative issues prevent us from doing so. That’s when we turn to remote tools – including social media platforms – to monitor what is happening in a crisis in real time.

All over the world, grassroots human rights defenders are taking huge risks to film videos of human rights violations and share them on the channels they know can amplify their voices the most – such as Facebook, YouTube and Twitter.

These platforms were all built on the premise of democratizing information, promising a new marketplace for sharing ideas and building connections between individuals in diverse regions of the world. They lured in human rights defenders with a promise: “Put your content here, and the world will see what is happening in your community.” So people posted photos and videos of the worst kinds of abuses – extrajudicial executions, barrel bombs, torture – providing some of the vital evidence we need to hold perpetrators to account.

Human rights organizations responded and adapted to this new environment. Today, we rely on these digital coffee houses that social media platforms have built. Amnesty International’s research increasingly integrates eyewitness interviews with corroborating social media content. One example is the Digital Verification Corps, a network of students we set up and are training to monitor, discover, verify and corroborate evidence of human rights abuses on social media platforms.

Recently, however, social media platforms have begun to change the tools we have come to rely on, with little or no consultation. Many of those platforms have, over the years, invited us to workshops and conversations, telling us how much they value our work and how much they respect human rights. We believed them. But suddenly they are pulling the rug out from under us, hindering our ability to protect human rights.

Just this month, the entire open source investigative community was left reeling by a sudden, unannounced change to Facebook's search functionality. Graph Search was a Facebook tool that allowed investigators to find publicly available content that would otherwise be buried – much like a needle in a haystack.

Consider the case of Mahmoud al-Werfalli, a ruthless former armed group leader from Libya who is wanted by the International Criminal Court, based largely on videos found on Facebook. Civil society researchers alerted legal investigators after using Graph Search to find videos documenting al-Werfalli carrying out or ordering extrajudicial executions. Without Graph Search, we wouldn’t have found certain damning pieces of evidence surrounding the Myanmar military’s crimes against humanity and possible genocide against the Rohingya in late 2017. We wouldn’t have found the videos from a hospital bombed by the Assad regime in Syria’s Idlib, corroborating testimony from the doctor who saw his healthcare facility destroyed.

Now Facebook has turned off Graph Search, with potentially disastrous results.

This is not the first time a social media company has betrayed the human rights community. In mid-2017, under pressure from governments to remove content that could depict or glorify terrorism, YouTube started removing masses of videos from Syria from its platform. The Syrian Archive, a Berlin-based NGO established to catalogue the crimes of the Syrian conflict in the hope of eventual accountability, lost hundreds of thousands of videos that civilians on the ground had taken risks to post to social media. After the human rights community got together and intervened, YouTube relented and returned some of those videos. But its takedown policy remains in place, meaning it is an ongoing battle to save this kind of content.

In the worst-case scenarios, algorithms will be able to remove these videos almost as quickly as human rights defenders can post them, with potentially devastating consequences for investigators. We can’t ask for a video to be reinstated, or used to build a case against a warlord, if we never knew it existed in the first place.

In 2018, Google dealt a blow to the human rights community when it removed an invaluable resource called Panoramio. Integrated into Google Earth Pro (one of two tools every online human rights investigator should have), it allowed researchers to go back and look at holiday pictures posted by people who had visited, say, Aleppo before 2010, or parts of Nigeria and Cameroon now engulfed in conflict. This assisted us in the time-consuming work of establishing where a video of an airstrike had been filmed, where a torture scene took place, or where a trafficking victim was last seen.

We have nothing against the platforms improving privacy – it’s one of the human rights we care about. But they are hardly striking the right balance by just dropping functions altogether, without any consultation with human rights investigators. And it certainly doesn’t foster trust in their claim that they support human rights.

A Facebook spokesperson said the company had “pause[d] some aspects of graph search” and is working “closely with researchers to make sure they have the tools they need”.

We are available to discuss and even work with social media platforms to help strike a balance between respecting privacy and helping human rights defenders and survivors of abuse who are trying, in the hardest situations, to make their voices heard. We just wish that platforms would recognize the role they could, and should, be playing in documenting these stories and fighting for justice.

This article first appeared in Newsweek.
