Three Ways the Evidence Lab Responded to the Earthquakes in Türkiye and Northern Syria

The catastrophic earthquakes that struck south-eastern Türkiye and northern Syria on 6 February and again on 20 February 2023 devastated entire communities. More than 48,000 people were killed and over 100,000 others were injured.

When a crisis occurs, Amnesty International responds by conducting research into potential human rights abuses, and calling for a concerted effort towards the promotion and protection of human rights for everyone. As the crisis unfolded, our regional offices led on country-based expertise, supported by thematic specialists working on issues such as the rights of migrants and refugees. Researchers, including from the Crisis Team, traveled to four of the affected provinces in southern Türkiye. In the Evidence Lab, we researched the crisis as we saw it unfold online.

The February earthquakes generated a massive digital visual record. People filmed rescue attempts. Drone footage captured damaged cities. Yet, in this vast and evolving digital landscape, not everything could be used as evidence of human rights abuses. It is our team’s responsibility, working across the movement, to discern which of this content can serve as evidence in Amnesty International investigations.

So far, our digital investigations into the Türkiye and Syria earthquakes have taken three forms. In each case, we collaborated with different teams and contributed to the resulting publications in very different ways:

Rapid response verification: access to humanitarian aid in northern Syria

At the end of February, a request came in from researchers in Amnesty International’s Middle East and North Africa Regional Office. They were compiling evidence for a press release on earthquake aid in northern Syria being blocked or diverted. As supporting evidence for the testimony they’d gathered, they wanted to know if we could verify a video they’d been sent. It showed people believed to be members of the Syrian National Army shooting into the air to disperse a crowd trying to pull aid boxes from the truck of a humanitarian organization.

We investigated the video to determine its authenticity. In this instance, two pieces of contextual information were important:

  1. Was this captured after the 6 February earthquakes (chronolocation)?
  2. Was this captured in an area affected by the earthquakes (geolocation)?

The video was allegedly captured in Jinderes, northern Syria. Geolocation usually relies on cross-referencing features like street signs, vegetation, and buildings. In the wake of the earthquakes, this work became more difficult: the area we were looking at had suffered extensive building collapse, altering our usual reference points. To overcome this, we compared images and videos from before and after the earthquakes to build a clearer picture of the area. We complemented this with research on social media: using relevant keywords, we found other videos of the same event.
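
As a rough illustration of this before-and-after comparison (the file names, the use of scikit-image’s structural similarity measure, and the threshold below are assumptions for the sketch, not our exact workflow), a script like the following can flag changed areas between two aligned images of the same location for an analyst to review:

```python
# Minimal sketch: flag changed areas between two co-registered images of the
# same location, captured before and after the earthquakes. File names are
# hypothetical; this illustrates the comparison step, not the Evidence Lab's
# actual tooling.
import numpy as np
from skimage.io import imread
from skimage.metrics import structural_similarity

# Both images must cover the same footprint at the same resolution
# (i.e. already aligned) for a pixel-wise comparison to be meaningful.
before = imread("jinderes_before.png", as_gray=True)
after = imread("jinderes_after.png", as_gray=True)

# full=True returns a per-pixel similarity map alongside the overall score.
score, ssim_map = structural_similarity(before, after, full=True, data_range=1.0)
print(f"Overall structural similarity: {score:.3f}")

# Low-similarity pixels are candidates for change (e.g. collapsed buildings,
# new makeshift camps) that an analyst would then review manually.
changed_fraction = np.mean(ssim_map < 0.5)
print(f"Fraction of image flagged as changed: {changed_fraction:.1%}")
```

Automated comparison like this only narrows down where to look; every flagged area still has to be reviewed by eye.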

By combining these methods, we were able to confirm that the area shown in the video was affected by the earthquakes, and that the footage was captured after they struck on 6 February. Using details such as the humanitarian organization involved, the language spoken, and the flags visible, we also confirmed that the video was likely captured in Jinderes. Verifying the location and date of the video allowed us to corroborate the testimony gathered. By adapting to the information available, we overcame the challenges of doing open source research in an earthquake-affected area.

Satellite imagery over Jinderes, northern Syria, 15 February 2023. The yellow boxes indicate some areas where earthquake damage is clearly visible. The arrow points to some makeshift camps. Copyright: 2023 Planet, Inc.

Longer-term collaboration: integrating video analysis with testimony of torture

While much of the content online documented the suffering caused by the earthquakes themselves, videos also emerged showing an alarming human rights concern. In a joint statement, Amnesty International and Human Rights Watch documented how law enforcement officials in the region beat, tortured, and otherwise ill-treated people they suspected of theft and looting.

We collaborated with Amnesty International’s Türkiye team and with Human Rights Watch to take a deeper look. The Evidence Lab focused on collecting and analyzing videos, while our colleagues interviewed victims of and witnesses to specific incidents.

The press release summarized our conclusions in a few succinct sentences, but the process of analyzing videos often takes weeks. As part of this process, we ask ourselves how a video functions as evidence: is it standalone proof of a human rights abuse? Or does it support and provide context for testimony?

In some cases, videos provide context on the location, witness identities, and nature of a violation. In other cases, a video may have been a clear example of police violence, but we couldn’t independently corroborate when and where it was filmed (chronolocation and geolocation). We’d often have a video of an incident but be missing some key context, limiting what we could say about it. Going through the verification process was necessary to determine whether, and how, we could comment on the actions shown in the videos.
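
One concrete example of a chronolocation signal (again, a hedged sketch rather than our exact workflow): a video file sometimes carries an embedded creation timestamp, which ffprobe can read. Social media platforms usually strip or rewrite this metadata, so at best it corroborates, and never replaces, visual verification.

```python
# Minimal sketch: read a video's embedded creation timestamp with ffprobe
# (part of FFmpeg). The file name is hypothetical, and this metadata is often
# missing or unreliable for content downloaded from social media.
import json
import subprocess

def embedded_creation_time(path):
    """Return the container-level creation_time tag, if present."""
    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True,
    )
    metadata = json.loads(result.stdout)
    return metadata.get("format", {}).get("tags", {}).get("creation_time")

print(embedded_creation_time("incident_video.mp4"))
```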

To overcome some of these challenges, we worked closely with researchers to see how videos and interviews could corroborate each other, and we focused on the cases where we had the strongest, most compelling evidence. Much of our work responding to crises involves ruling out what we can’t say based on digital evidence, as well as determining what we can say.

Open-ended scoping: social media monitoring to support researchers doing on-the-ground investigations

As two of Amnesty International’s Senior Crisis Advisors and a Rapid Response Researcher from the Europe Regional Office prepared for research on the ground in Türkiye, the Evidence Lab organized volunteers from our Digital Verification Corps (DVC) to help monitor the humanitarian situation via social media. Combining testimony with digital research is a core part of our work. Integrating digital research early on can help overcome access issues and inform the direction of the overall research.

The DVC most often works on long-term, large-scale projects with a clearly defined output (such as their verification work for Ban the Scan: Automated Apartheid). This project was an exception: a small team of dedicated researchers with Arabic and Turkish language skills made a vital contribution to the monitoring effort. The three DVC members worked remotely to find, document and translate content. They built a picture of the situation as a whole, documenting temporary camps, water supply, and other factors affecting access to and distribution of humanitarian aid.
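
As an illustration of how this kind of keyword-based monitoring can be organized (the input file, column names and keyword lists below are assumptions for the sketch, not the DVC’s actual setup), a short script can sort collected posts into broad themes for human review and translation:

```python
# Minimal sketch: group collected social media posts into broad humanitarian
# themes using Arabic and Turkish keyword lists. Everything here is
# illustrative; real monitoring relies on human review and translation.
import csv
from collections import defaultdict

THEMES = {
    "camps": ["مخيم", "çadır", "kamp"],         # camp / tent
    "water": ["مياه", "su"],                     # water (short keywords over-match)
    "aid distribution": ["مساعدات", "yardım"],   # aid
}

posts_by_theme = defaultdict(list)

with open("collected_posts.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):  # expects columns: url, text
        text = row["text"]
        for theme, keywords in THEMES.items():
            if any(keyword in text for keyword in keywords):
                posts_by_theme[theme].append(row["url"])

for theme, urls in posts_by_theme.items():
    print(f"{theme}: {len(urls)} posts to review")
```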

Thumbnails from social media photos and videos collected by our DVC volunteers.

While our DVC volunteers focused on these broad themes, the researchers on the ground responded to what they were seeing: the rights and needs of people with disabilities were being neglected in the humanitarian response to the earthquakes. This is something that unfortunately happens far too often in humanitarian responses to both armed conflicts and natural disasters. As a result, their research focus narrowed to documenting how displaced people with disabilities are living in inadequate camp conditions that undermine their health, autonomy and dignity, among other rights. This is due, among other factors, to a lack of sanitation facilities accessible to people with disabilities and a shortage of adequate specialist services and assistive equipment.

Our research contribution to this project encapsulates both the opportunities and the limitations of open source research. Open source research can be timely and flexible, shedding light on areas and themes that might otherwise remain hidden. Yet it can also be hard to predict what you’ll find, and the data you can access may give an incomplete picture of what’s actually happening. This underscores the importance of the Evidence Lab’s close collaboration with researchers gathering testimony, and is a reminder that important open source research isn’t always published. Ruling things out and narrowing the scope of research is a significant, often invisible, part of our work.