Not Every Investigation Works: Lessons from a Failed Open Source Project

In September 2019, Huthi forces in Yemen attacked the Saudi Aramco oil facility at Abqaiq in Saudi Arabia using unmanned aerial vehicles (UAVs), or “drones”.

Amnesty International saw this event, and other Huthi cross-border attacks on Saudi Arabian civilian infrastructure, as potentially indicative of a new dimension to Huthi campaigns and a possible violation of international humanitarian law – and, as such, worthy of further investigation.

As with many open source investigations, it did not go quite to plan – bringing some central doubts and questions which arise during any open source investigation to the fore. 

Amnesty’s Digital Verification Corps (DVC) was asked to use open source investigative techniques to look into similar reported events and find evidence of further drone strikes on Saudi Arabian territory – in particular against crucial infrastructure. As part of this project, our DVC team at the Centre for Governance and Human Rights, based at the University of Cambridge, was tasked with gathering open source material on a series of drone strikes in July 2019 to corroborate reports from the ground and help evaluate their impact and lawfulness.

Starting the project

Our team at the Cambridge DVC began, as always, by studying the context: reading published research by Amnesty and other organisations on Yemen’s civil war, the rise of the Huthis, and Saudi Arabia’s support – mainly through arms shipments – for Yemen’s UN-recognised government in its military campaign against Huthi forces. 

We began with the initial research phase: gathering open source content depicting the reported attacks in Saudi Arabia from Huthi-held territory in Yemen. It soon became clear that much of the content we were collecting from Facebook, Twitter and YouTube (the most popular social media platforms in Saudi Arabia) did not provide the type of information sought in this investigation. The information we were looking for included potential evidence of civilian casualties, damage to civilian infrastructure, and evidence of other crimes under international humanitarian law (the laws governing armed conflict).

Instead, our search yielded several different types of content. First, we found videos in which the camera panned towards the sky, often taken at night and lacking sufficient context to verify their authenticity in relation to the incident under investigation. Second was an element specific to this investigation which few of us had encountered before: the large amount of doctoring, collaging and a generally loose approach to the ‘truth’ in content posted online. This made verifying some videos a complex process of assemblage, with members of our team often trying to isolate and decipher one clip within a montage of footage showing human rights abuses.

Research challenges

As our research progressed, we increasingly discovered content supposedly showing civilian casualties but which, upon analysis, was unverifiable, leading to our first serious doubts about the viability of the investigation. Videos were either too brief to give any insight into the location of the events recorded, or were screen recordings of content that had since been deleted, complicating the process of reverse image searching. In cases where there were good views of the location, it was too often rural and without distinguishing landmarks, making it impossible to tell the content apart from that captured in other conflicts.

Example of the types of difficult-to-verify content discovered in the research for the project
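Reverse image searching breaks down when footage is a screen recording of since-deleted content, as described above. One local workaround investigators sometimes use is perceptual hashing to cluster near-duplicate frames across collected videos. The sketch below is a minimal, illustrative average-hash (“aHash”) in pure Python; it is not part of the DVC’s workflow, and it assumes frames have already been decoded and downsampled to 8×8 grayscale grids (the frame data and function names are hypothetical).

```python
# Minimal average-hash ("aHash") sketch for flagging near-duplicate video
# frames. Assumes each frame has already been downsampled to an 8x8
# grayscale grid (values 0-255); a real pipeline would need a video
# decoder and resizer, omitted here.

def average_hash(pixels):
    """Return a 64-bit perceptual hash as a list of 0/1 bits."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the frame average.
    return [1 if p > avg else 0 for p in flat]

def hamming(a, b):
    """Count differing bits; a small distance suggests the same scene."""
    return sum(x != y for x, y in zip(a, b))

# Toy frames standing in for downsampled video stills.
frame = [[i * 8 + j for j in range(8)] for i in range(8)]
brighter = [[p + 10 for p in row] for row in frame]  # uniformly brighter repost
inverted = [[63 - p for p in row] for row in frame]  # unrelated content

print(hamming(average_hash(frame), average_hash(brighter)))  # 0: likely the same clip
print(hamming(average_hash(frame), average_hash(inverted)))  # 64: clearly different
```

Because the hash compares each pixel only to the frame’s own average, a uniformly re-encoded or brightened repost hashes identically, while genuinely different footage lands far apart, which is the property that makes this useful for triaging montages.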

There was simply not enough content available which ticked all the boxes: that showed civilian casualties or infrastructure damage in Saudi Arabia, was filmed in the month of July 2019, and could be connected to a verifiable Huthi drone strike. A potential video of a strike would not depict civilian casualties, for example, or we would be unable to find corroborative content of the same event. The additional challenges of collaged or unclear videos further complicated our efforts, slowing and frustrating us and making it difficult to develop a clear picture of the attacks.

After several weeks of content sourcing and verification sessions, we submitted our initial report to Amnesty International. We had to report that we had failed to verify and corroborate the events we’d been asked to examine. After the report was submitted, it was decided to halt all work on the project. Too much time had been expended for too few results, and we agreed that the Cambridge DVC’s resources would be better used elsewhere.

Lessons learned

Many of the lessons we learned will be applicable to similar projects.

Lesson one:

First, we realised that we should have collectively voiced our struggles earlier in the process. This would have enabled us to flag common problems and develop a certain solidarity by working through them together. This was particularly true in the context of the conflict we were researching: with the Saudi Arabian government not wishing to publicise attacks inside its borders, and no content posted by the Huthi forces to be found, the content we did locate was often doctored or reposted footage of other events.

Lesson two:

The second major lesson learned was a challenge that all open source investigators can face, no matter how successful they may be. There is rarely a single answer to any given question, and many tasks can become endless if they are allowed to. In a process that is intended to assign objectivity to digital media, the subjective decisions we made throughout the investigation were ultimately what defined it. This means that knowing how best to focus efforts, how to combine tools and resources, and when to reroute a research path down a different avenue of investigation is key.

Knowing when to move on

Consequently, what we learned is that one of the greatest skills an open source investigator can have is knowing when it is time to move on. Whether on an individual video which at first seems verifiable but which, three hours later, seems less so, or on a project which spans months and requires a team of a dozen people, not every investigation works, and knowing when to stop is a vital part of the process. 
