The US National Security Agency's Skynet project uses metadata to help decide who is a target – but is it technologically sound?
"Guns don't kill people" is the standard refrain of the National Rifle Association every time there is a mass shooting atrocity in the US. "People kill people." Er, yes, but they do it with guns. Firearms are old technology, though. What about updating the proposition from 1791 (when the second amendment to the US constitution, which protects the right to bear arms, was ratified) to our own time? How about this, for example: algorithms kill people?
Sounds a bit extreme? Well, in April 2014, at a symposium at Johns Hopkins University, General Michael Hayden, a former director of both the CIA and the NSA, said this: "We kill people based on metadata." He then qualified that stark assertion by reassuring the audience that the US government doesn't kill American citizens on the basis of their metadata. They only kill foreigners.
Pakistanis, specifically. It turns out that the NSA hoovers up all the metadata of 55m mobile phone users in Pakistan and then feeds it into a machine-learning algorithm which supposedly identifies likely couriers working to shuttle messages and information between terrorists. We know this because of one of the Snowden revelations published by the Intercept, the online publication edited by Glenn Greenwald, Laura Poitras and Jeremy Scahill and funded by eBay founder Pierre Omidyar.
The NSA programme doing this is called Skynet – not to be confused with the murderous intelligent machine network in the Terminator films. In essence, it's a standard-issue machine-learning project. What happens is that the algorithm is fed the mobile metadata of a number of known terrorist suspects, and then sifts through the data of 55m users to try to find patterns that match those of the training set. It's the same kind of approach that drives your spam filter: it's fed examples of known spam, and then uses that to decide whether a particular message is junk mail or not. The critical difference is that if your filter gets it wrong, then the worst that can happen is that you are annoyed or amused by its clumsiness; if Skynet gets it wrong you could find yourself on the receiving end of a Hellfire missile dispatched by a Predator or a Reaper drone.
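To make the spam-filter analogy concrete, here is a purely illustrative sketch of the general technique: learn from a handful of labelled examples, then score everyone else by similarity. The features, the numbers and the nearest-centroid method are all invented for illustration; this is not Skynet's actual model or feature set.

```python
# Illustrative only: classify records by which training group's average
# ("centroid") they sit closest to. All features and values are invented.

def centroid(rows):
    """Element-wise mean of a list of feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Hypothetical metadata features per user:
# [calls per day, distinct contacts, SIM swaps per year]
known_suspects = [[40.0, 120.0, 6.0], [35.0, 90.0, 5.0], [50.0, 150.0, 7.0]]
known_ordinary = [[8.0, 15.0, 0.0], [12.0, 25.0, 1.0], [5.0, 10.0, 0.0]]

suspect_c = centroid(known_suspects)
ordinary_c = centroid(known_ordinary)

def classify(record):
    """Label a record by whichever training centroid it is nearer to."""
    if distance(record, suspect_c) < distance(record, ordinary_c):
        return "flag"
    return "clear"

print(classify([45.0, 130.0, 6.0]))  # resembles the suspect training set
print(classify([9.0, 18.0, 0.0]))    # resembles the ordinary training set
```

The sketch also makes the weakness obvious: the classifier knows nothing about terrorism, only about resemblance to a tiny training set, so everything depends on how representative those labelled examples are.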
For Pakistani citizens, this is still a remote but not an entirely improbable possibility. According to the Bureau of Investigative Journalism, which monitors drone strikes, more than 2,400 people in Pakistan, Yemen and Somalia were killed by such strikes in the five-year period 2010-2014. This is the sharp end of the so-called war on terror, as the US brings the war to its adversaries in Afghanistan, Pakistan and Yemen, and it raises interesting questions about the legality of extrajudicial killing – questions that, so far, do not appear to trouble the US administration unduly.
We know a little, but not much, about the process by which individuals are placed on the kill list that President Obama personally approves every week. A leaked official study conducted in 2013 and published by the Intercept reported that US intelligence personnel collect information on potential targets drawn from government watchlists and the work of intelligence, military and law enforcement agencies. At the time of the study, when someone was destined for the kill list, intelligence analysts created a portrait of a suspect and the threat that person posed, pulling it together in a condensed format known as a "baseball card". That information was then bundled with operational information and packaged in a "target information folder" to be "staffed up to higher echelons" for action. On average, it took 58 days for the president to sign off on a target, one slide indicates. At that point, US forces had 60 days to carry out the strike.
It's likely, then, that the output of the Skynet algorithm is just one of the considerations that goes into identifying an individual at whom a drone strike could be targeted. So at the moment it's probably inaccurate to say that this particular algorithm kills people; the decision to strike is still made by a human being. Nevertheless, it's important to ask how good the algorithm is at its job.
Not great, is the answer provided by Patrick Ball, a data scientist and the director of research at the Human Rights Data Analysis Group, who has previously given expert testimony before war crimes tribunals. He has studied the Snowden documents and uncovered a flaw in how the NSA trains Skynet's machine-learning algorithm, which leads him to describe its outputs as "scientifically unsound" and "ridiculously optimistic".
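One reason a classifier like this can look good on paper and still be dangerous in practice is the base-rate problem: when genuine targets are vanishingly rare in a population of 55m, even a tiny false-positive rate flags large numbers of innocent people. The arithmetic below uses a hypothetical error rate chosen purely for illustration, not a figure taken from the leaked documents.

```python
# Back-of-the-envelope arithmetic on false positives at population scale.
# The false-positive rate here is hypothetical, for illustration only.

population = 55_000_000       # mobile users scanned, per the Snowden documents
false_positive_rate = 0.0001  # hypothetical: 0.01% of innocents wrongly flagged

falsely_flagged = population * false_positive_rate
print(f"{falsely_flagged:,.0f} innocent people wrongly flagged")
```

Even at one error in ten thousand, that is 5,500 innocent people flagged – which is why a model that merely looks accurate on its tiny training set can still be, in Ball's phrase, ridiculously optimistic.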
So maybe algorithms don't kill people – yet. They just put them on lists of candidates for extrajudicial killing. Maybe we should be grateful for such small mercies.