Droning On and On

Who’s your favourite bad guy in the movies? There are so many to choose from: Blofeld, Hannibal Lecter, Professor Moriarty, Sauron and Lord Voldemort, to name just a few. The assorted baddies differ in a host of ways. Of course, they’re all taken from the realms of fiction, but some are more believable than others. Some are serial killers, some evil geniuses, some malevolent magicians, some cat-stroking, Mao-suit-wearing manipulators.

However, one thing that they have in common is that they are identifiable. You can point to Moriarty and say, “There. It’s him wot done it”. In the movies, the bad guys are almost always distinguishable, and because they’re distinguishable there can be a plan to stop them. The good guys know where to start when they’re trying to save the world. In short, because they’re identifiable, they can be held accountable for their actions.

In my opinion, one of the scariest bad guys to grace the movies appears in the Terminator series. Now, as intimidating as Arnie can be in tight-fitting leather, I’m not talking about the Terminator itself. Rather, it’s the bland-sounding SkyNet which gives me the heebie-jeebies. For those who’ve not had the pleasure of seeing the films, SkyNet is the artificial intelligence system which runs much of the US military, including the network of nuclear missile silos. In Terminator 2: Judgment Day, we learn how SkyNet has become self-aware and, in an act of what it regards as self-preservation, begins to attack the human race with all the weaponry at its disposal. In due course, this process of annihilation leads to SkyNet creating the Terminator as an efficient human-killing machine. Shivers.

You could be forgiven for having missed it, but earlier this year the UK military, in partnership with BAE, announced a technological advancement in aeronautical engineering which could have far-reaching ethical implications for the way that we fight our wars. The Taranis Unmanned Aerial Vehicle (or drone, to you and me) is the most advanced armed robot plane ever built (that we know about). It will fly faster and higher than any drone before it, will be radar-proof, and looks like something the aliens would fly in the movie Independence Day:

Taranis Drone

Now, it could be argued that this development is only the latest step in the long mechanisation of warfare, from wooden catapults and javelins to tanks and Tomahawk missiles. But I believe that in creating unmanned (or womanned) drones, the militaries of the world have taken a step into an ethical minefield (excuse the pun) that we could all do without.

Over the last ten years, particularly in Iraq, Afghanistan and the Arabian peninsula, we’ve seen drones increasingly used to fight wars – specifically the kind of ‘asymmetric’ warfare that sees a nation-state fighting an extremist ideology that finds its heartland in an almost mediaeval context. Indeed, perhaps the real asymmetry here is the sight of the most technologically advanced military hardware ever made, hovering over communities which lack even the most basic of amenities, looking for a target. This is a well-worn path: think Vietnam and napalm bombing.

But the truth is that the increasing use of drones by our military is pernicious, regardless of the target. Were drones being used by the Russian military in the Crimea to attack their Ukrainian counterparts, the moral complexity would remain. So what makes a drone so very different from other weapons of war?

Well, in some senses the fictional equivalent of Taranis is SkyNet. Now, don’t get me wrong: I’m not saying that Taranis is going to become self-aware in the near future and start shooting up a Sainsbury’s near you. Indeed, it’s not the robots that I’m worried about. As ever, it’s the humans who are directing them – and above all the humans who are sitting in their flight path.

As I’ve already noted, even a stick can be used as a weapon. But what is the ethical comparison between the military using, say, a tank versus drones as weapons? There are three broad issues that make drones ethically different from tanks:

1. Imminence – Drones have the latest surveillance equipment going: nose cameras, night vision, heat sensing – the works. But even so, can any of these match the capacity of the human brain to sense danger or to exercise caution? Can they tell the difference between an aggressive move and a nervous one from 10,000 feet? Can they do any of this with the same level of certainty that a tank crew can from half a mile away from the target? I don’t think so. What’s more, because of their proximity, the tank crew can apply their own intelligence to an unfolding situation in a way that a drone – with an operator who is potentially thousands of miles away – cannot. To emphasise this, the estimated hit rate for the primary target in a drone strike is 1.5-2%.

2. Risk – This may sound terribly old-fashioned, but surely there should be some risk to ourselves when we fight our wars? Of course, we don’t want anyone serving in our military to be in harm’s way. But it kind of comes with the territory when you sign up. The broader point is what could happen if we increasingly fight wars without any risk to ourselves. It was with Vietnam – and then the Falklands – that we really started experiencing our wars through the TV. During the First Gulf War, General Schwarzkopf turned it into a Discovery Channel documentary series on laser-guided missiles. In the Second Gulf War things went a bit more Hollywood with Shock and Awe. In Afghanistan we have seen a plethora of reality-TV-style documentaries from the front line. I believe that this is increasingly de-sensitising us to the reality of the life-and-death decisions that we are witnessing. And this is true in an even more acute sense with drones. I worry about the decisions that drone operators are making as they watch video streamed back from a battlefield halfway around the world. It is too easy to make life-and-death decisions in a battle when there is no risk to your own health or safety. Ultimately, how different does it really feel from Call of Duty? It is very different for those on the ground, but those in the control room cannot feel the same sense of risk and heightened reality as a tank crew sitting on the field of battle.

3. Accountability – This builds on the previous two points. When you remove soldiers from the battlefield completely, yet enable them to retain the level of targeted, deadly force that is possible with drones, you begin to blur the lines of accountability for specific actions. The soldier pulling the trigger may not be immediately obvious – there may be several people in the chain of events leading to weapons being fired, and any or all of them could be said to have made the decision on the use of deadly force in a quickly-unfolding battlefield scenario. Like SkyNet – and unlike Blofeld – it is hard to trace the identity of the bad guys. And this is at the root of my growing unease as we’ve seen the increasing use of drones by the British military. Each time there has been a friendly-fire or civilian casualty as the result of a drone strike, questions like ‘How could this happen?’, ‘What could have been done differently?’ and, crucially, ‘Who is to blame?’ are much harder to answer straightforwardly than in similar situations in which soldiers (e.g. our fictional tank crew) were in the immediate vicinity of the misuse of force.

And this is no seminar-room debate: the implications and the evidence make for stark reading. Droneswatch estimate that up to 4,000 people have been killed by drone strikes in Pakistan, Yemen and Somalia alone, with as many as 954 of them civilians and 225 of them children.

Is there a fly in the ethical ointment when we consider missiles? On the surface they raise a similar concern when we think of imminence, risk and accountability; surely a missile strike from a ship in the Persian Gulf is far too easy to carry out. And it is. But it differs from a drone strike in that it does not claim the aura of being the next best thing to boots on the ground, or of being a weapon that can be used calmly, repeatedly and locally. With drones, that impression is misplaced – but that is not what the PR says…

Weaponry has been in this world since early man (and I do mean man) was able to pick up a stick. Sadly, the need for weapons will continue so long as there are human beings on this earth. But combat drones like Taranis do not represent a positive evolution of the armoury available to progressive nations.
