By Ed Roddis and Megha Tayal Narang

In 1989, Back to the Future II saw Michael J. Fox travel forward in time to a futuristic 2015. Some of the gadgets that feature in the film – like hands-free computer games, video chat and glasses that could take phone calls – must have seemed like fanciful predictions in 1989. But twenty-six years later, we know them as Xbox Kinect, FaceTime and the prototyped Google Glass. So while we are yet to see kids on hovering skateboards, Back to the Future II didn’t turn out to be completely wrong about life in 2015.

The film also features hovering drones undertaking domestic duties, and Deloitte’s TMT Predictions report suggests more than one million will be active in the civilian world by the end of this year.[1] They range from hobbyist toys controlled by smartphones to models in commercial use that can fly for up to an hour.

The enormous potential for drone use in the public sector is well documented by Gov2020. They could complement search and rescue operations, help monitor crowds or provide automated security. They could help inspect buildings or other structures, survey crops and even plant trees.[2] They also represent a significant economic opportunity. In Europe alone, the Aerospace and Defence Industries’ Association suggests that drones could create 150,000 jobs by 2050[3].

But the trouble with drones is that the speed of their development and the growth of their availability have outpaced governments’ and regulators’ ability to respond. They pose significant safety concerns and raise important privacy questions. Most worryingly of all, they represent a major security threat.

That threat came into sharp focus last month when a drone apparently containing traces of radiation was found on the Prime Minister of Japan’s office roof.[4] That is just the latest in a series of alarming and increasingly frequent news stories. In France, 60 drones have been spotted around Paris as well as the country’s nuclear sites. In Alaska, parents complained to police after a quadcopter drone followed children home from school. In Washington, a similar device landed in the White House grounds. In Mexico, a drone overladen with three kilogrammes of crystal meth crashed into a supermarket car park next to the US border. And around the world, airline pilots are reporting a surge in near-collisions that could cause catastrophe if a drone was sucked into a jet engine[5].

These stories give a strong indication of drones’ dark side. Their potential for accidental, criminal or malevolent incidents is as far-reaching as their potential for good but, as the UK’s Royal Institute of International Affairs has pointed out, the threat they pose is “under-analysed, unarticulated and underestimated”.[6] Some commercially available drones can carry a payload of three kilogrammes – about six times the weight of explosives used in an anti-personnel landmine.

Drones therefore leave regulators and governments with a challenge: how do they manage the security, safety and privacy risks while allowing the opportunities to be seized?

Regulators around the world are grappling with the issues and catching up fast. Earlier this year, the US Federal Aviation Administration (FAA) proposed regulations to allow drones to be used commercially as long as they fly in daylight and remain in their operator’s line of sight. So, no e-commerce deliveries by autonomous drones for now. In Canada, the Privacy Commissioner questioned the privacy implications of having Canadian skies “filled with hovering data-collecting robots”.[7] In the UK, a Parliamentary committee recommended that all drones be fitted with the geo-fencing technology already found in some models, which means they do not work near sensitive sites.[8] Other activities, such as the annual US military ‘Black Dart’ events, are working out how to reduce the security threat.[9]
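At its core, geo-fencing of the kind the committee describes is a position check against a database of restricted zones: the drone’s firmware compares its GPS fix to a list of no-fly areas and refuses to arm or fly inside them. A minimal sketch of that check in Python, using illustrative site coordinates and radii rather than any official no-fly database:

```python
import math

# Hypothetical restricted zones: (name, latitude, longitude, radius in km).
# These entries are illustrative only, not an official no-fly list.
RESTRICTED_SITES = [
    ("Heathrow Airport", 51.4700, -0.4543, 5.0),
    ("Westminster", 51.4995, -0.1248, 2.0),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def flight_permitted(lat, lon):
    """Return (allowed, blocking_site): allowed is False inside any zone."""
    for name, site_lat, site_lon, radius_km in RESTRICTED_SITES:
        if haversine_km(lat, lon, site_lat, site_lon) <= radius_km:
            return False, name
    return True, None
```

A real implementation would also handle GPS loss, spoofing and database updates, which is partly why the committee focused on fitting the technology as standard rather than treating it as a complete safeguard.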

Governments need to keep exploring the potential of drones but, along with regulators, need to invest in managing risk and security issues. Because the number of near-incidents involving drones suggests that we may well ask when – not if – one will be involved in a tragedy, whether by accident or design.


Ed Roddis, Head of Government and Public Sector research – Deloitte UK

Megha Tayal Narang, Government and Public Sector Analyst – Deloitte UK


[1] TMT Predictions 2015, Deloitte Touche Tohmatsu Services, Inc., January 2015.

[2] Ex-Nasa man to plant one billion trees a year using drones, The Independent, April 2015.

[3] Civilian use of drones in the EU, House of Lords, March 2015.

[4] Tokyo police studying technology to detect, capture drones, Japan Times, April 2015.

[5] Near-collisions between drones and airliners surge, new FAA reports show, Washington Post, November 2014.

[6] Drones: disembodied aerial warfare and the unarticulated threat, David Hastings Dunn, Royal Institute For International Affairs, September 2013.

[7] Drones in Canada: Will the proliferation of domestic drone use in Canada raise new concerns for privacy?, Office of the Privacy Commissioner of Canada, March 2013.

[8] Civilian use of drones in the EU, House of Lords, March 2015.

[9] Marine Corps develops futuristic UAV capabilities, Marine Corps Times, April 2015.