IoT is being weaponized. The same sensors, networks and real-time data analysis used to monitor classrooms can morph into weapons for targeted killing. How do such malicious drones operate, and what can be done to protect against their airborne threat?

Background

Here are three data points on weaponized drones.

 The recent assassination attempt on the President of Venezuela with drones. "Aug 4, 2018. CARACAS, Venezuela — A drone attack caused pandemonium at a military ceremony where President Nicolás Maduro of Venezuela was speaking on Saturday, sending National Guard troops scurrying in what administration officials called an assassination attempt."
 The use of drones to shoot down incendiary kites in the Israeli-Palestinian conflict. "IDF reservists to help; troops able to shoot down flying objects 40 seconds from detection"
 Slaughterbots. "A video by the Future of Life Institute and Stuart Russell, a professor of computer science at Berkeley, presenting a dramatized near-future scenario where swarms of inexpensive microdrones use artificial intelligence and facial recognition to assassinate political opponents based on preprogrammed criteria."

How do they work?

Drones are aerial IoT devices. They are mounted with sensors that relay their location, altitude and other readings, such as images, to a back-end system or controller that determines what action the drone should take. Such drones have to remain within sight for a human controller to operate them. The Federal Aviation Administration (FAA) stipulates that Unmanned Aircraft Systems (UAS) users must (1) register their UAS with the FAA and (2) fly the UAS within visual line of sight. The examples above, the Venezuelan assassination attempt and the shooting down of incendiary kites, both involve human controllers.

Commercial drones are used to inspect pipelines for leaks.
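The sense-report-decide loop described above can be sketched in a few lines. This is a toy illustration only: the field names, thresholds and action strings are assumptions for the sketch, not any real drone's API.

```python
import json
import random

def read_sensors():
    """Toy stand-in for the drone's onboard sensors (values are illustrative)."""
    return {
        "lat": 10.48 + random.uniform(-0.001, 0.001),
        "lon": -66.90 + random.uniform(-0.001, 0.001),
        "altitude_m": 120.0,
        "battery_pct": 87,
    }

def decide_action(telemetry):
    """Controller-side logic: pick the drone's next action from its reported state."""
    if telemetry["battery_pct"] < 20:
        return "return_to_home"
    if telemetry["altitude_m"] > 150:
        return "descend"
    return "continue_route"

# One cycle of the relay loop: sense -> report over the C2 link -> decide.
telemetry = read_sensors()
message = json.dumps(telemetry)            # what the control link would carry
action = decide_action(json.loads(message))
print(action)
```

In a real UAS the decision logic runs at the controller's end of the command-and-control link; here the two ends are collapsed into one process for clarity.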
These inspection drones can fly long distances looking for signs of an oil leak on their own, and images of leaks are easy to recognize. They can travel farther because they do not need an onboard power source for long-range transmission to a pilot. Extending pattern recognition from oil leaks to facial recognition of a human target is a smaller hurdle than you might think. Chip technology is advancing to the point where small onboard chips could enable a drone to find a target in a crowd on its own.

This isn't science fiction. As The Perpetual Line-Up explains, "the Government Accountability Office revealed that close to 64 million Americans do not have a say in the matter: 16 states let the FBI use face recognition technology to compare the faces of suspected criminals to their driver's license and ID photos, creating a virtual line-up of their state residents. In this line-up, it's not a human that points to the suspect — it's an algorithm." The Verge reports that "a major recipient of AI funding in China is facial recognition. This technology is widespread in the country's cities, used for everything from identifying jaywalkers to allocating toilet paper. More significantly, it's also been embraced by the government as a tool for surveillance and tracking".

Similar artificial intelligence (AI) enables drones to operate autonomously. They can be programmed with a route or instructions and then navigate to the destination on their own. This makes it possible for a swarm of drones to operate collectively, without human operators.

Defensive strategies

How do you protect against such a 'smart' weapon? Here are three possible defenses:

1. Block communications

Pilots communicate with their drones over a transmitted command-and-control (C2) link. WhiteFox Defense provides RF counter-drone security that constantly surveys for these signals and analyzes them in real time to determine the danger posed by a drone.
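An RF check of this kind can be caricatured as a lookup of observed emissions against known control-link signatures. This is a toy sketch: the signature strings, labels and frequencies are invented for illustration and have nothing to do with WhiteFox's actual method.

```python
# Toy RF scan: compare observed control-link emissions against a table of
# known drone C2 signatures (all values here are illustrative, not real data).
KNOWN_C2_SIGNATURES = {
    "2.4GHz_fhss_pattern_a": "consumer quadcopter",
    "5.8GHz_video_downlink": "FPV racing drone",
}

def classify_emission(observed_signature):
    """Return a threat label for an observed RF emission, or None if unknown."""
    return KNOWN_C2_SIGNATURES.get(observed_signature)

observed = ["2.4GHz_fhss_pattern_a", "wifi_beacon"]
detections = [(sig, classify_emission(sig))
              for sig in observed
              if classify_emission(sig) is not None]
print(detections)
```

Real systems classify modulation and frequency-hopping patterns from raw spectrum data rather than string labels; the sketch only shows the compare-against-known-signatures shape of the approach.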
This information can be used to lock out the drone pilot and mitigate the threat. This approach has limited value, though, when a drone operates autonomously and is not in regular communication with its pilot: there is no signal to block.

2. Airspace monitoring

This approach resembles anti-virus software, where network packets are compared against a list of known virus 'signatures' to identify and block threats. Unmanned Defence Specialists provides software that continuously displays real-time airspace information and detects and identifies drones using "DroneDNA" pattern recognition. With this information, defensive measures against hostile drones can be taken automatically. These systems, however, work best in areas with a known boundary and may not be suitable for protecting large areas.

3. Policy changes on weaponizing drones

The Future of Life Institute and Stuart Russell, a professor of computer science at Berkeley, released the video 'Slaughterbots'. The Future of Life Institute (FLI) is a volunteer-run research and outreach organization that works to mitigate existential risks facing humanity, particularly existential risk from advanced artificial intelligence (AI). "This short film is more than just speculation; it shows the results of integrating and miniaturizing technologies that we already have... AI's potential to benefit humanity is enormous, even in defense, but allowing machines to choose to kill humans will be devastating to our security and freedom," explains Russell.

Paul Scharre, author of Army of None: Autonomous Weapons and the Future of War, disagrees with Russell. "Every military technology has a countermeasure, and countermeasures against small drones aren't even hypothetical. The U.S. government is actively working on ways to shoot down, jam, fry, hack, ensnare, or otherwise defeat small drones.
The microdrones in the video could be defeated by something as simple as chicken wire." Scharre also argues that Russell's implied proposal, a legally binding treaty banning autonomous weapons, "won't solve the real problems humanity faces as autonomy advances in weapons. A ban won't stop terrorists from fashioning crude DIY robotic weapons".

Summary

"Just as the Industrial Revolution spurred the creation of powerful and destructive machines like airplanes and tanks that diminished the role of individual soldiers, artificial intelligence technology is enabling the Pentagon to reorder the places of man and machine on the battlefield the same way it is transforming ordinary life with computers that can see, hear and speak and cars that can drive themselves," reported the NYT. "The new weapons would offer speed and precision unmatched by any human while reducing the number — and cost — of soldiers and pilots exposed to potential death and dismemberment in battle. The challenge for the Pentagon is to ensure that the weapons are reliable partners for humans and not potential threats to them."

Killer drones may not be here yet. But they're closer than you'd think.