A Twitter post with a video of a robot dog firing an unloaded gun that has racked up almost 120,000 likes since July. Videos of Ukrainian soldiers apparently modifying off-the-shelf drones to drop weapons. An art project featuring Spot, the Boston Dynamics robot best known for its viral dance videos, equipped with a paintball gun.
Videos like these are all over the internet. They showcase the alarming scenarios that six leading robot makers, including Boston Dynamics, likely had in mind when they published a letter last week promising not to weaponize their products. As robots become more accessible to consumers, these companies warn, people may try to turn them into weapons designed to harm humans. To prevent this from happening, the companies have pledged to review what customers want to do with their commercial robots before selling them (“when possible”), and to explore developing technologies that could reduce the risk.
“[W]e do not support the weaponization of our versatile, highly mobile robots,” the companies wrote. “[W]e now feel a new urgency in light of the growing public concern in recent months caused by the small number of people who have visibly publicized their makeshift efforts to weaponize commercially available robots.”
Robots available to the general public are still fairly expensive and not as common as other kinds of consumer technology (like drones). Still, the letter serves as a reminder that the risk of weaponization is not zero, and that it is an issue robot manufacturers are already worried about. At the same time, these companies built plenty of caveats into their pledge and left the door open to keep selling robots to law enforcement and the military. They are also far from the only manufacturers of such technologies, which are gradually entering the mainstream.
“You don’t necessarily want the public to believe that you are producing a product that is then deliberately used for military purposes,” explains Eric Lin-Greenberg, a professor at the Massachusetts Institute of Technology who studies how new military technologies affect conflict. “Whether such a statement will actually affect how these systems are used, I think, is another question. These are, after all, off-the-shelf technologies.”
An international campaign called Stop Killer Robots has urged people to resist the development of autonomous weapons and highlighted how racism, sexism, and dehumanization can be built into these technologies. A former New York City Council member, Ben Kallos, proposed a law barring police from acquiring any armed robots last year after the New York City Police Department began testing a Boston Dynamics robot. (That pilot program was canceled amid backlash.) The Electronic Frontier Foundation, a digital rights organization, has called for banning law enforcement from using autonomous or remotely controlled robots.
Even some unconventional uses have attracted robot manufacturers’ attention. In 2021, an art collective called MSCHF bought Spot, the Boston Dynamics robot dog, for almost $75,000 for an exhibit called “Spot’s Rampage.” The project involved attaching a paintball gun to the robot and then inviting people from around the world to remotely control the “weapon.” Although the paintball-armed robot eventually broke down, Boston Dynamics, which has offered the same robot model to police departments as well as the military in the past, was not pleased, and said the project misrepresented how its robot is “used to improve our daily lives.”
While off-the-shelf robots are still quite rare, drones have become more common and demonstrate how consumer technology can be weaponized. During the war in Ukraine, some soldiers have turned to commercial drones, using them to drop munitions, including grenades and weapons designed to destroy tanks. Cartels in Mexico have similarly used drones to transport and detonate explosives. Terrorist groups and other non-state actors can also adapt these relatively simple technologies to their advantage, explains Kerry Chavez, a professor at Texas Tech University and project administrator for the university’s Peace, War, and Social Conflict Lab.
“Many of them are just hobbyist models and commercial models, even some homemade ones,” Chavez told Recode. “Even if you cut off the supply chain on one vector, they can just shift to another.”
The US Bureau of Alcohol, Tobacco, Firearms and Explosives did not respond to a request for comment on how common it believes weaponized drones are in the US, but we know they have appeared. In 2015, an 18-year-old in Connecticut sparked global outrage and an investigation after he fired a gun attached to a makeshift drone. In 2020, a Pennsylvania man was sentenced to prison for, among other things, using a drone to drop explosives in an attempt to “terrorize” a woman he had been dating. The Federal Aviation Administration has filed a lawsuit in at least one case involving a weaponized drone. According to agency spokesperson Rick Breitenfeldt, operating a drone with a dangerous weapon attached is illegal and carries a fine of up to $25,000.
What we’ve already seen with drones suggests that companies getting ahead of the weaponization of more advanced consumer robots is a good idea. But there are critical caveats. First, the companies seem to recognize that they cannot stop the misuse of their technology alone, and they are already turning to the government for help. At the same time, these technologies can still be used for other kinds of harm, such as spying on people or moving weapons across borders. Earlier this year, Canadian police recovered a drone carrying nearly a dozen handguns from the United States after it crashed into a tree in southern Ontario.
And then there’s the biggest caveat of all: these companies limited their pledge to “general purpose” robots, while noting that “we don’t take issue with existing technologies that countries and their government agencies use to protect themselves and enforce their laws.”
This story was first published in the Recode newsletter. Sign up here so you don’t miss the next one!