More than 50 leading AI and robotics researchers have joined a boycott of South Korea’s KAIST university over the institute’s plans to help develop AI-powered weapons. The boycott was announced ahead of a UN meeting in Geneva next week to discuss international restrictions on so-called “killer robots.” It marks an escalation in tactics on the part of the scientific community actively fighting for stronger controls on AI-controlled weaponry.
The boycott was organized by Professor Toby Walsh of the University of New South Wales, who warned in a press statement that the race to build autonomous weapons had already begun. “We can see prototypes of autonomous weapons under development today by many nations including the US, China, Russia, and the UK,” said Walsh. “We are locked into an arms race that no one wants to happen. KAIST’s actions will only accelerate this arms race. We cannot tolerate this.”
Signatories of the boycott include some of the world’s leading AI researchers, most notably professors Geoffrey Hinton, Yoshua Bengio, and Jürgen Schmidhuber. Under the boycott, signatories will refuse all contact and academic collaboration with KAIST until the university provides assurances that any weaponry it develops will be under “meaningful human control.”
The trigger for the boycott was KAIST’s announcement in February that it was launching a joint research center with South Korean defense company Hanwha Systems. According to The Korea Times, the goal of the center is to “develop artificial intelligence (AI) technologies to be applied to military weapons” that would “search for and eliminate targets without human control.”
The partnership brings together two of the world’s leading robotics and military organizations. KAIST is a world-class research university, known for work such as the transforming DRC-HUBO robot, which won the 2015 DARPA Robotics Challenge.
Hanwha Systems, meanwhile, is the defense subsidiary of South Korea’s powerful Hanwha chaebol. Hanwha is already involved in the development of autonomous weapons such as the SGR-A1 sentry gun, which has reportedly been deployed on the border between North and South Korea. The company also builds cluster munitions, which are banned by international treaty (although many nations, including South Korea, the US, Russia, and China, are not party to the ban).
Although a boycott of KAIST is significant, some experts say the broader campaign to control the development of autonomous weaponry is futile.
Previously, leaders in AI and robotics have written to the UN arguing that weapons that kill without human intervention could destabilize the world and should be controlled by international treaty. This has received some international support, with 19 countries, including Egypt, Argentina, and Pakistan, backing such an initiative. But other countries, like the US and UK, say such legislation would be impractical because of the impossibility of defining what does and does not constitute human control. Many systems already have at least some autonomous capabilities, including drones and missile defense networks.
For Walsh and others, though, the danger is too great to be complacent. “If developed, autonomous weapons will […] permit war to be fought faster and at a scale greater than ever before,” said Walsh in a press statement. “This Pandora’s box will be hard to close if it is opened.”