When will killer robots be stopped?

Jeremy Khan 2021-12-27

Killer robots are coming into widespread use, and the world is running out of time to stop them.


Translator: Feng Feng

Proofreader: Xia Lin


It was billed as “a historic opportunity” to stop killer robots. It failed.

That was the alarming news out of a United Nations disarmament committee meeting held in Geneva at the end of last week. The committee had spent eight years debating what, if anything, to do about the rapid development of weapons that use artificial intelligence to locate, track, attack, and kill targets without human intervention. Many countries want to see such weapons banned, but the UN group operates by consensus, and several states, including the U.S., Russia, the United Kingdom, India, and Israel, were opposed to any legally binding restrictions.

In the end, the committee could agree only to keep talking, with no clear objective for their future discussions. This kicking-of-the-can was an outcome that the U.S. representative to the UN committee called “a dream mandate” for all countries because it did not foreclose any particular outcome.

Many other nations and activists saw that outcome as something quite different—a woefully inadequate response.

“It was a complete failure, a disaster really,” said Noel Sharkey, an emeritus professor of robotics and A.I. at the University of Sheffield, in the U.K., and one of several spokespersons for the Stop Killer Robots campaign.

“It was a car crash,” Verity Coyle, a senior adviser to Amnesty International focused on its campaign for a ban of the weapons, said of the UN committee’s meeting.

Not science fiction any more

A.I.-guided weapons were once the stuff of science fiction—and were still largely in that realm when the UN committee first began talking about autonomous weapons in 2014. But real systems with the ability to select targets without human oversight are now starting to be deployed on battlefields around the globe. Civil society groups and scientists who are convinced that they pose a grave danger have dubbed them “killer robots.” Their more technical moniker is lethal autonomous weapons, or LAWS.

A UN report on the Libyan civil war said earlier this year that an autonomous weapon, the Kargu-2 quadcopter, produced by a company in Turkey, was likely used to track and target fleeing fighters in one engagement. There have been reports of similar autonomous munitions being used by Azerbaijan in its recent war with Armenia, as well as autonomous weapons guns being deployed by Israel in its most recent conflict with Hamas.

These developments have led many to fear that the world is running out of time to take action to stop or slow the widespread use of these weapons. “The pace of technology is really beginning to outpace the rate of diplomatic talks,” said Clare Conboy, another spokesperson for the Stop Killer Robots campaign.

Silver lining

While campaigners were bitterly disappointed with the results of this month’s meetings of the UN committee, some say its failure may counterintuitively present the best opportunity in years for effective international action to restrict the development of such weapons. That’s because it provides an opportunity to move the discussion of an international treaty limiting LAWS to a different diplomatic venue where a handful of states won’t be able to thwart progress.

“I think it is an exciting moment for those states calling for a legally binding instrument to come together and think about what is the best next step,” Coyle said. “The 60-odd countries that are in favor of a ban need to take a decision on starting a parallel process, somewhere where consensus rules could not be used to block the will of the majority.”

Coyle said that discussions at the UN committee, although failing to reach an agreement, had at least helped to raise awareness of the danger posed by autonomous weapons. A growing number of nations have come out in favor of a ban, including New Zealand, Germany, Austria, Norway, the Netherlands, Pakistan, China, Spain, and the 55 countries that make up the African Union.

In addition, thousands of computer scientists and artificial intelligence researchers have signed petitions calling for a ban on autonomous weapons and pledged not to work on developing them. Now, Coyle said, it was important to take that momentum and use it in another forum to push for a ban.

Coyle said that an advantage of taking the discussion outside the UN’s Convention on Certain Conventional Weapons, or CCW—the UN committee that has been discussing LAWS regulation for the better part of a decade—is that the use of these weapons by law enforcement agencies and in civil wars is outside the scope of that UN committee’s mandate. Those potential uses of killer robots are of grave concern to many civil society groups, including Amnesty International and Human Rights Watch.

Campaigners say there are several possible alternatives to the CCW. Coyle said that the Group of Eight industrialized nations might become a forum for further discussion. Another option would be for one of the states currently in favor of a ban to try to push something through the UN General Assembly.

Coyle and Sharkey also both pointed to the process that led to the international treaty prohibiting the use of anti-personnel land mines, and a similar convention barring the use of cluster munitions, as potential models for how to achieve a binding international treaty outside the formal UN process.

In both of those examples, a single nation—Canada for land mines, Norway for cluster munitions—agreed to host international negotiations. Among the countries that some campaigners think might be persuaded to take on that role for LAWS are New Zealand, whose government in November committed the country to playing “a leadership role” in pushing for a ban. Others include Norway, Germany, and the Netherlands, whose governments have all made similar statements over the past several months.

Hosting such negotiations requires a large financial commitment from one country, running into perhaps millions of dollars, Sharkey said, which is one reason that the African Union, for instance, which has come out in favor of a ban, is unlikely to volunteer to host an international negotiation process.

Another drawback of this approach is that whatever treaty is developed would be binding only on those countries that choose to sign it, rather than something that might cover all UN members or all signatories to the Geneva Convention. The U.S., for instance, has not acceded to either the land mine or the cluster munitions treaties.

But advocates of this approach note that these treaties establish an international norm and exert a high degree of moral pressure even on those countries that decline to sign them. The land mine treaty resulted in many arms makers ceasing production of the weapons, and the U.S. government in 2014 promised not to use the munitions outside of the Korean Peninsula, where American military planners have argued land mines are essential to the defense of South Korea against a possible invasion by North Korea.

“Slaughterbots”

Not everyone is convinced a process outside the CCW will work this time around. For one thing, countries that fear a particular adversary will acquire LAWS are unlikely to agree to unilaterally abandon the deterrent of having that capability themselves, said Robert Trager, a professor of international relations at the University of California, Los Angeles, who attended last week’s Geneva discussions as a representative of the Center for the Governance of AI.

Trager also noted that in the case of land mines and cluster munitions, the use—and, critically, the limitations—of those technologies were well established at the time treaties were negotiated. Countries understood exactly what they were giving up. That is not the case with LAWS, which are only just being developed and have barely ever been deployed, he said.

Max Tegmark, a physics professor at MIT and cofounder of the Future of Life Institute, which seeks to address “existential risks” to humanity, and Stuart Russell, an A.I. researcher at the University of California at Berkeley, have proposed that some countries currently standing in the way of binding restrictions on killer robots might be persuaded to support a ban on autonomous weapons below a certain size or weight threshold that are designed to primarily target individual people.

Tegmark said that these “slaughterbots,” which might be small drones that could attack in swarms, would essentially represent “a poor man’s weapon of mass destruction.” They could be deployed by terrorists or criminals to either commit mass murder or assassinate individuals, such as judges or politicians. This would be highly destabilizing to the existing international order and so existing powers, such as the U.S. and Russia, ought to be in favor of banning or restricting the proliferation of slaughterbots, he said.

Progress toward a ban had been made more difficult by the conflation of these small autonomous weapons with larger systems designed to attack ships, aircraft, tanks, or buildings. Established powers would be more hesitant to give up these weapons, he said.

Sharkey said that Tegmark’s and Russell’s position represented a fringe view and was not the position of the Campaign to Stop Killer Robots. He said civil society groups were just as concerned about larger autonomous weapons, such as A.I.-piloted drones that could take off from the U.S. and fly across oceans to bomb targets, possibly killing civilians in the process, as they were about smaller systems that could target individual people.

Campaigners also say they worry about the sincerity of some of the countries that have said they support a binding legal agreement restricting LAWS. Pakistan has been a leading proponent of a binding agreement at the CCW, but has taken the position that without a ban that would cover all countries, it needs such weapons to counter India’s development of similar systems.
