The age of the killer robot is no longer a sci-fi fantasy

Author: unknown   Published: 2010-01-28 09:52:01   Source: web

  In the dark, in the silence, in a blink, the age of the autonomous killer robot has arrived. It is happening. They are deployed. And – at their current rate of acceleration – they will become the dominant method of war for rich countries in the 21st century. These facts sound, at first, preposterous. The idea of machines that are designed to whirr out into the world and make their own decisions to kill is an old sci-fi fantasy: picture a mechanical Arnold Schwarzenegger blasting a truck and muttering: "Hasta la vista, baby." But we live in a world of such whooshing technological transformation that the concept has leaped in just five years from the cinema screen to the battlefield – with barely anyone back home noticing.
  When the US invaded Iraq in 2003, it had no robots as part of its force. By the end of 2005, it had 2,400. Today, it has 12,000, carrying out 33,000 missions a year. A report by the US Joint Forces Command says autonomous robots will be the norm on the battlefield within 20 years.
  The Nato forces now depend on a range of killer robots, largely designed by the British Ministry of Defence labs privatised by Tony Blair in 2001. Every time you hear about a "drone attack" against Afghanistan or Pakistan, that's an unmanned robot dropping bombs on human beings. Push a button and it flies away, kills, and comes home. Its robot-cousin on the battlefields below is called SWORDS: a human-sized robot that can see 360 degrees around it and fire its machine-guns at any target it "chooses". Fox News proudly calls it "the GI of the 21st century." And billions are being spent on the next generation of warbots, which will leave these models looking like the bulky box on which you used to play Pong.
  At the moment, most are controlled by a soldier – often 7,500 miles away – with a control panel. But insurgents are always inventing new ways to block the signal from the control centre, which causes the robot to shut down and "die". So the military is building "autonomy" into the robots: if they lose contact, they start to make their own decisions, in line with a pre-determined code.
  This is "one of the most fundamental changes in the history of human warfare," according to PW Singer, a former analyst for the Pentagon and the CIA, in his must-read book, Wired for War: The Robotics Revolution and Conflict in the 21st Century. Humans have been developing weapons that enable us to kill at ever-greater distances and in ever-greater numbers, from the longbow to the cannon to the machine-gun to the nuclear bomb. But these robots mark a different stage.
  The earlier technologies made it possible for humans to decide to kill in more "sophisticated" ways – but once you programme and unleash an autonomous robot, the war isn't fought by you any more: it's fought by the machine. The subject of warfare shifts.
  The military claim this is a safer model of combat. Gordon Johnson of the Pentagon's Joint Forces Command says of the warbots: "They're not afraid. They don't forget their orders. They don't care if the guy next to them has been shot. Will they do a better job than humans? Yes." Why take a risk with your soldier's life, if he can stay in Arlington and kill in Kandahar? Think of it as War 4.0.
  But the evidence punctures this techno-optimism. We know the programming of robots will regularly go wrong – because all technological programming regularly goes wrong. Look at the place where robots are used most frequently today: factories. Some 4 per cent of US factories have "major robotics accidents" every year – a man having molten aluminium poured over him, or a woman picked up and placed on a conveyor belt to be smashed into the shape of a car. The former Japanese Prime Minister Junichiro Koizumi was nearly killed a few years ago after a robot attacked him on a tour of a factory. And remember: these are robots that aren't designed to kill.
  Think about how maddening it is to deal with a robot on the telephone when you want to pay your phone bill. Now imagine that robot had a machine-gun pointed at your chest.
  Robots find it almost impossible to distinguish an apple from a tomato: how will they distinguish a combatant from a civilian? You can't appeal to a robot for mercy; you can't activate its empathy. And afterwards, who do you punish? Marc Garlasco, of Human Rights Watch, says: "War crimes need a violation and an intent. A machine has no capacity to want to kill civilians.... If they are incapable of intent, are they incapable of war crimes?"
  Robots do make war much easier – for the aggressor. You are taking much less physical risk with your people, even as you kill more of theirs. One US report recently claimed they will turn war into "an essentially frictionless engineering exercise". As Larry Korb, Ronald Reagan's assistant secretary of defence, put it: "It will make people think, 'Gee, warfare is easy.'"
  If virtually no American forces had died in Vietnam, would the war have stopped when it did – or would the systematic slaughter of the Vietnamese people have continued for many more years? If "we" weren't losing anyone in Afghanistan or Iraq, would the call for an end to the killing be as loud? I'd like to think we are motivated primarily by compassion for civilians on the other side, but I doubt it. Take "us" safely out of the picture and we will be more willing to kill "them".
  There is some evidence that warbots will also make us less inhibited in our killing. When another human being is standing in front of you, when you can stare into their eyes, it's hard to kill them. When they are half the world away and little more than an avatar, it's easy. A young air force lieutenant who fought through a warbot told Singer: "It's like a video game [with] the ability to kill. It's like ... freaking cool."
  When the US First Marine Expeditionary Force in Iraq was asked in 2006 what kind of robotic support it needed, they said they had an "urgent operational need" for a laser mounted on to an unmanned drone that could cause "instantaneous burst-combustion of insurgent clothing, a rapid death through violent trauma, and more probably a morbid combination of both". The request said it should be like "long-range blow torches or precision flame-throwers". They wanted to do with robots things they would find almost unthinkable face-to-face.
  While "we" will lose fewer people at first by fighting with warbots, this way of fighting may well catalyse greater attacks on us in the long run. US army staff sergeant Scott Smith boasts warbots create "an almost helpless feeling.... It's total shock and awe." But while terror makes some people shut up, it makes many more furious and determined to strike back.
  Imagine if the beaches at Dover and the skies over Westminster were filled with robots controlled from Tora Bora, or Beijing, that could shoot us at any time. Some would scuttle away – and many would be determined to kill "their" people in revenge. The Lebanese editor Rami Khouri says that when Lebanon was bombarded by largely unmanned Israeli drones in 2006, it only "enhanced the spirit of defiance" and made more people back Hezbollah.
  Is this a rational way to harness our genius for science and spend tens of billions of pounds? The scientists who were essential to developing the nuclear bomb – including Albert Einstein, Robert Oppenheimer, and Andrei Sakharov – turned on their own creations in horror and begged for them to be outlawed. Some distinguished robotics scientists, like Illah Nourbakhsh, are getting in early, and saying the development of autonomous military robots should be outlawed now.
  There are some technologies that are so abhorrent to human beings that we forbid them outright. We have banned war-lasers that permanently blind people along with poison gas. The conveyor belt dragging us ever closer to a world of robot wars can be stopped – if we choose to.
  All this money and all this effort can be directed towards saving life, not ever-madder ways of taking it. But we have to decide to do it. We have to make the choice to look the warbot in the eye and say, firmly and forever, "Hasta la vista, baby."
