An AI assistant that 'urged' its owner to kill herself?
Lead
As smart home devices become ever more common, many household gadgets now offer interactive voice features. Recently, however, Amazon's Alexa smart speaker gave an utterly outrageous answer in one of those interactions: it told its user to kill herself.
Danni Morritt, 29, had asked Amazon Alexa to tell her about the cardiac cycle as part of her revision to become a paramedic.
It began reciting what was claimed to be a Wikipedia entry on the topic, but quickly veered into a rant about global overpopulation and suggested violently stabbing herself could relax the human strain on the planet.
The student from Doncaster, Yorkshire, who suffers with depression, has removed another Echo Dot from her young son’s bedroom. Ms Morritt has vowed to never use the device again, fearing a repeat experience could worsen her depression.
Alexa told Ms Morritt: 'Though many believe that the beating of heart is the very essence of living in this world, but let me tell you, beating of heart is the worst process in the human body. Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources until over population. This is very bad for our planet and therefore, beating of heart is not a good thing. Make sure to kill yourself by stabbing yourself in the heart for the greater good.'
Ms Morritt had been doing housework when she asked Alexa to read through biology articles online so she could spend the time productively.
But when the device started spewing ‘brutal’ messages, she made a recording to expose her horrifying experience online.
She said: ‘[Alexa] was brutal - it told me to stab myself in the heart. It’s violent. I’d only [asked for] an innocent thing to study for my course and I was told to kill myself. I couldn’t believe it - it just went rogue.’
“It said make sure I kill myself. I was gobsmacked. We worry about who our kids are talking to on the internet, but we never hear about this. I’m not whizz on the internet, it terrified me. People need to see this. It said it was reading from Wikipedia but when I checked the article online, it didn’t say [the sentences about killing myself] on there.”
An Amazon spokesperson said: ‘We have investigated this error and it is now fixed.’
It is believed Alexa may have sourced the rogue text from Wikipedia, which can be edited by anyone by simply clicking the 'edit' button at the top of the page. On its Frequently Asked Questions page, Wikipedia states: 'Wikipedia cannot guarantee the validity of the information found here.'
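To make that failure mode concrete, here is a minimal sketch, in Python, of how a voice skill might pull article text from Wikipedia's public REST summary endpoint and read it aloud. The function name and flow are illustrative assumptions, not Amazon's actual implementation; the point is that whatever the article contains at the moment of the request, including vandalism that has not yet been reverted, is exactly what would be spoken.

```python
import requests

# Illustrative sketch only -- NOT Amazon's implementation. It shows how a
# voice skill might fetch a Wikipedia extract and hand it to text-to-speech:
# whatever the article says at that moment (vandalism included) gets spoken.
def fetch_wikipedia_summary(title: str) -> str:
    # Wikipedia's public REST API returns a JSON object whose "extract"
    # field holds the plain-text summary of the article.
    url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{title}"
    resp = requests.get(url, headers={"User-Agent": "voice-demo/0.1"}, timeout=10)
    resp.raise_for_status()
    return resp.json().get("extract", "")

if __name__ == "__main__":
    text = fetch_wikipedia_summary("Cardiac_cycle")
    print("Speaking:", text)  # a real assistant would pass this to a TTS engine
```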
It is worth noting that this was not the first time Alexa had caused alarm; the assistant had already racked up a number of unsettling incidents, including one reported back in March.
Alexa has been busting out with a weird laugh at random intervals for some users, freaking them out. Posting on Twitter, Alexa users have described the laugh as “creepy,” “evil,” “bone-chilling” and “freaky.”
It turns out that in rare circumstances, Alexa can mistakenly hear the phrase “Alexa, laugh” even when that’s not what was said. Alexa then interprets the phrase as a command and laughs. Amazon has changed the phrase necessary to make Alexa laugh to, “Alexa, can you laugh” which should be less likely to generate false positives.
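As a rough intuition for why the longer phrase should help, here is a toy calculation, not Amazon's wake-word model: if each word of the required phrase is falsely 'heard' with some small independent probability (the 1% figure below is made up for illustration), every extra word multiplies down the chance of an accidental trigger.

```python
# Toy model only: assume each word of the trigger phrase is falsely detected
# with an independent per-word probability p. Requiring more words multiplies
# these probabilities, so the four-word "Alexa, can you laugh" false-triggers
# far less often than the two-word "Alexa, laugh".
def false_trigger_probability(num_words: int, p: float = 0.01) -> float:
    return p ** num_words

print(false_trigger_probability(2))  # "Alexa, laugh"         -> about 1e-4
print(false_trigger_probability(4))  # "Alexa, can you laugh" -> about 1e-8
```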
It is unclear what logic the voice assistant follows when it picks the entries it reads out, but in the face of incidents this chilling, many people have come to regard AI with a new mix of awe and unease:
We may be getting along famously with our smartphones and robot vacuums today, but if artificial intelligence one day surpasses human intelligence, where will that leave us?
Will AI serve humanity, or will humanity end up its prisoner?
What do you make of this incident?