
[TED] We're Building a Dystopia Just to Make People Click on Ads

 

So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok. You know? Terminator? You know, that might be something to consider, but that's a distant threat. Or, we fret about digital surveillance with metaphors from the past. "1984," George Orwell's "1984," it's hitting the bestseller lists again. It's a great book, but it's not the correct dystopia for the 21st century.

What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways. Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent.

Now, artificial intelligence has started bolstering their business as well. And it may seem like artificial intelligence is just the next thing after online ads. It's not. It's a jump in category. It's a whole different world, and it has great potential. It could accelerate our understanding of many areas of study and research. But to paraphrase a famous Hollywood philosopher, "With prodigious potential comes prodigious risk."

Now let's look at a basic fact of our digital lives, online ads. Right? We kind of dismiss them. They seem crude, ineffective.
We've all had the experience of being followed on the web by an ad based on something we searched or read. You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go. Even after you succumb and buy them, they're still following you around. We're kind of inured to that kind of basic, cheap manipulation. We roll our eyes and we think, "You know what? These things don't work." Except, online, the digital technologies are not just ads.

Now, to understand that, let's think of a physical world example. You know how, at the checkout counters at supermarkets, near the cashier, there's candy and gum at the eye level of kids? That's designed to make them whine at their parents just as the parents are about to sort of check out. Now, that's a persuasion architecture. It's not nice, but it kind of works. That's why you see it in every supermarket.

Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right? And the candy and gum, it's the same for everyone, even though it mostly works only for people who have whiny little humans beside them. In the physical world, we live with those limitations.

In the digital world, though, persuasion architectures can be built at the scale of billions and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to everyone's private phone screen, so it's not visible to us.
And that's different. And that's just one of the basic things that artificial intelligence can do.

Now, let's take an example. Let's say you want to sell plane tickets to Vegas. Right? So in the old world, you could think of some demographics to target based on experience and what you can guess. You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right? That's what you would do in the past.

With big data and machine learning, that's not how it works anymore. So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too. Increasingly, it tries to match you with your offline data. It also purchases a lot of data from data brokers. It could be everything from your financial records to a good chunk of your browsing history. Right? In the US, such data is routinely collected, collated and sold. In Europe, they have tougher rules.

So what happens then is, by churning through all that data, these machine-learning algorithms -- that's why they're called learning algorithms -- they learn to understand the characteristics of people who purchased tickets to Vegas before. When they learn this from existing data, they also learn how to apply this to new people.
So if they're presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not. Fine. You're thinking, an offer to buy tickets to Vegas. I can ignore that. But the problem isn't that. The problem is, we no longer really understand how these complex algorithms work. We don't understand how they're doing this categorization. It's giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it's operating any more than you'd know what I was thinking right now if you were shown a cross section of my brain. It's like we're not programming anymore, we're growing intelligence that we don't truly understand.

And these things only work if there's an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine learning algorithms can work. That's why Facebook wants to collect all the data it can about you. The algorithms work better.

So let's push that Vegas example a bit. What if the system that we do not understand was picking up that it's easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase. Such people tend to become overspenders, compulsive gamblers. They could do this, and you'd have no clue that's what they were picking up on.
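The two-step pattern described here — learn the characteristics of past buyers, then score new people — can be sketched with a toy classifier. Everything below (the feature names, the data, the choice of a hand-rolled logistic regression) is invented for illustration; real systems operate on the "giant matrices" of millions of columns that make them inscrutable.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(rows, labels, epochs=2000, lr=0.1):
    """Plain logistic regression fit by stochastic gradient descent."""
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Probability that a person with features x buys a ticket."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Hypothetical behavioral features per user:
# [posts_about_travel, late_night_activity, casino_page_likes]
past_users = [
    [1.0, 0.2, 0.9],  # bought a Vegas ticket
    [0.9, 0.8, 0.7],  # bought
    [0.1, 0.1, 0.0],  # did not buy
    [0.2, 0.3, 0.1],  # did not buy
]
bought = [1, 1, 0, 0]

w, b = train(past_users, bought)          # step 1: learn from past buyers
new_person = [0.8, 0.5, 0.6]
print(round(predict(w, b, new_person), 2))  # step 2: score a new person
```

Note that even in this four-column toy, the learned weights `w` are just numbers; nothing in them says *why* a person scores high, which is the opacity problem at scale.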
I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me. He was troubled and he said, "That's why I couldn't publish it." I was like, "Couldn't publish what?" He had tried to see whether you can indeed figure out the onset of mania from social media posts before clinical symptoms, and it had worked, and it had worked very well, and he had no idea how it worked or what it was picking up on.

Now, the problem isn't solved if he doesn't publish it, because there are already companies that are developing this kind of technology, and a lot of the stuff is just off the shelf. This is not very difficult anymore.

Do you ever go on YouTube meaning to watch one video and an hour later you've watched 27? You know how YouTube has this column on the right that says, "Up next" and it autoplays something? It's an algorithm picking what it thinks that you might be interested in and maybe not find on your own. It's not a human editor. It's what algorithms do. It picks up on what you have watched and what people like you have watched, and infers that that must be what you're interested in, what you want more of, and just shows you more. It sounds like a benign and useful feature, except when it isn't.

So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too.
And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.

Well, you might be thinking, this is politics, but it's not. This isn't about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It's like you're never hardcore enough for YouTube.

(Laughter)

So what's going on? Now, YouTube's algorithm is proprietary, but here's what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they're more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads.

Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads.
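The self-reinforcing "up next" dynamic described a moment ago can be illustrated with a deliberately simplified loop. YouTube's actual system is proprietary and vastly more complex; the "intensity" scores and the watch-time model below are pure assumptions, kept only to show how optimizing watch time alone can produce escalation.

```python
# Assumed premise: the learned model has found that viewers watch longest
# when the next video is a notch *more* intense than the one just watched.
def predicted_watch_time(current_intensity, candidate_intensity):
    # Peaks when the candidate sits one step above the current video.
    return -abs(candidate_intensity - (current_intensity + 1))

def up_next(current, candidates):
    # Pure engagement maximization: no human editor, no ethics check.
    return max(candidates, key=lambda c: predicted_watch_time(current, c))

catalog = list(range(10))  # intensity levels 0 (mild) .. 9 (extreme)
current = 2                # the viewer starts somewhere moderate
history = [current]
for _ in range(5):
    current = up_next(current, catalog)
    history.append(current)

print(history)  # → [2, 3, 4, 5, 6, 7]: the rabbit hole, one autoplay at a time
```

Nothing in the loop "wants" extremism; the ratchet falls out of the objective function, which is the talk's point.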
They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too. Now, this may sound like an implausible example, but this is real. ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience. BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too. And it wasn't even expensive. The ProPublica reporter spent about 30 dollars to target this category.

So last year, Donald Trump's social media manager disclosed that they were using Facebook dark posts to demobilize people, not to persuade them, but to convince them not to vote at all. And to do that, they targeted specifically, for example, African-American men in key cities like Philadelphia, and I'm going to read exactly what he said. I'm quoting. They were using "nonpublic posts whose viewership the campaign controls so that only the people we want to see it see it. We modeled this. It will dramatically affect her ability to turn these people out."

What's in those dark posts? We have no idea. Facebook won't tell us.

So Facebook also algorithmically arranges the posts that your friends put on Facebook, or the pages you follow.
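Look-alike expansion of the kind just described can be sketched as similarity matching: start from a seed audience that exhibited the trait explicitly, then target everyone whose behavioral profile resembles that seed. The features, numbers, threshold, and the cosine-similarity choice here are all hypothetical stand-ins for the platforms' learned models.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Seed audience: users who posted the explicit content themselves.
seed = [[0.9, 0.8, 0.1], [1.0, 0.7, 0.2]]
centroid = [sum(col) / len(seed) for col in zip(*seed)]

# Everyone else: nothing explicit on their profiles,
# but behavioral features exist for them anyway.
others = {
    "user_a": [0.8, 0.9, 0.2],
    "user_b": [0.1, 0.2, 0.9],
}

# Target whoever "looks like" the seed audience above a threshold.
lookalikes = [u for u, f in others.items() if cosine(centroid, f) > 0.9]
print(lookalikes)  # → ['user_a']
```

The uncomfortable property is visible even in the toy: `user_a` never said anything explicit, yet gets targeted anyway, purely by resemblance.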
It doesn't show you everything chronologically. It puts the order in the way that the algorithm thinks will entice you to stay on the site longer.

Now, so this has a lot of consequences. You may be thinking somebody is snubbing you on Facebook. The algorithm may never be showing your post to them. The algorithm is prioritizing some of them and burying the others. Experiments show that what the algorithm picks to show you can affect your emotions. But that's not all. It also affects political behavior.

So in 2010, in the midterm elections, Facebook did an experiment on 61 million people in the US that was disclosed after the fact. So some people were shown, "Today is election day," the simpler one, and some people were shown the one with that tiny tweak with those little thumbnails of your friends who clicked on "I voted." This simple tweak. OK? So the pictures were the only change, and that post shown just once turned out an additional 340,000 voters in that election, according to this research as confirmed by the voter rolls. A fluke? No. Because in 2012, they repeated the same experiment. And that time, that civic message shown just once turned out an additional 270,000 voters. For reference, the 2016 US presidential election was decided by about 100,000 votes.

Now, Facebook can also very easily infer what your politics are, even if you've never disclosed them on the site. Right? These algorithms can do that quite easily.
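The ranking change underneath all of this is easy to make concrete: the feed's sort key moves from recency to predicted engagement. The posts and scores below are invented; only the change of sort key is the point.

```python
# Hypothetical posts with a recency signal and a model-predicted
# engagement score (invented numbers for illustration).
posts = [
    {"id": "friend_update", "age_hours": 1, "predicted_engagement": 0.2},
    {"id": "outrage_bait",  "age_hours": 9, "predicted_engagement": 0.9},
    {"id": "vacation_pics", "age_hours": 5, "predicted_engagement": 0.6},
]

# The old contract: newest first.
chronological = sorted(posts, key=lambda p: p["age_hours"])

# The actual contract: whatever the model predicts will keep you scrolling.
engagement_ranked = sorted(posts, key=lambda p: -p["predicted_engagement"])

print([p["id"] for p in chronological])      # → newest first
print([p["id"] for p in engagement_ranked])  # → what you actually see
```

Under engagement ranking, your friend's quiet one-hour-old post can be buried beneath nine-hour-old bait, which is exactly the "snubbing" effect described above.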
What if a platform with that kind of power decides to turn out supporters of one candidate over the other? How would we even know about it?

Now, we started from someplace seemingly innocuous -- online ads following us around -- and we've landed someplace else. As a public and as citizens, we no longer know if we're seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we're just at the beginning stages of this.

These algorithms can quite easily infer things like your people's ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and genders, just from Facebook likes. These algorithms can identify protesters even if their faces are partially concealed. These algorithms may be able to detect people's sexual orientation just from their dating profile pictures.

Now, these are probabilistic guesses, so they're not going to be 100 percent right, but I don't see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems. Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people.
And here's the tragedy: we're building this infrastructure of surveillance authoritarianism merely to get people to click on ads. And this won't be Orwell's authoritarianism. This isn't "1984." Now, if authoritarianism is using overt fear to terrorize us, we'll all be scared, but we'll know it, we'll hate it and we'll resist it. But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they're doing it at scale through our private screens so that we don't even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider's web and we may not even know we're in it.

So Facebook's market capitalization is approaching half a trillion dollars. It's because it works great as a persuasion architecture. But the structure of that architecture is the same whether you're selling shoes or whether you're selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that's what's got to change.

Now, don't get me wrong, we use digital platforms because they provide us with great value. I use Facebook to keep in touch with friends and family around the world.
I've written about how crucial social media is for social movements. I have studied how these technologies can be used to circumvent censorship around the world. But it's not that the people who run, you know, Facebook or Google are maliciously and deliberately trying to make the country or the world more polarized and encourage extremism. I read the many well-intentioned statements that these people put out. But it's not the intent or the statements people in technology make that matter, it's the structures and business models they're building. And that's the core of the problem.

Either Facebook is a giant con of half a trillion dollars and ads don't work on the site, it doesn't work as a persuasion architecture, or its power of influence is of great concern. It's either one or the other. It's similar for Google, too.

So what can we do? This needs to change. Now, I can't offer a simple recipe, because we need to restructure the whole way our digital technology operates. Everything from the way technology is developed to the way the incentives, economic and otherwise, are built into the system. We have to face and try to deal with the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning's opacity, all this indiscriminate data that's being collected about us. We have a big task in front of us.
We have to mobilize our technology, our creativity and yes, our politics so that we can build artificial intelligence that supports us in our human goals but that is also constrained by our human values. And I understand this won't be easy. We might not even easily agree on what those terms mean. But if we take seriously how these systems that we depend on for so much operate, I don't see how we can postpone this conversation anymore.

These structures are organizing how we function and they're controlling what we can and we cannot do. And many of these ad-financed platforms, they boast that they're free. In this context, it means that we are the product that's being sold. We need a digital economy where our data and our attention is not for sale to the highest-bidding authoritarian or demagogue.

(Applause)

So to go back to that Hollywood paraphrase, we do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that, we must face this prodigious menace, open-eyed and now. Thank you.

(Applause)
