Law Always Lags Behind Technology. What About Ethics? - Yeeyan (译言)

“If technology can mold us, and technologists are the ones who shape that technology, we should demand some level of ethics training for technologists.”
Rethinking Ethics Training in Silicon Valley - The Atlantic
Translator: lxdhk. Original author: Irina Raicu.
I work at an ethics center in Silicon Valley.

I know, I know, “ethics” is not the first word that comes to mind when most people think of Silicon Valley or the tech industry. It’s probably not even in the top 10. But given the outsized role that tech companies now play, it’s time to focus on the ethical responsibilities of the technologists who help shape our lives.
In a recent talk, technologist Maciej Ceglowski argued that “[t]his year especially there’s an uncomfortable feeling in the tech industry that we did something wrong, that in following our credo of ‘move fast and break things,’ some of what we knocked down were the load-bearing walls of our democracy.”
This was not unforeseeable—or even unforeseen. In 2014, for example, in an article titled “How Facebook Is Shaping Who Will Win the Next Election,” Tarun Wadhwa cited a study published in 2012: “A 61-Million-Person Experiment in Social Influence and Political Mobilization.” The study’s authors reported on “a randomized controlled trial of political mobilization messages delivered to 61 million Facebook users during the 2010 US congressional elections. The results show that the messages directly influenced political self-expression, information seeking and real-world voting behaviour of millions of people.”
So we already knew that tools for “maximizing engagement” can shape the political sphere. In 2014, Wadhwa concluded, “Whether it wants this responsibility or not, Facebook has now become an integral part of the democratic process globally.”

We also know that technology can be harmful to our democracy. Privacy invasions and algorithmic manipulation, for example, can limit the ability to research and formulate opinions, and then in turn affect how people express views—even via voting. When companies implement practices that are good for targeted advertising but bad for individuals’ democratic engagement (like, for example, the practices involved in the use of “dark posts” on Facebook, tied to the creation of psychological profiles for hundreds of millions of Facebook users in the U.S.), the benefits-versus-harms balance tilts pretty sharply.
Who minds that balance?
You often hear the adage that law can’t keep up with technology. What about ethics? Ethics, too, is deliberative, and new norms take some time to develop; but an initial ethical analysis of a new development or practice can happen fairly quickly. Many technologists, however, are not encouraged to conduct that analysis, even superficially. They are not even taught to spot an ethical issue—and some (though certainly not all) seem surprised when backlash ensues against some of their creations. (See, for example, the critical coverage of the now-defunct Google Buzz, or more recent reaction to, say, “Hello Barbie.”)
A growing chorus has argued that we need a code of ethics for technologists. That’s a start, but we need more than that. If technology can mold us, and technologists are the ones who shape that technology, we should demand some level of ethics training for technologists. And that training should not be limited to the university context; an ethics training component should also be included in the curriculum of any developer “bootcamp,” and maybe in the onboarding process when tech companies hire new employees.
Such training would not inoculate technologists against making unethical decisions—nothing can do that, and in some situations we may well reach no consensus on what the ethical action is. Such training, however, would prepare them to make more thoughtful decisions when confronted, say, with ethical dilemmas that involve conflicts between competing goods. It would help them make choices that better reflect their own values.
Sometimes, we need consumers and regulators to push back against Big Tech. But in his talk titled “Build a Better Monster: Morality, Machine Learning, and Mass Surveillance,” Maciej Ceglowski argues that “[t]he one effective lever we have against tech companies is employee pressure. Software engineers are difficult to hire, expensive to train, and take a long time to replace.” If he is right, then tech employees might have even more power than people realized—or at least an additional kind of power they can wield. All the more reason why we should demand that technologists get at least some ethics training and recognize their role in defending democracy.
I work in an applied ethics center, and we do believe that technology can help democracy (we offer a free ethical-decision-making app, for example; we even offer a MOOC—a free online course—on ethical campaigning!). For it to do that, though, we need people who are ready to tackle the ethical questions—both within and outside of tech companies.
Copyright notice:
This translation is for learning and exchange purposes only. For non-commercial reposting, please credit the translator and the source, and keep the full link to the article on Yeeyan. For commercial cooperation, contact editor@yeeyan.com. Original article: https://www.theatlantic.com/technology/archive/2017/05/rethinking-ethics-training-in-silicon-valley/525456/
