
Published 2017/1/2 6:51:01
Three Mistakes Silicon Valley Companies Made in 2016: The User Is Always Right



In 2016, Silicon Valley startups, unicorns, and listed companies made many mistakes. The Fast Company website lists three of the biggest problems as lessons for the future development of technology companies.

1. Don't backslide on workforce diversity

Two years ago, Apple, Facebook, Google, Twitter, and other leading companies released workforce diversity reports, hoping to change the homogeneous makeup of their workforces. However, although all of these companies positioned diversity as a priority, progress on inclusiveness has been slow. At most of these companies, the percentage of Black employees has remained below 5%, and women make up less than one third of the workforce. In management and technical positions, those proportions are even lower. Given the lack of progress, several companies, including Twitter, Pinterest, eBay, and Salesforce, have delayed the release of their latest diversity reports.

Despite the limited progress, giving up on workforce diversity efforts is not a good idea. The key point of the diversity reports is to keep diversity a high-priority task. The reports show that the workforce at top technology companies is still highly homogeneous. To promote ethnic, racial, and gender diversity, we need to keep exploring questions of inclusiveness.

2. Algorithms have limits

In 2016, Facebook repeatedly tried to replace human judgment with technology, which demonstrated the limitations of computer algorithms. At the end of 2015, Facebook activated its Safety Check feature immediately after the Paris terrorist attacks, but did not do so after a suicide bombing in Beirut, drawing criticism. At the time, Facebook said the Paris attacks were the first terrorist attack for which it had enabled Safety Check (the feature is usually used for natural disasters). In 2016, Facebook designed a mechanism that automatically triggers Safety Check when many people are discussing a local disaster. Emergency response organizations were concerned about the launch of this feature, because it gives Facebook the power to judge whether a situation counts as a crisis.
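The community-triggered activation described above can be sketched as a simple threshold check. This is purely illustrative: the function name, the post threshold, and the input format are assumptions, not Facebook's actual system.

```python
from collections import Counter

# Assumed threshold: how many local crisis-related posts it takes to
# trigger the safety-check feature for a region (illustrative value).
POSTS_THRESHOLD = 1000

def regions_to_activate(posts):
    """Return the regions where the safety check should switch on.

    posts: iterable of (region, mentions_crisis) tuples taken from
    recent user activity; mentions_crisis is True when the post
    appears to discuss a local disaster.
    """
    crisis_posts = Counter(
        region for region, mentions_crisis in posts if mentions_crisis
    )
    return {region for region, n in crisis_posts.items() if n >= POSTS_THRESHOLD}

# Example: heavy crisis discussion in one region, scattered posts elsewhere.
posts = [("Paris", True)] * 1200 + [("Lyon", True)] * 3 + [("Paris", False)] * 50
active = regions_to_activate(posts)
```

The concern raised by emergency responders maps directly onto the threshold: whoever picks `POSTS_THRESHOLD` is effectively deciding what counts as a crisis.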

As Rebecca Gustafson said earlier in 2016: "In the emergency response field, the most important issue we talk about is that wrong information is scarier than no information. Emergency response organizations get criticized for reacting too slowly, but the speed at which technology companies move has become distorted."

During the 2016 U.S. election cycle, fake news on the Facebook platform also drew heavy criticism. Responding to questions about Facebook spreading false news, Facebook CEO Mark Zuckerberg said that community flagging would help stop false content, and that Facebook did not want to become the arbiter of truth. Both the Safety Check episode and the spread of fake news show that algorithms still cannot distinguish accurate information from exaggerated information. Under public pressure, Facebook announced that it would work with third parties to flag fake news. In the future, suspect stories will be labeled and ranked lower in the News Feed.
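The flag-and-downrank approach described above can be sketched in a few lines: flagged stories keep circulating but are demoted in the feed ordering. All names and the penalty factor are hypothetical illustrations, not Facebook's actual ranking system.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    base_score: float       # engagement-based ranking score
    disputed: bool = False  # set when third-party fact-checkers flag the story

# Assumed multiplier that pushes flagged stories down the feed.
DISPUTED_PENALTY = 0.5

def rank_feed(stories):
    """Order stories by score, demoting ones marked as disputed."""
    def effective_score(story):
        penalty = DISPUTED_PENALTY if story.disputed else 1.0
        return story.base_score * penalty
    return sorted(stories, key=effective_score, reverse=True)

feed = rank_feed([
    Story("Verified report", base_score=0.8),
    Story("Viral hoax", base_score=0.9, disputed=True),
])
```

Note the design choice this illustrates: the platform never decides truth outright; it only applies a penalty supplied by outside fact-checkers, which is exactly how Zuckerberg framed avoiding the "arbiter of truth" role.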

Still, one lesson from 2016 is that inefficiency cannot be solved entirely by machines; many problems still require human intervention.

3. The user is always right

In 2016, we saw once again how important it is to meet user needs. Facebook is one such example; Evernote is another. Recently, Evernote issued a new privacy policy stipulating that users' notes would be periodically reviewed by Evernote employees working on machine learning. Evernote's CEO explained that notes shared with employees would have identifying information removed, but users were still worried. After a strong backlash, Evernote decided to make the program opt-in. Under the new policy, notes will be shared with Evernote employees only for users who agree to participate.

Jessica Alba's Honest Company was accused of changing its laundry detergent formula after The Wall Street Journal reported that the company used sodium coco sulfate. In 2016, Honest Company faced a series of consumer complaints questioning whether its products are truly natural or organic. Honest Company told The Wall Street Journal that it adjusted the formula to "optimize product performance."

What does all this mean for 2017? As technology companies explore machine learning, driverless cars, and other kinds of automation more deeply, one thing to keep in mind is that we must recognize the limits of technology and remember what humans are good at. Transparency is key; withholding information damages previously established relationships. In addition, when developing new products, companies need to think about the positive changes their business can bring to the world, and listen to users' opinions.











