
Published: 2016/6/23 8:46:18


Baidu's Silicon Valley AI Lab Publishes New Results: Recurrent Neural Networks Ready to Take Off - Baidu - IT News

June 23 news: Baidu's artificial intelligence laboratory in Silicon Valley recently released a new technique that speeds up the training of deep recurrent neural networks (RNNs).

The technique was recently posted on GitHub by Jesse Engel, a scientist at Baidu's Silicon Valley AI Lab. Engel said that the previously published first-stage results focused on how minibatch size and memory layout affect the performance of the general matrix multiply (GEMM) operations inside recurrent layers; the second-stage release announced here focuses on optimizing the algorithms themselves.
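The GEMM dependence mentioned above can be illustrated with a minimal sketch (not Baidu's code; all names below are hypothetical): a single timestep of a vanilla recurrent layer reduces to two matrix multiplies whose shapes depend on the minibatch size, which is why minibatch and memory configuration dominate RNN training throughput.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b):
    # x_t:    (batch, input_dim)   input at this timestep
    # h_prev: (batch, hidden_dim)  previous hidden state
    # W_xh:   (input_dim, hidden_dim)  input-to-hidden weights
    # W_hh:   (hidden_dim, hidden_dim) hidden-to-hidden weights
    # Both matmuls below map directly onto GEMM calls; their
    # efficiency varies strongly with the batch dimension.
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b)

rng = np.random.default_rng(0)
batch, input_dim, hidden_dim = 32, 64, 128
x = rng.standard_normal((batch, input_dim))
h = np.zeros((batch, hidden_dim))
W_xh = rng.standard_normal((input_dim, hidden_dim)) * 0.01
W_hh = rng.standard_normal((hidden_dim, hidden_dim)) * 0.01
b = np.zeros(hidden_dim)

h_next = rnn_step(x, h, W_xh, W_hh, b)
print(h_next.shape)  # (32, 128)
```

Because the recurrence makes each timestep depend on the previous one, these small GEMMs cannot be trivially batched across time, which is what makes their per-call efficiency so important.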

In his post, Engel notes that differentiable graphs are a simple, practical, and visual tool for computing complex derivatives, and that they can also suggest algorithmic optimizations. The technique should strengthen the research and development capabilities of researchers who use frameworks requiring explicit gradient computation, of those developing new iterative algorithms, and of those building deep learning frameworks with automatic differentiation.
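To make the contrast between explicit gradients and automatic differentiation concrete, here is a toy reverse-mode autodiff sketch (a generic illustration of the idea behind differentiable graphs, not Baidu's technique): each operation records itself in a graph, so the gradient of a complex expression can be computed mechanically instead of being derived by hand.

```python
class Node:
    """A value in a computation graph, recording how it was produced."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent_node, local_gradient)
        self.grad = 0.0

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Node(self.value * other.value,
                    [(self, other.value), (other, self.value)])

    def __add__(self, other):
        # d(a+b)/da = d(a+b)/db = 1
        return Node(self.value + other.value,
                    [(self, 1.0), (other, 1.0)])

def backward(output):
    # Propagate each path's gradient contribution edge by edge;
    # summing over all paths implements the chain rule on a DAG.
    output.grad = 1.0
    stack = [(output, 1.0)]
    while stack:
        node, upstream = stack.pop()
        for parent, local_grad in node.parents:
            contrib = upstream * local_grad
            parent.grad += contrib
            stack.append((parent, contrib))

x = Node(3.0)
y = Node(4.0)
z = x * y + x          # z = x*y + x
backward(z)
print(x.grad, y.grad)  # dz/dx = y + 1 = 5.0, dz/dy = x = 3.0
```

Production frameworks do the same bookkeeping over tensor operations; the graph view also exposes optimization opportunities (operation fusion, reordering) of the kind the post discusses.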

In recent years, driven by progress in big data, massive computing power, complex models, and efficient algorithms, the training data used for deep learning has multiplied year after year: image tasks now use tens of millions of training samples, speech tens of billions, and advertising hundreds of billions. At the same time, multi-machine GPU/CPU distributed computing capacity has grown significantly; models have evolved from large linear models to tree models whose features can be sent to different machines; and on the algorithmic side, distributed algorithms such as deep neural networks are being applied. Deep learning technology is gradually entering every area of mobile Internet applications, making mobile Internet products more intelligent and user-friendly.

In the future, this technique can be applied to more Baidu products, advancing Baidu's deep learning research and its deployment across Baidu's services.

It is worth noting that, as Baidu's Silicon Valley laboratory enters a stable development phase, it is conducting research under the leadership of Andrew Ng, the former "father of the Google Brain," on topics such as using GPUs to improve computational efficiency, processing massive training data, speech recognition, OCR, face recognition, and image search. As Baidu's Chief Scientist, Ng personally leads the development of speech recognition and driverless car technology.

Public records show that Baidu formally established its Institute of Deep Learning (IDL) in 2013 and, in May 2014, recruited Andrew Ng, the father of the Google Brain, from Google to serve as Chief Scientist and lead the company's Silicon Valley research institute.







