Residual Networks Behave Like Ensembles of Relatively Shallow Networks
2018-02-06 21:01
Abstract
This is a NIPS 2016 paper by authors from Cornell University that offers an interpretation of residual networks. First, using an "unraveled" view, it shows that a ResNet is equivalent to a collection of paths of differing lengths. Next, a lesion study reveals that these paths do not depend strongly on one another, even though they are trained jointly. Finally, experiments show that most effective paths are short: the short paths do the learning, while the long paths contribute no gradient. The conclusion is that in a 110-layer ResNet most of the gradient comes from paths only 10 to 34 layers deep; in this sense, ResNets do not resolve the vanishing-gradient problem by preserving gradient flow through the entire depth of the network.
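The unraveled view implies that a ResNet with $n$ residual blocks contains $2^n$ implicit paths, and a path's length (the number of residual branches it passes through) follows a Binomial($n$, 0.5) distribution. A minimal sketch of this counting argument, assuming 54 residual blocks for ResNet-110 on CIFAR (each block contributing two layers on top of the stem):

```python
from math import comb

n_blocks = 54            # assumed block count for ResNet-110 (CIFAR variant)
total_paths = 2 ** n_blocks

def frac_paths(lo, hi):
    """Fraction of the 2^n implicit paths whose length falls in [lo, hi].

    A path's length is the number of residual branches it traverses,
    so the count of length-k paths is C(n_blocks, k).
    """
    return sum(comb(n_blocks, k) for k in range(lo, hi + 1)) / total_paths

# The distribution is sharply peaked around n_blocks / 2 = 27,
# so almost all paths fall in the 10-34 range highlighted by the paper.
print(frac_paths(10, 34))
```

This only counts paths; the paper's stronger claim is that gradient magnitude also decays with path length, so the short paths dominate training.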
Introduction
ResNet differs from prior work in three main ways: (1) the identity skip connections depart from the traditional sequential structure; (2) the networks are up to two orders of magnitude deeper than traditional networks; (3) removing a few layers of a ResNet at test time barely degrades accuracy.
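Point (3) follows directly from the block structure $y = x + f(x)$: deleting a block ("lesioning" it) leaves the identity path intact and removes only the subset of paths that route through that block's residual branch. A toy numerical sketch, with small random linear maps standing in for the paper's convolutional branches (an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_block(d):
    # Toy residual branch f(x) = W x with a small random W
    # (stand-in for a conv block, not the paper's architecture).
    W = 0.1 * rng.standard_normal((d, d))
    return lambda x: W @ x

d, n_blocks = 4, 6
blocks = [make_block(d) for _ in range(n_blocks)]

def forward(x, skip=None):
    # Each block computes y = x + f(x); lesioning block `skip`
    # replaces it with the identity mapping.
    for i, f in enumerate(blocks):
        if i != skip:
            x = x + f(x)
    return x

x = rng.standard_normal(d)
full = forward(x)
lesioned = forward(x, skip=3)

# The skip connections keep every path not involving block 3 intact,
# so the output is only mildly perturbed.
print(np.linalg.norm(full - lesioned) / np.linalg.norm(full))
```

In a plain sequential network the same deletion would break the only path from input to output, which is why lesioning is catastrophic there but benign in ResNets.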
Others
Not much more to add here; see the paper for the detailed experiments.