Paper Notes: Inception-V4, Inception-ResNet
2016-06-17 12:36
1. Motivation: residual connections work well for deep networks, so they can be combined with the Inception architecture (yielding Inception-ResNet).
2. Architecture
a. Replace the filter concatenation of the Inception module with a residual connection. (The 1 x 1 conv after the Inception layer scales up the dimensionality to match the input before the addition.)
b. Scale down the residuals (by a multiplicative scaling factor of about 0.1~0.3) before the addition ==> stabilizes training (prevents the weights from going to 0).
3. Batch normalization: applied on top of the traditional layers only (excluding the summation layers, to reduce memory cost); prevents saturation.
4. Performance. (For the detailed differences between the models, please refer to the paper.)
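The residual variant of the Inception block described in 2a-2b can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact filters: each "branch" is a stand-in for a conv branch, the projection matrix `w_proj` plays the role of the 1 x 1 conv that restores the input dimensionality, and `scale` is the residual scaling factor from 2b. All names and shapes here are my own assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def branch(x, w):
    # Stand-in for one conv branch of the Inception module: linear map + ReLU.
    return np.maximum(x @ w, 0.0)

def inception_resnet_block(x, weights, w_proj, scale=0.1):
    """Simplified Inception-ResNet block (illustrative shapes, not the paper's).

    x: (n, c) activations; each branch maps c -> c_b channels;
    the concatenated branches are projected back to c channels by w_proj
    (the 1 x 1 conv stand-in), scaled down, then added to the input.
    """
    branches = [branch(x, w) for w in weights]
    concat = np.concatenate(branches, axis=1)   # filter concatenation
    residual = concat @ w_proj                  # 1 x 1 conv scales dims back to c
    return x + scale * residual                 # scaled residual addition

c = 8
x = rng.normal(size=(4, c))
weights = [rng.normal(size=(c, 4)) for _ in range(3)]  # three branches, 4 filters each
w_proj = rng.normal(size=(12, c))                      # 12 = 3 * 4 concatenated channels
y = inception_resnet_block(x, weights, w_proj)
print(y.shape)  # (4, 8) -- same shape as the input, as the addition requires
```

Note that the 1 x 1 projection is what makes the addition possible at all: the concatenated branches have more channels than the input, so they must be mapped back to the input width before the residual sum.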