ResNet and Batch Normalization (BN)

[Deep Learning] A Thorough Understanding of Batch Normalization

https://www.zhihu.com/topic/20084849/hot
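
Since the link above is about Batch Normalization, here is a minimal NumPy sketch of what BN computes at training time (per-feature batch mean and variance, then a learnable scale and shift). The function name and shapes are illustrative and not taken from any of the linked articles:

```python
import numpy as np

def batch_norm_train(x, gamma, beta, eps=1e-5):
    """Batch Normalization forward pass (training mode).

    x: (batch, features); gamma, beta: (features,) learnable parameters.
    """
    mu = x.mean(axis=0)                      # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)    # normalize to zero mean, unit variance
    return gamma * x_hat + beta              # scale and shift back
```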

What does F(x) in ResNet (residual networks) actually look like?

https://www.zhihu.com/question/53224378
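
To make the question above concrete: in the original ResNet design a block outputs F(x) + x, where F is typically two 3x3 convolutions with BN and a ReLU in between. A minimal Keras-style sketch of such a block (illustrative only; it assumes the shortcut and F(x) have matching shapes):

```python
from tensorflow.keras import layers

def residual_block(x, filters):
    # F(x): two 3x3 convolutions, each followed by BN, with a ReLU in between
    f = layers.Conv2D(filters, 3, padding="same")(x)
    f = layers.BatchNormalization()(f)
    f = layers.ReLU()(f)
    f = layers.Conv2D(filters, 3, padding="same")(f)
    f = layers.BatchNormalization()(f)
    # identity shortcut: the block returns ReLU(F(x) + x)
    out = layers.Add()([f, x])
    return layers.ReLU()(out)
```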

How should we understand Microsoft's deep residual learning?

https://www.zhihu.com/question/38499534?sort=created

 

Skip Connections Eliminate Singularities

https://arxiv.org/pdf/1701.09175.pdf

Residual Networks Explained in Detail

https://zhuanlan.zhihu.com/p/42706477

 

The Principle of Residual Networks

https://blog.csdn.net/qq_30478885/article/details/78828734

https://www.coursera.org/lecture/convolutional-neural-networks/why-resnets-work-XAKNO

https://arxiv.org/pdf/1512.03385.pdf

https://www.quora.com/How-does-deep-residual-learning-work

https://arxiv.org/pdf/1603.05027.pdf

The role of the residual block in ResNet is to perform an identity mapping. What is the significance of such an identity mapping, and what role does it play in the network?

https://www.zhihu.com/question/293243905
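
One way to see why the identity shortcut matters (roughly the argument in the pre-activation paper arXiv:1603.05027 linked above): with blocks of the form x_{l+1} = x_l + F(x_l), unrolling over layers gives x_L = x_l + Σ_{i=l}^{L-1} F(x_i), so by the chain rule ∂Loss/∂x_l = ∂Loss/∂x_L · (1 + ∂/∂x_l Σ F(x_i)). The constant "1" term means the gradient from the loss reaches every earlier layer directly, without being multiplied through a long chain of weight matrices, which is why very deep stacks of such blocks remain trainable.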

https://zhuanlan.zhihu.com/p/28124810 

https://arxiv.org/pdf/1502.03167v3.pdf

https://zhuanlan.zhihu.com/p/31645196

https://arxiv.org/pdf/1506.01497v3.pdf

https://arxiv.org/pdf/1504.08083.pdf

https://arxiv.org/pdf/1311.2524v5.pdf

https://arxiv.org/pdf/1702.08591.pdf

https://arxiv.org/pdf/1611.05431.pdf

https://arxiv.org/pdf/1607.07032.pdf

 

https://arxiv.org/abs/1605.06431

Understanding Residual Networks

Covariance

https://www.zhihu.com/question/20852004
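
The covariance link above connects to the "internal covariate shift" motivation behind BN. As a quick numerical refresher (the values here are made up for illustration):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 6.0, 8.0])

# sample covariance: mean of (x - E[x]) * (y - E[y]), with the unbiased 1/(n-1) factor
cov_xy = ((x - x.mean()) * (y - y.mean())).sum() / (len(x) - 1)
print(cov_xy)        # 3.333...: x and y increase together
print(np.cov(x, y))  # 2x2 covariance matrix; off-diagonal entries match cov_xy
```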

The ResNet architecture is invertible! University of Toronto and others propose invertible residual networks with superior performance

A Brief Overview of ResNet and Its Many Variants

 

A Code Walkthrough of the TensorFlow Implementation of ResNet V2

Identity Mapping in ResNet
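
The two items above concern ResNet V2, whose main change is the "pre-activation" ordering: BN and ReLU come before each convolution, and nothing is applied after the addition, so the shortcut path stays a pure identity. A minimal Keras-style sketch of that ordering (illustrative only; not the code from the repository mentioned in the steps below):

```python
from tensorflow.keras import layers

def preact_residual_block(x, filters):
    # ResNet V2 ("pre-activation") ordering: BN -> ReLU -> Conv, repeated twice
    f = layers.BatchNormalization()(x)
    f = layers.ReLU()(f)
    f = layers.Conv2D(filters, 3, padding="same")(f)
    f = layers.BatchNormalization()(f)
    f = layers.ReLU()(f)
    f = layers.Conv2D(filters, 3, padding="same")(f)
    # no activation after the addition: the shortcut remains a clean identity path
    return layers.Add()([f, x])
```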

1. Study the material on residual networks in Andrew Ng's "Deep Learning" course on Coursera.
2. Read the original paper, Deep Residual Learning for Image Recognition. If it is hard to follow, online translations can help; the author's own translation is available for reference.
3. Register a GitHub account to view and download the open-source code for residual networks. (registration link)
4. Clone the source code to your local machine. (source code link here)

 

【1】He K, Zhang X, Ren S, et al. Deep residual learning for image recognition[C]//Proceedings of the IEEE conference on computer vision and pattern recognition. 2016: 770-778.

【2】Srivastava R K, Greff K, Schmidhuber J. Highway networks[J]. arXiv preprint arXiv:1505.00387, 2015.

【3】Orhan A E, Pitkow X. Skip connections eliminate singularities[J]. arXiv preprint arXiv:1701.09175, 2017.

【4】Shang W, Sohn K, Almeida D, et al. Understanding and improving convolutional neural networks via concatenated rectified linear units[C]//International Conference on Machine Learning. 2016: 2217-2225.

【5】Greff K, Srivastava R K, Schmidhuber J. Highway and residual networks learn unrolled iterative estimation[J]. arXiv preprint arXiv:1612.07771, 2016.

【6】Jastrzebski S, Arpit D, Ballas N, et al. Residual connections encourage iterative inference[J]. arXiv preprint arXiv:1710.04773, 2017.
