
ResNet Line by Line Explanation

Understand ResNet by Building One

Jeremy Zhang
7 min read · Feb 10, 2023

Problem with Deep Neural Network

As a neural network goes deeper, it should in theory perform at least as well, certainly on the training set; in practice, however, performance does not keep improving with depth. One major reason is the vanishing gradient problem: during backpropagation through a very deep network, the gradients at the earlier layers shrink toward zero so quickly that learning becomes unbearably slow.
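To see the problem concretely, here is a minimal sketch (assuming PyTorch; the depth and layer sizes are arbitrary choices for illustration) that stacks 50 plain Linear + Sigmoid layers and compares the gradient magnitude at the first layer with that at the last:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# 50 plain Linear + Sigmoid blocks, no skip connections.
blocks = [nn.Sequential(nn.Linear(64, 64), nn.Sigmoid()) for _ in range(50)]
net = nn.Sequential(*blocks)

x = torch.randn(8, 64)
net(x).pow(2).mean().backward()  # a dummy loss, just to obtain gradients

# The gradient norm at the first layer is typically many orders of
# magnitude smaller than at the last layer.
print(f"first layer: {blocks[0][0].weight.grad.norm():.2e}")
print(f"last layer:  {blocks[-1][0].weight.grad.norm():.2e}")
```

Each sigmoid multiplies the backpropagated signal by a derivative of at most 0.25, so after 50 layers the signal reaching the first layer is vanishingly small.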


ResNet

ResNet makes it possible, at least in theory, to build an arbitrarily deep neural network without impairing model performance: adding more layers should not make it worse. The key structural innovation in ResNet is called the skip connection.

Figure (reference: deep-learning-course): the left side shows a plain network without a skip connection; the right side shows the same network with a skip connection added.
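Written out (a sketch, assuming the usual course notation where $a^{[l]}$ is the activation of layer $l$, $g$ is the ReLU activation, and $W^{[l]}$, $b^{[l]}$ are the weights and bias of layer $l$), the skip connection adds $a^{[l]}$ to the pre-activation two layers ahead:

$$z^{[l+2]} = W^{[l+2]} a^{[l+1]} + b^{[l+2]}, \qquad a^{[l+2]} = g\left(z^{[l+2]} + a^{[l]}\right)$$

Without the skip connection, the block would compute $a^{[l+2]} = g(z^{[l+2]})$ instead.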

A skip connection allows activation values from earlier layers to be fast-forwarded to later layers by addition. This has 2 advantages:

  • In the forward propagation, the later layer can do at least as well as the earlier one: if the weights $W^{[l+2]}$ and bias $b^{[l+2]}$ shrink toward zero, the equation above reduces to $a^{[l+2]} = g(a^{[l]}) = a^{[l]}$ under ReLU, so the block can always fall back to the identity mapping rather than degrade performance.

  • In the backpropagation, the addition gives gradients a direct path from later layers back to earlier layers, bypassing the skipped weights, which mitigates the vanishing gradient problem described above (see the sketch after this list).
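The following is a minimal sketch of a residual block (assuming PyTorch; the channel count, 3×3 convolutions, and batch normalization follow the common ResNet layout, not necessarily the article's exact implementation):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    """Two conv layers whose output is added back to the block's input."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Skip connection: add the block's input before the final ReLU.
        return F.relu(out + x)

x = torch.randn(1, 64, 32, 32)
print(ResidualBlock(64)(x).shape)  # torch.Size([1, 64, 32, 32])
```

Because the padding keeps the spatial size and the channel count is unchanged, `out` and `x` have the same shape and can be added directly; blocks that change the shape need a projection (e.g. a 1×1 convolution) on the skip path.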
