DenseNet Paper Walkthrough: All Connected

When we try to train a very deep neural network, one issue we might encounter is the vanishing gradient problem. As gradients are propagated backward through many layers, they can shrink toward zero, so the weight updates during training become tiny or stop altogether, and the model fails to improve. When a network is very deep, the […]
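To make the idea concrete, here is a minimal sketch (not from the original article) of why gradients vanish in a deep stack of sigmoid units. By the chain rule, the gradient with respect to an early layer is a product of per-layer local derivatives; since the sigmoid's derivative is at most 0.25, that product shrinks roughly exponentially with depth. The single-weight-per-layer setup and the depth of 50 are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
depth = 50          # illustrative depth; one scalar weight per "layer"
x = 0.5             # scalar input standing in for an activation
grad = 1.0          # running product of chain-rule factors

for _ in range(depth):
    w = rng.normal()            # hypothetical layer weight
    pre = w * x                 # pre-activation
    x = sigmoid(pre)            # forward pass through one layer
    # Chain rule: multiply by the local derivative sigmoid'(pre) * w.
    # sigmoid'(pre) = sigmoid(pre) * (1 - sigmoid(pre)) <= 0.25,
    # so each step can only shrink the product further.
    grad *= sigmoid(pre) * (1.0 - sigmoid(pre)) * w

print(abs(grad))    # an extremely small number: the gradient has vanished
```

After only 50 layers the accumulated gradient is vanishingly small, which is exactly why weight updates in the early layers stall; architectures like DenseNet mitigate this by giving gradients shorter paths back to early layers.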

The post DenseNet Paper Walkthrough: All Connected appeared first on Towards Data Science.