Dense Convolutional Networks

Year: 2017
Authors: Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger
Venue: IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Programming languages: Lua

Dense Convolutional Networks (DenseNets) connect each layer to every other layer in a feed-forward fashion. Whereas a traditional convolutional network with L layers has L connections, one between each layer and its subsequent layer, a DenseNet has L(L+1)/2 direct connections. Each layer takes the feature maps of all preceding layers as input, and its own feature maps are used as input to all subsequent layers. DenseNets have several compelling advantages: they alleviate the vanishing-gradient problem, strengthen feature propagation, encourage feature reuse, and substantially reduce the number of parameters.
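The connectivity pattern described above can be sketched in a few lines. This is an illustrative Python sketch (the reference implementation is in Torch/Lua), with hypothetical helper names: each layer receives the collection of all preceding feature maps, and the connection count for a block of L layers is L(L+1)/2.

```python
# Illustrative sketch of DenseNet-style dense connectivity (hypothetical
# helper names; not the authors' Torch/Lua implementation).

def dense_block(x, num_layers, layer_fn):
    """Run num_layers layers, feeding each one every earlier feature map."""
    features = [x]  # feature maps produced so far, starting with the input
    for _ in range(num_layers):
        new_features = layer_fn(features)  # input: all preceding feature maps
        features.append(new_features)      # made available to all later layers
    return features

def num_connections(L):
    """Direct connections among L densely connected layers: L(L+1)/2."""
    return L * (L + 1) // 2

if __name__ == "__main__":
    # Toy "layer": a feature map is a list of numbers; the layer sums the
    # values of all incoming maps and emits a one-element map.
    toy_layer = lambda feats: [sum(v for f in feats for v in f)]
    out = dense_block([1.0], num_layers=3, layer_fn=toy_layer)
    print(len(out), num_connections(3))  # 4 feature maps, 6 direct connections
```

In a real DenseNet the toy layer would be a batch-norm/ReLU/convolution composite and the inputs would be concatenated along the channel dimension, which is what enables feature reuse and keeps the parameter count low.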
