Neural architecture search network
Year: 2018
Venue: IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Programming languages: Python, Shell
Project website: https://github.com/MarSaKi/nasnet
A method to optimize convolutional architectures on a dataset of interest. The key contribution of this work is the design of a new search space (called the “NASNet search space”) which enables transferability. In the experiments, the best convolutional layer (or “cell”) is searched for on the CIFAR-10 dataset; this cell is then applied to the ImageNet dataset by stacking together more copies of it, each with its own parameters, to design a convolutional architecture named a “NASNet architecture”. The work also introduces a new regularization technique called ScheduledDropPath that significantly improves generalization in the NASNet models.
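ScheduledDropPath extends drop-path regularization by increasing the probability of dropping each path in a cell linearly over the course of training. The entry above gives no pseudocode, so the following is only a minimal sketch of that idea; the function names, the final drop probability of 0.4, and the list-of-floats representation of path outputs are all illustrative assumptions, not the repository's API:

```python
import random

def scheduled_drop_prob(step, total_steps, final_drop_prob=0.4):
    """Ramp the drop-path probability linearly from 0 at the start of
    training up to final_drop_prob at the end (hypothetical schedule)."""
    return final_drop_prob * min(step / total_steps, 1.0)

def drop_path(path_outputs, drop_prob, rng=random):
    """Zero each path's output with probability drop_prob during training,
    rescaling surviving paths by 1/keep_prob so the expected value of each
    path is unchanged (standard inverted scaling)."""
    if drop_prob <= 0.0:
        return list(path_outputs)
    keep_prob = 1.0 - drop_prob
    return [v / keep_prob if rng.random() < keep_prob else 0.0
            for v in path_outputs]

# Illustrative training loop: the drop probability grows with the step.
total_steps = 100
for step in (0, 50, 100):
    p = scheduled_drop_prob(step, total_steps)
    print(step, p)
```

At evaluation time, drop-path is disabled (`drop_prob = 0`), so all paths contribute and no rescaling is applied.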