Neural architecture search network

Year: 2018
Authors: Barret Zoph, Vijay Vasudevan, Jonathon Shlens, Quoc V. Le
Venue: IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Programming languages: Python, Shell

A method to optimize convolutional architectures on a dataset of interest. The key contribution of this work is the design of a new search space (which we call the "NASNet search space") that enables transferability. In our experiments, we search for the best convolutional layer (or "cell") on the CIFAR-10 dataset and then apply this cell to the ImageNet dataset by stacking together more copies of it, each with its own parameters, to design a convolutional architecture, which we name a "NASNet architecture". We also introduce a new regularization technique called ScheduledDropPath that significantly improves generalization in the NASNet models.
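To make the ScheduledDropPath idea concrete, here is a minimal sketch in Python/NumPy: each candidate path inside a cell is dropped with a probability that increases linearly over the course of training, and the surviving paths are rescaled so the expected sum is preserved. The function name, the combine-by-sum step, and the `final_drop_prob` default are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np

def scheduled_drop_path(paths, step, total_steps,
                        final_drop_prob=0.4, training=True, rng=None):
    """Illustrative sketch of ScheduledDropPath (hypothetical API).

    paths: list of arrays, the outputs of a cell's branches.
    The drop probability rises linearly from 0 at step 0 toward
    final_drop_prob at the last training step.
    """
    rng = rng or np.random.default_rng(0)
    drop_prob = final_drop_prob * (step / total_steps)  # linear schedule
    if not training or drop_prob == 0.0:
        return sum(paths)  # at evaluation time all paths are kept
    keep_prob = 1.0 - drop_prob
    # Keep each path independently; rescale kept paths by 1/keep_prob
    # so the expected contribution of every path is unchanged.
    kept = [p / keep_prob for p in paths if rng.random() < keep_prob]
    if not kept:  # keep at least one path so the cell still produces output
        kept = [paths[0] / keep_prob]
    return sum(kept)
```

Early in training the schedule keeps almost every path, so the network first learns with its full capacity; only later does the regularization pressure grow.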
