Efficient Forward Architecture Search
- Hanzhang Hu ,
- John Langford ,
- Rich Caruana ,
- Saurajit Mukherjee ,
- Eric Horvitz ,
- Debadeepta Dey
Thirty-third Conference on Neural Information Processing Systems (NeurIPS 2019)
We propose a neural architecture search (NAS) algorithm, Petridish, to iteratively add shortcut connections to existing network layers. The added shortcut connections effectively perform gradient boosting on the augmented layers. The proposed algorithm is motivated by the feature selection algorithm forward stage-wise linear regression, since we consider NAS as a generalization of feature selection for regression, where NAS selects shortcuts among layers instead of selecting features. To reduce the number of trials over possible connection combinations, we jointly train all possible connections at each stage of growth while leveraging feature selection techniques to choose a subset of them. We experimentally show this process to be an efficient forward architecture search algorithm that can find competitive models using few GPU days in both the search space of repeatable network modules (cell-search) and the space of general networks (macro-search). Petridish is particularly well-suited for warm-starting from existing models, which is crucial for lifelong-learning scenarios.
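The core mechanism described in the abstract, jointly training all candidate shortcut connections and then selecting a sparse subset, can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example (not the authors' implementation; class and method names are illustrative): each candidate shortcut is projected into the target layer's width and scaled by a learnable gate, an L1 penalty on the gates provides the feature-selection pressure, and the largest-magnitude gates indicate which shortcuts to keep.

```python
# Minimal sketch of joint candidate-shortcut training with gate-based selection.
# Assumes PyTorch; names such as CandidateShortcuts are hypothetical.
import torch
import torch.nn as nn

class CandidateShortcuts(nn.Module):
    """Jointly train candidate shortcut connections feeding one target layer."""
    def __init__(self, source_dims, target_dim):
        super().__init__()
        # One projection per candidate source layer (an assumption for this sketch).
        self.projections = nn.ModuleList(
            nn.Linear(d, target_dim) for d in source_dims
        )
        # Learnable gates; their magnitudes are later used for selection.
        self.gates = nn.Parameter(torch.zeros(len(source_dims)))

    def forward(self, source_activations):
        # Weighted sum of projected candidates: a boosting-style correction
        # added on top of the existing layer's output.
        out = 0.0
        for gate, proj, x in zip(self.gates, self.projections, source_activations):
            out = out + gate * proj(x)
        return out

    def l1_penalty(self):
        # Sparsity pressure so only a few shortcuts survive selection.
        return self.gates.abs().sum()

    def select(self, k):
        # Keep the k candidates with the largest gate magnitudes.
        return torch.topk(self.gates.abs(), k).indices.tolist()
```

In this sketch the L1 penalty would be added to the training loss, and after a short joint-training phase `select(k)` returns the indices of the shortcuts to materialize in the grown network.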
Publication Downloads
Petridish: Efficient Forward Neural Architecture Search [0.1]
September 20, 2019
Petridish is a neural architecture search (NAS) algorithm that iteratively adds shortcut connections to existing network layers. The added shortcut connections effectively perform gradient boosting on the augmented layers. The algorithm is motivated by the feature selection algorithm forward stage-wise linear regression, since we consider NAS as a generalization of feature selection for regression, where NAS selects shortcuts among layers instead of selecting features.
Efficient Forward Architecture Search
AutoML for neural networks aims to take the manual guesswork out of finding the right architecture for a given task. In this lecture we give a quick overview of the state of neural architecture search and cover MSR’s recent algorithm in depth. Specifically, we propose a neural architecture search (NAS) algorithm, Petridish, to iteratively add shortcut connections to existing network layers. The added shortcut connections effectively perform gradient boosting on the augmented layers. The proposed algorithm is motivated by the feature selection algorithm forward stage-wise linear regression, since we consider NAS as a generalization of feature selection for regression, where NAS selects shortcuts among layers instead of selecting features. In order to reduce the number of trials of possible connection combinations, we jointly train…
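The forward stage-wise linear regression procedure that motivates Petridish can be shown with a small toy example. The following is an illustrative NumPy sketch (synthetic data and step size are assumptions): at each step the feature most correlated with the current residual is nudged, analogous to how Petridish adds the shortcut connection that best fits the current loss gradient.

```python
# Toy forward stage-wise linear regression on synthetic data (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
true_w = np.zeros(10)
true_w[[1, 4, 7]] = [2.0, -1.5, 0.5]          # only three informative features
y = X @ true_w + 0.1 * rng.standard_normal(200)

w = np.zeros(10)
step = 0.01
for _ in range(2000):
    residual = y - X @ w
    corr = X.T @ residual                      # correlation of each feature with residual
    j = np.argmax(np.abs(corr))                # greedily pick the best-fitting feature
    w[j] += step * np.sign(corr[j])            # take a small step on that feature only

print(np.round(w, 2))                          # weight mass concentrates on features 1, 4, 7
```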