Publication
AAAI 2021
Workshop paper
Large Scale Neural Architecture Search with Polyharmonic Splines
Abstract
Neural Architecture Search (NAS) is a powerful tool to automatically design deep neural networks for many tasks, including image classification. Due to the significant computational burden of the search phase, most NAS methods have focused so far on small datasets. All attempts at conducting NAS at large scale have employed small proxy sets, and then transferred the learned architectures to larger datasets by replicating or stacking the searched cells. We propose a NAS method based on polyharmonic splines that can perform search directly on large scale target datasets. We demonstrate the effectiveness of our method on the ImageNet22K benchmark (Deng et al. 2009), which contains 14 million images distributed over 21,841 categories. By exploring the search space of the ResNet (He et al. 2016) and Big-Little Net ResNeXt (Chen et al. 2019a) architectures directly on ImageNet22K, our polyharmonic splines NAS method designed a model which achieved a top-1 accuracy of 40.03% on ImageNet22K, an absolute improvement of 3.13% over the state of the art with similar global batch size (Codreanu, Podareanu, and Saletore 2017).
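To make the core idea concrete, the sketch below illustrates how a polyharmonic (thin-plate) spline can serve as a surrogate model over an architecture search space: fit the spline to a handful of evaluated configurations, then query it densely to pick the most promising candidate. This is a minimal illustration only, not the authors' implementation; the two architecture parameters (depth and width scale), the sampled accuracies, and the grid ranges are hypothetical placeholders.

```python
# Minimal sketch (assumed setup, not the paper's code): a polyharmonic spline
# surrogate over a toy 2-D architecture space using SciPy's RBFInterpolator.
import numpy as np
from scipy.interpolate import RBFInterpolator

# A handful of evaluated architectures: (depth_scale, width_scale) -> top-1 accuracy.
# In a real NAS run each accuracy would come from actually training that architecture;
# the numbers below are hypothetical.
configs = np.array([
    [0.5, 0.5],
    [0.5, 1.0],
    [1.0, 0.5],
    [1.0, 1.0],
    [1.5, 1.0],
    [1.0, 1.5],
])
accuracies = np.array([31.2, 34.5, 33.8, 37.1, 38.0, 37.6])

# Fit a polyharmonic spline (thin-plate kernel, r^2 * log r) through the samples.
spline = RBFInterpolator(configs, accuracies, kernel="thin_plate_spline")

# Evaluate the surrogate on a dense grid of candidate architectures and pick the
# configuration with the highest predicted accuracy as the next one to train.
grid_d, grid_w = np.meshgrid(np.linspace(0.5, 2.0, 50), np.linspace(0.5, 2.0, 50))
candidates = np.column_stack([grid_d.ravel(), grid_w.ravel()])
predicted = spline(candidates)
best = candidates[np.argmax(predicted)]
print(f"Next architecture to evaluate: depth_scale={best[0]:.2f}, width_scale={best[1]:.2f}")
```

The appeal of such a spline surrogate for large-scale search is that each architecture evaluation on the full target dataset is expensive, so the spline lets the search extract as much guidance as possible from a small number of trained configurations.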