Abstract

We propose a new method for learning the structure of convolutional neural networks (CNNs) that is more efficient than recent state-of-the-art methods based on reinforcement learning and evolutionary algorithms. Our approach uses a sequential model-based optimization (SMBO) strategy, in which we search for structures in order of increasing complexity, while simultaneously learning a surrogate model to guide the search through structure space. Direct comparison under the same search space shows that our method is up to 5 times more efficient than the RL method of Zoph et al. (2018) in terms of the number of models evaluated, and 8 times faster in terms of total compute. The structures we discover in this way achieve state-of-the-art classification accuracies on CIFAR-10 and ImageNet.
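The abstract's core loop — expand candidate structures one block at a time, rank the expansions with a learned surrogate, and only train the top few — can be sketched in miniature. This is not the authors' implementation: the operation set, the running-average surrogate, and the `evaluate` stand-in (which replaces actually training a child CNN) are all hypothetical simplifications, kept only to show the shape of the SMBO search.

```python
# Toy sketch of SMBO-style progressive structure search (hypothetical, not
# the paper's code). A "structure" is a tuple of block choices (integers);
# evaluate() stands in for training a child network and returning accuracy.
import random

random.seed(0)

OPS = list(range(5))   # hypothetical operation choices for each block
MAX_BLOCKS = 3         # search structures in order of increasing complexity
K = 4                  # beam width: candidates actually "trained" per level


def evaluate(structure):
    """Stand-in for training a CNN and measuring validation accuracy."""
    return sum(op * 0.1 for op in structure) / (len(structure) + 1) + random.random() * 0.01


class Surrogate:
    """Toy surrogate: scores a structure from per-op running averages."""

    def __init__(self):
        self.sums = {op: 0.0 for op in OPS}
        self.counts = {op: 0 for op in OPS}

    def fit(self, structures, scores):
        for s, y in zip(structures, scores):
            for op in s:
                self.sums[op] += y
                self.counts[op] += 1

    def predict(self, structure):
        return sum(self.sums[op] / max(self.counts[op], 1) for op in structure)


def progressive_search():
    surrogate = Surrogate()
    # Level 1: the space of one-block structures is small, so evaluate it all.
    beam = [(op,) for op in OPS]
    scores = [evaluate(s) for s in beam]
    surrogate.fit(beam, scores)
    for _ in range(2, MAX_BLOCKS + 1):
        # Expand every beam member by one block, rank expansions with the
        # surrogate, and only evaluate (i.e. "train") the top-K of them.
        candidates = [s + (op,) for s in beam for op in OPS]
        candidates.sort(key=surrogate.predict, reverse=True)
        beam = candidates[:K]
        scores = [evaluate(s) for s in beam]
        surrogate.fit(beam, scores)
    return max(zip(scores, beam))


best_score, best_structure = progressive_search()
print(best_structure, round(best_score, 3))
```

The efficiency claim in the abstract comes from exactly this pruning: at each level only `K` of the `K * len(OPS)` expansions are trained, while the surrogate (an RNN/MLP predictor in the paper, a running average here) ranks the rest for free.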

Keywords

Computer science, Reinforcement learning, Artificial intelligence, Convolutional neural network, Artificial neural network, State (computer science), State space, Search algorithm, Machine learning, Algorithm, Mathematics

Publication Info

Year
2018
Type
book-chapter
Pages
19-35
Citations
1934
Access
Closed

Citation Metrics

OpenAlex citations
1934
Influential citations
297

Cite This

Chenxi Liu, Barret Zoph, Maxim Neumann et al. (2018). Progressive Neural Architecture Search. Lecture Notes in Computer Science, 19-35. https://doi.org/10.1007/978-3-030-01246-5_2

Identifiers

DOI
10.1007/978-3-030-01246-5_2
PMID
41041021
PMCID
PMC12484352
arXiv
1712.00559