A Comparative Analysis of Neural Architecture Search Techniques for Efficient Model Design
M Sampoorna, P Chaithanya, Pothuraju Jangaiah
This study presents a comprehensive comparative analysis of four prominent Neural Architecture Search
(NAS) techniques—NASNet, AmoebaNet, DARTS, and ProxylessNAS—with a focus on evaluating their
effectiveness in designing efficient neural network models. By conducting experiments on the CIFAR-10
and ImageNet datasets, we assess these methods across several key metrics, including model accuracy,
search time, computational cost, and energy consumption. The results reveal significant trade-offs among
the techniques, with NASNet achieving the highest accuracy but at the cost of increased computational
resources and energy usage. DARTS, on the other hand, demonstrates remarkable efficiency in terms of
search time and resource utilization, albeit with a slight reduction in accuracy. This analysis highlights
the importance of choosing a NAS method that aligns with the specific needs of the application, whether
the priority is maximizing accuracy or operating within tight resource constraints.
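The efficiency of DARTS noted above comes from its continuous relaxation of the architecture choice: instead of searching over discrete operations, each edge computes a softmax-weighted mixture of all candidate operations, so architecture parameters can be optimized by gradient descent. The toy sketch below illustrates only this core idea; the candidate operations and parameter values are illustrative stand-ins, not the operation set used in this study.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy candidate operations on one edge of the search cell
# (stand-ins for e.g. a convolution, a skip connection, and pooling).
ops = [
    lambda x: 2.0 * x,   # stand-in for a learned transform
    lambda x: x,         # identity / skip connection
    lambda x: 0.5 * x,   # stand-in for a pooling op
]

def mixed_op(x, alpha):
    """DARTS-style mixed operation: a softmax-weighted sum of all
    candidate ops, which makes the architecture choice differentiable."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops))

# Architecture parameters alpha are optimized jointly with the network
# weights during search; afterwards, the op with the largest alpha on
# each edge is kept (discretization step).
alpha = np.array([1.0, 0.1, -0.5])   # illustrative values
y = mixed_op(np.array([1.0]), alpha)
selected = int(np.argmax(alpha))     # op retained after search
```

Because every candidate contributes to the forward pass through a differentiable weighting, a single gradient-based search replaces the thousands of separate model evaluations required by reinforcement-learning or evolutionary NAS, which is the source of the search-time savings reported for DARTS.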