TY - JOUR
AU - M Sampoorna
AU - P Chaithanya
AU - Pothuraju Jangaiah
PY - 2025
DA - 2025/06/07
TI - A Comparative Analysis of Neural Architecture Search Techniques for Efficient Model Design
JO - Global Journal of Engineering Innovations and Interdisciplinary Research
VL - 5
IS - 3
AB - This study presents a comprehensive comparative analysis of four prominent Neural Architecture Search (NAS) techniques—NASNet, AmoebaNet, DARTS, and ProxylessNAS—with a focus on evaluating their effectiveness in designing efficient neural network models. By conducting experiments on the CIFAR-10 and ImageNet datasets, we assess these methods across several key metrics, including model accuracy, search time, computational cost, and energy consumption. The results reveal significant trade-offs among the techniques, with NASNet achieving the highest accuracy but at the cost of increased computational resources and energy usage. DARTS, on the other hand, demonstrates remarkable efficiency in terms of search time and resource utilization, albeit with a slight reduction in accuracy. This analysis highlights the importance of choosing a NAS method that aligns with the specific needs of the application, whether it be maximizing performance or optimizing for resource constraints.
SN - 3066-1226
UR - https://dx.doi.org/10.33425/3066-1226.1112
DO - 10.33425/3066-1226.1112