Published 2025-05-15
Keywords
- Neural Architecture Search
- AutoML
- Automated Model Design
- Deep Learning
- Optimization Strategies
- AI Automation
Copyright (c) 2025 IJCRT Research Journal | UGC Approved and UGC Care Journal | Scopus Indexed Journal Norms

This work is licensed under a Creative Commons Attribution 4.0 International License.
Abstract
The development of effective neural network architectures has traditionally relied on human expertise and extensive trial-and-error. Neural Architecture Search (NAS) offers a transformative alternative by automating the model design process, significantly reducing manual effort and accelerating innovation in artificial intelligence. As a core technique within Automated Machine Learning (AutoML), NAS systematically explores a predefined search space of candidate architectures to identify models that best meet performance requirements under given constraints. This study examines NAS as a foundation for building automated AI models, emphasizing its three core components: the search space, the search (optimization) strategy, and the performance evaluation method. Over time, NAS methods have progressed from resource-intensive approaches, such as reinforcement learning, to more efficient alternatives including evolutionary algorithms, differentiable search, and weight-sharing models. These innovations reduce computational cost and broaden NAS applications across domains such as computer vision, natural language processing, and speech recognition. By intelligently navigating vast architectural possibilities, NAS facilitates the creation of adaptable, high-performance AI systems and establishes itself as a key tool in the future of automated AI model development.
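The three components named in the abstract can be illustrated with a minimal sketch. The following toy example is purely hypothetical: the search space, the `proxy_score` function, and all parameter names are invented for illustration, and the "evaluation" is a stand-in formula rather than actual model training. A real NAS system would train and validate each candidate (or use weight sharing or learned predictors), and would typically use reinforcement learning, evolutionary, or differentiable search rather than plain random sampling.

```python
import random

# Hypothetical toy search space: each architecture is one choice of depth,
# width, and activation. Real NAS spaces (cell-based, operation-level) are
# vastly larger.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [32, 64, 128],
    "activation": ["relu", "gelu", "swish"],
}

def sample_architecture(rng):
    """Search strategy (here: uniform random sampling from the space)."""
    return {key: rng.choice(options) for key, options in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Performance evaluation stand-in: an invented formula rewarding
    capacity with a mild penalty, in place of real train-and-validate."""
    capacity = arch["depth"] * arch["width"]
    return capacity - 0.01 * capacity ** 1.2

def random_search(n_trials=20, seed=0):
    """Run the NAS loop: sample candidates, evaluate, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score
```

The mapping back to the abstract's terminology: `SEARCH_SPACE` defines the search space, `sample_architecture` embodies the (deliberately simple) optimization strategy, and `proxy_score` plays the role of performance evaluation; more efficient NAS variants mainly replace the latter two with cheaper or smarter mechanisms.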