Neural Architecture Search (NAS): Exploring the Trade-Offs In Automated Model Design and Its Impact on Deep Learning Performance
Author: Gaurav Kashyap
DOI: https://doi.org/10.5281/zenodo.14382702
Short DOI: https://doi.org/g8vbg9
Country: USA
Abstract: This paper covers the foundations of Neural Architecture Search (NAS), the trade-offs involved in automating model design, and their effect on deep learning performance. It offers a balanced assessment of the strengths and weaknesses of different search strategies and considers potential future developments in the field. NAS is an innovative and promising technique for automating the design of deep neural networks (DNNs): by using algorithms to explore and optimize model architectures, it can discover highly efficient and performant models and thereby address the limitations of human-designed architectures. We examine the principal NAS approaches, including reinforcement learning (RL), evolutionary algorithms, and gradient-based techniques, and analyze their computational costs and scalability problems, as well as how NAS has advanced state-of-the-art models in fields such as image classification, natural language processing, and reinforcement learning. Finally, we discuss current challenges, possible future directions, and real-world applications of NAS.
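To make the basic workings of a NAS loop concrete, the sketch below shows the simplest possible search strategy, random search over a small discrete search space. This is purely illustrative and not code from the paper: the search space, the function names, and the stubbed evaluation step are all hypothetical assumptions. The RL, evolutionary, and gradient-based approaches surveyed in the paper differ mainly in how they replace the uniform sampling step with a learned or optimized proposal mechanism.

```python
# Illustrative only: a minimal random-search NAS loop over a toy,
# hypothetical search space. A real NAS system would train and
# validate each candidate network instead of using a stubbed score.
import random

SEARCH_SPACE = {
    "num_layers": [2, 4, 6, 8],
    "hidden_units": [64, 128, 256],
    "activation": ["relu", "tanh", "gelu"],
}

def sample_architecture(space):
    """Draw one candidate architecture uniformly at random."""
    return {name: random.choice(choices) for name, choices in space.items()}

def evaluate_architecture(arch):
    """Stub for the expensive train-and-validate step.
    A made-up random score stands in for validation accuracy."""
    return random.random()

def random_search(space, budget=20):
    """Return the best architecture found within the evaluation budget."""
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        arch = sample_architecture(space)
        score = evaluate_architecture(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = random_search(SEARCH_SPACE)
    print(f"Best architecture: {arch} (score {score:.3f})")
```

Even this toy loop exposes the central trade-off the paper discusses: each call to the evaluation step is a full training run in practice, which is why the computational cost and scalability of the search strategy dominate NAS design decisions.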
Keywords: Neural Architecture Search (NAS), NLP (Natural Language Processing), RNN (Recurrent Neural Network), CNN (Convolutional Neural Network)
Paper Id: 231824
Published On: 2019-09-03
Published In: Volume 7, Issue 5, September-October 2019
Cite This: Neural Architecture Search (NAS): Exploring the Trade-Offs In Automated Model Design and Its Impact on Deep Learning Performance - Gaurav Kashyap - IJIRMPS Volume 7, Issue 5, September-October 2019. DOI 10.5281/zenodo.14382702