Neural Architecture Search (NAS)
AI, But Simple Issue #36

Hello from the AI, but simple team! If you enjoy our content, consider supporting us so we can keep doing what we do.
Our newsletter is no longer sustainable to run at no cost, so we’re relying on different measures to cover operational expenses. Thanks again for reading!
Before starting this issue, it would be helpful to understand how neural networks work, including their architecture and the mathematics behind them. If you don't already, we have issues covering these topics below:
Designing neural networks has always been a time-consuming and challenging process. For many years, researchers have experimented with various neural network architectures, adjusting the type of connections, tweaking the size and ordering of layers, and fine-tuning the parameters—all in search of better performance.
This process has always been based on trial and error, with experts tweaking layers and parameters based on intuition and experience. This approach works, but it leaves room for improvement: at best, a hand-designed architecture is an educated guess.
That being said, the standard method of finding model architectures has led to many of the most influential and useful neural network architectures, such as Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), Transformers, and Generative Adversarial Networks (GANs).
But finding the optimal architecture is an immense challenge, often requiring years of research and millions of dollars in funding. The sheer complexity of designing models has led researchers to attempt to streamline this process.
In this issue, we're going to explore one of the most exciting new waves of deep learning research: Neural Architecture Search (NAS), a family of techniques that automates the search for the best-performing neural network architecture.

Before NAS, designing neural networks required months of testing different configurations. Now, NAS uses algorithms to explore thousands of designs automatically, saving time and often producing better results.
NAS is similar to a standard hyperparameter tuning process; however, instead of assuming a fixed architecture, it tunes the model architecture itself, exploring candidates automatically with a variety of search strategies.
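To make the analogy concrete, here is a minimal sketch of the simplest search strategy, random search, written in PyTorch on synthetic data. It samples candidate architectures, trains each one briefly, and keeps the one with the best validation accuracy. The search space, data, and training budget below are toy assumptions for illustration, not taken from any particular NAS system.

```python
import random

import torch
import torch.nn as nn

# Toy example: random search over a tiny space of MLP architectures on
# synthetic data. Real NAS systems use far larger search spaces and
# smarter search strategies than uniform random sampling.

torch.manual_seed(0)
random.seed(0)
X = torch.randn(512, 20)              # synthetic features
y = (X.sum(dim=1) > 0).long()         # synthetic binary labels
X_train, y_train = X[:400], y[:400]   # simple train/validation split
X_val, y_val = X[400:], y[400:]

def sample_architecture():
    # Here an "architecture" is just a depth and a hidden width.
    return {
        "num_layers": random.choice([1, 2, 3]),
        "hidden_size": random.choice([16, 32, 64]),
    }

def build_model(arch):
    layers, in_features = [], 20
    for _ in range(arch["num_layers"]):
        layers += [nn.Linear(in_features, arch["hidden_size"]), nn.ReLU()]
        in_features = arch["hidden_size"]
    layers.append(nn.Linear(in_features, 2))
    return nn.Sequential(*layers)

def train_and_evaluate(model, epochs=50):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(X_train), y_train).backward()
        opt.step()
    with torch.no_grad():
        return (model(X_val).argmax(dim=1) == y_val).float().mean().item()

best_arch, best_acc = None, 0.0
for _ in range(5):                    # evaluate 5 candidate architectures
    arch = sample_architecture()
    acc = train_and_evaluate(build_model(arch))
    if acc > best_acc:
        best_arch, best_acc = arch, acc

print(f"best architecture: {best_arch}  val accuracy: {best_acc:.3f}")
```

More sophisticated NAS methods keep this same outer loop but replace the random sampler with a smarter one, such as a reinforcement learning controller or an evolutionary algorithm.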
Neural Architecture Search Process
Search Space (What should we search?)
The search space defines all possible neural network architectures the algorithm can explore. This can range from simple architectures to highly complex ones.
For example, NAS can take into account layer types (convolutional layers for images, recurrent layers for text), connections (skip/residual connections, standard connections), the number of input/output channels, the number of neurons per layer, and the number of layers, just to name a few.
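As an illustration, such a search space can be written down explicitly as a set of options per design decision. The option names below are hypothetical, not taken from any particular NAS library, and real search spaces are far larger and often defined per repeated cell rather than for the whole network.

```python
import math

# Hypothetical sketch of a small convolutional search space. Each field
# lists the options a search algorithm may choose from; the names are
# illustrative only.
search_space = {
    "layer_type":   ["conv3x3", "conv5x5", "depthwise_conv"],
    "out_channels": [16, 32, 64],
    "num_layers":   [4, 8, 12],
    "skip":         [True, False],   # include residual connections or not
}

# Even this tiny space contains 3 * 3 * 3 * 2 = 54 configurations;
# realistic spaces grow combinatorially and quickly become far too
# large to enumerate exhaustively.
num_configs = math.prod(len(options) for options in search_space.values())
print(num_configs)  # 54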