
Artificial Neural Network, Neural Architecture Search (NAS) and its applications - Part 2
Jayaramakrishnan Sundararaj, Technical Manager | May 21, 2021

Neural Network Architecture:

So far, we have covered what a neural network is, its types, and its important parameters. Now let us look at what neural network architecture means. To develop a neural architecture for a specific problem, one must properly understand the problem, select an appropriate loss function, decide on the number of layers and the number of nodes per layer, and then tune the remaining hyperparameters to avoid overfitting and underfitting.
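The layer and node choices above directly determine how large the model becomes. As a minimal illustration (plain Python, with a hypothetical architecture of two hidden layers), the parameter count of a fully connected network can be computed from the layer sizes alone:

```python
def dense_param_count(layer_sizes):
    """Total trainable parameters of a fully connected network.

    Each layer contributes (inputs * outputs) weights plus one bias
    per output node.
    """
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# Hypothetical architecture: 4 input features, hidden layers of 16 and 8, 1 output.
print(dense_param_count([4, 16, 8, 1]))  # (4*16+16) + (16*8+8) + (8*1+1) = 225
```

Adding nodes or layers grows this count quickly, which is why poor architecture choices lead to oversized, overfitting-prone models.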

The main deep neural network architectures can be categorized as below:

  1. Unsupervised Pre-trained Networks (UPN)
    • Deep Belief Networks (DBN)
    • Generative Adversarial Networks (GAN)
    • Autoencoders
  2. Convolutional Neural Networks (CNN)
    • ResNet
    • AlexNet
    • LeNet-5
    • DenseNet
    • VGGNet
    • Xception
  3. Recurrent Neural Networks
    • Elman RNN
    • Neural Turing Machines (NTM)
    • Long Short-Term Memory (LSTM)
    • Gated Recurrent Unit (GRU)

Neural Architecture Search (NAS):

Conventional machine learning or deep learning pipelines that rely on human intervention for feature selection typically reach only 70%-80% accuracy on computer vision or large datasets. This is mainly due to poor feature selection, which inflates the size of the artificial neural network. It is important to decide which features to use, as well as the number of layers and the number of neurons per layer. NAS is part of automated machine learning (AutoML), which searches for the best combination of data preparation, hyperparameters, training, and evaluation of a model. NAS automates the complicated process of designing a neural network architecture for a target problem and helps overcome the need for deep domain knowledge in machine learning. In simple words, NAS finds the best-performing neural architecture for a given problem.


Components of NAS are:

  1. Search space (Neural architecture space)
  2. Search algorithm
  3. Model evaluation strategy
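The three components can be sketched as a simple loop: sample a candidate from the search space, score it with the evaluation strategy, and let the search algorithm keep the best. The sketch below uses plain Python with random search; the search space and the evaluation proxy are invented for illustration only (a real evaluator would train each candidate and return its validation accuracy):

```python
import random

# 1. Search space (hypothetical): a candidate is a tuple of hidden-layer widths.
SEARCH_SPACE = {"depth": [1, 2, 3], "width": [8, 16, 32, 64]}

def sample_architecture(rng):
    depth = rng.choice(SEARCH_SPACE["depth"])
    return tuple(rng.choice(SEARCH_SPACE["width"]) for _ in range(depth))

# 3. Model evaluation strategy: a stand-in proxy score. In practice this
# would train the candidate network and measure validation accuracy.
def evaluate(arch):
    return sum(arch) / (100 + 10 * len(arch))  # invented proxy, not a real metric

# 2. Search algorithm: plain random search over the space.
def random_search(trials=20, seed=0):
    rng = random.Random(seed)
    return max((sample_architecture(rng) for _ in range(trials)), key=evaluate)
```

Real NAS systems differ mainly in how they replace the random sampler with a smarter search algorithm, which is what the methods below provide.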

The common NAS learning (optimization) methods are:

  1. Reinforcement Learning (RL)
  2. Evolutionary Algorithm (EA)
  3. Gradient Descent (GD)
  4. Bayesian optimization
  5. One-shot methods
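To make one of these methods concrete, here is a toy evolutionary-algorithm loop in plain Python: architectures are tuples of layer widths, parents are picked by tournament selection, and children are produced by mutating one layer. The fitness function is a synthetic stand-in chosen only for illustration; a real EA-based NAS would train each child and use its validation accuracy as fitness:

```python
import random

WIDTHS = [8, 16, 32, 64]

def mutate(arch, rng):
    """Randomly change one layer's width (a minimal mutation operator)."""
    arch = list(arch)
    arch[rng.randrange(len(arch))] = rng.choice(WIDTHS)
    return tuple(arch)

def fitness(arch):
    # Synthetic stand-in: reward total width, lightly penalize the largest layer.
    return sum(arch) - 0.5 * max(arch)

def evolve(generations=30, pop_size=8, seed=1):
    rng = random.Random(seed)
    # Population of two-hidden-layer candidates.
    pop = [tuple(rng.choice(WIDTHS) for _ in range(2)) for _ in range(pop_size)]
    for _ in range(generations):
        parent = max(rng.sample(pop, 3), key=fitness)  # tournament selection
        child = mutate(parent, rng)                    # mutation
        pop.remove(min(pop, key=fitness))              # replace the worst member
        pop.append(child)
    return max(pop, key=fitness)
</```

The other methods swap out this loop: RL trains a controller to propose architectures, gradient descent relaxes the space to make it differentiable, and one-shot methods share weights across all candidates to avoid training each one from scratch.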

[Figure: Neural Architecture Search]

NAS helps turn machine learning ideas into reality with higher accuracy and, crucially, without human interaction.

Some of the NAS (Neural Architecture Search) platforms available are:

  1. AutoKeras
  2. Microsoft NNI
  3. Microsoft Archai
  4. Google AutoML
  5. Google NASNet
  6. NASBench for benchmark testing
  7. Google model_search

Currently, only a limited number of NAS algorithms are available, and search spaces are still largely built from human-designed architectures. However, the research community is working toward advanced NAS algorithms to make deep learning more accessible and more accurate for complex tasks. Research on NAS is complicated and very rich; the intention of this blog is to give an overall view of it. AutoKeras is a good starting point for understanding NAS. For more information, please check the reference section.

NAS applications:

  • Image classification
  • Computer vision
  • Text classification
  • Data augmentation
  • Medical image classification and analysis
  • E-Commerce inventory prediction for big companies
  • Multi-task learning
  • Location-based services
  • Churn prediction to highlight valuable customers
  • Fraud detection, anomaly detection
  • Autonomous vehicles


References:

  1. https://lilianweng.github.io/lil-log/2020/08/06/neural-architecture-search.html
  2. https://en.wikipedia.org/wiki/Neural_architecture_search
  3. https://github.com/microsoft/archai
  4. https://www.kdnuggets.com/2019/10/research-guide-neural-architecture-search.html
  5. NAIS: Neural Architecture and Implementation Search and its Applications in Autonomous Driving