

wannieman98/RandWireNN


RandWireNN


This is an unofficial implementation of the Randomly Wired Neural Network paper (work in progress).

Progress

| Model | Dataset | mAP | lr | Epochs | Optimizer | Date |
|---|---|---|---|---|---|---|
| RandWire(WS, 4, 0.75), C=78, small regime | VOC 2012 | 0.47 | 1e-3 | 50 | Adam | 12/16/21 |
| RandWire(WS, 4, 0.75), C=78, small regime | VOC 2012 | 0.521 | 1e-3 | 100 | Adam | 12/18/21 |

Project Overview

In this project, I implement an image classification model generated using one of the state-of-the-art NAS methods.

NAS (Computer Vision)

NAS (Neural Architecture Search) is essentially an algorithm that automatically builds a neural network model from a given set of parameters. Unlike hand-designed architectures such as DenseNet or ResNet, NAS lets a search heuristic decide how information flows through the network, and the resulting model learns the image representations.

Randomly Wired NN

The paper Randomly Wired Neural Network argues that NAS is meant to let architectures be built freely, with as little human bias as possible. Existing NAS methods still constrain the search space, however, so the paper instead generates architectures from classical random graph algorithms.
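As a sketch of how the three random graph models mentioned in the paper can be generated, the snippet below uses networkx; the function name and default parameter values here are illustrative, not taken from this repository:

```python
import networkx as nx

def random_wiring(graph_mode, num_nodes, p=0.75, k=4, m=5, seed=0):
    """Generate a random graph with one of the three models from the paper.

    Parameter names (p, k, m) follow the paper's notation; the defaults
    here are only examples.
    """
    if graph_mode == "ER":    # Erdős–Rényi: each edge exists with probability p
        return nx.erdos_renyi_graph(num_nodes, p, seed=seed)
    elif graph_mode == "BA":  # Barabási–Albert: preferential attachment, m edges per new node
        return nx.barabasi_albert_graph(num_nodes, m, seed=seed)
    elif graph_mode == "WS":  # Watts–Strogatz: ring of k neighbors, rewired with probability p
        return nx.watts_strogatz_graph(num_nodes, k, p, seed=seed)
    raise ValueError(f"unknown graph mode: {graph_mode}")
```

The resulting undirected graph only defines the wiring; the network generator then maps its nodes to computation units (ReLU–conv–BN blocks in the paper) and its edges to data flow.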

Datasets (ordered by magnitude)

Since the datasets are ordered by magnitude, they will be trained on sequentially: once classification accuracy reaches a threshold on one dataset, I will move on to the next largest.

Objective

The objective of this project is to gain a better understanding of NAS in general and of state-of-the-art NAS techniques. As the project progresses, I plan to adapt the Randomly Wired Neural Network model to object detection as well.

Usage

The model can be trained by running:

python train.py [-h] [--p P] [--k K] [--m M] [--lr LR] [--path PATH] [--dataset DATASET] [--channel CHANNEL] [--is_train IS_TRAIN] [--num_node NUM_NODE] [--num_epoch NUM_EPOCH] [--graph_mode GRAPH_MODE] [--batch_size BATCH_SIZE] [--in_channels IN_CHANNELS] [--is_small_regime IS_SMALL_REGIME] [--checkpoint_path CHECKPOINT_PATH] [--load LOAD]
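As a rough sketch of how a parser for a subset of the flags listed above might be built with argparse (the defaults shown are illustrative, not the repository's actual values):

```python
import argparse

def build_parser():
    """Hypothetical argument parser covering a subset of train.py's flags."""
    parser = argparse.ArgumentParser(description="Train RandWireNN")
    parser.add_argument("--p", type=float, default=0.75, help="graph wiring probability")
    parser.add_argument("--k", type=int, default=4, help="WS: neighbors per node")
    parser.add_argument("--m", type=int, default=5, help="BA: edges per new node")
    parser.add_argument("--lr", type=float, default=1e-3, help="learning rate")
    parser.add_argument("--graph_mode", default="WS", choices=["ER", "BA", "WS"])
    parser.add_argument("--num_node", type=int, default=32)
    parser.add_argument("--num_epoch", type=int, default=50)
    parser.add_argument("--batch_size", type=int, default=64)
    return parser
```

For example, `python train.py --lr 0.01 --graph_mode ER` would override the learning rate and graph model while keeping the other defaults.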

After training, you can evaluate the best model by running:

python test.py [-h] [--model_path MODEL_PATH] [--data_path DATA_PATH]

Tasks

  • Base NN architecture from which the Network Generator will build the model
  • Graph algorithm for the Network Generator
    • Erdős–Rényi (ER)
    • Barabási–Albert (BA)
    • Watts–Strogatz (WS)
  • Network Generator (On-Going)
  • Dataset pipelines
  • Training Functions
  • Testing Functions
  • Script to train the model
  • Train on VOC 2012 dataset
  • Train on ImageNet dataset
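As a sketch of one step the Network Generator has to perform, the undirected random graph can be turned into a DAG by orienting every edge from the lower- to the higher-indexed node (the strategy used in the paper); the function name here is illustrative:

```python
import networkx as nx

def to_dag(g):
    """Orient each undirected edge from its lower- to its higher-indexed
    endpoint. Because every edge points "forward" in the node ordering,
    the result is guaranteed to be acyclic."""
    dag = nx.DiGraph()
    dag.add_nodes_from(g.nodes)
    dag.add_edges_from((min(u, v), max(u, v)) for u, v in g.edges)
    return dag
```

In the paper, nodes with no incoming edges are then fed from a single extra input node and nodes with no outgoing edges are averaged into a single output node, so the whole graph behaves like one layer.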

Requirements

  • You can install the required modules from the command line:
    pip install -r requirements.txt
    

Author

Seungwan Yoo / @LinkedIn

License

Licensed under the Apache License 2.0.