
VisionNet

Project Description

Several brain-inspired characteristics have gained popularity in the past few years, either because they perform computations efficiently (e.g., spiking neurons) or because they improve task performance (e.g., attention mechanisms). In computer vision, the way Convolutional Neural Networks (CNNs) process images differs significantly from how the brain processes visual input. We perform an investigative study that incorporates brain-inspired features, namely i) Attention, ii) Multi-Feature Extraction, and iii) Lateral Connections, into a CNN architecture and observe the effect of these features on classification accuracy. Experiments show that adding these brain-inspired characteristics improves image classification performance on CIFAR10 and CIFAR100 by 1.6% and 3.35%, respectively.
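
As an illustration only (not the repository's actual code), the sketch below shows one common way a self-attention block with a gated residual connection can be added to a small Keras CNN for CIFAR10. The SelfAttention2D layer and the toy model are assumptions for this sketch, not the three architectures implemented in Attention-CNN.ipynb.

  import tensorflow as tf
  from tensorflow.keras import layers

  class SelfAttention2D(layers.Layer):
      """Self-attention over CNN feature maps with a learnable residual gate."""
      def __init__(self, channels, **kwargs):
          super().__init__(**kwargs)
          self.channels = channels
          self.f = layers.Conv2D(channels // 8, 1)  # query projection
          self.g = layers.Conv2D(channels // 8, 1)  # key projection
          self.h = layers.Conv2D(channels, 1)       # value projection
          self.gamma = self.add_weight(name="gamma", shape=(), initializer="zeros")

      def call(self, x):
          shape = tf.shape(x)
          b, hgt, wdt = shape[0], shape[1], shape[2]
          n = hgt * wdt
          q = tf.reshape(self.f(x), (b, n, self.channels // 8))
          k = tf.reshape(self.g(x), (b, n, self.channels // 8))
          v = tf.reshape(self.h(x), (b, n, self.channels))
          attn = tf.nn.softmax(tf.matmul(q, k, transpose_b=True), axis=-1)  # (b, n, n)
          out = tf.reshape(tf.matmul(attn, v), (b, hgt, wdt, self.channels))
          return x + self.gamma * out  # residual connection, scaled up from zero

  # Toy CIFAR10 classifier with the attention block inserted after one conv stage
  inputs = layers.Input(shape=(32, 32, 3))
  x = layers.Conv2D(64, 3, padding="same", activation="relu")(inputs)
  x = layers.MaxPooling2D()(x)
  x = SelfAttention2D(64)(x)
  x = layers.Conv2D(128, 3, padding="same", activation="relu")(x)
  x = layers.GlobalAveragePooling2D()(x)
  outputs = layers.Dense(10, activation="softmax")(x)
  model = tf.keras.Model(inputs, outputs)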

Technical Skills

Python Keras TensorFlow PyTorch scikit-learn Pandas NumPy Jupyter Notebook

Installing Machine Learning Libraries

TensorFlow
  !pip install tensorflow
Keras
  !pip install keras
PyTorch
  https://pytorch.org/get-started/locally/
Pandas
  !pip install pandas
NumPy
  !pip install numpy
Matplotlib
  !pip install matplotlib
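
After installing, an optional sanity check such as the following snippet (not part of the repository) confirms that each library imports and prints its version:

  import tensorflow as tf
  import keras
  import torch
  import pandas as pd
  import numpy as np
  import matplotlib

  for name, mod in [("TensorFlow", tf), ("Keras", keras), ("PyTorch", torch),
                    ("Pandas", pd), ("NumPy", np), ("Matplotlib", matplotlib)]:
      print(f"{name}: {mod.__version__}")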

File Content and Description

  • Baselines.ipynb: Implementation of the baseline models using vanilla fine-tuning and transfer learning on CIFAR10 and CIFAR100
  • Attention-CNN.ipynb: Code for the three proposed self-attention-based architectures; also generates attention heat maps for error analysis (see the sketch below)
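
For reference, a generic way to overlay an attention map on an input image looks like the sketch below. It is an illustrative example, not the notebook's exact heat-map code, and show_attention_heatmap is a hypothetical helper name.

  import matplotlib.pyplot as plt
  import tensorflow as tf

  def show_attention_heatmap(image, attn_map):
      """image: (32, 32, 3) uint8 CIFAR image; attn_map: (h, w) NumPy attention weights."""
      attn = tf.image.resize(attn_map[..., None].astype("float32"), image.shape[:2])
      attn = attn.numpy().squeeze()
      attn = (attn - attn.min()) / (attn.max() - attn.min() + 1e-8)  # scale to [0, 1]
      plt.imshow(image)
      plt.imshow(attn, cmap="jet", alpha=0.4)  # translucent heat map overlay
      plt.axis("off")
      plt.show()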

How to Run

  1. Download the .ipynb files from the repository
  2. Create a new folder named 'CS591 Final Project - AttentionCNN' on Google Colab (a different folder name can be used, but then update the file locations in Attention-CNN.ipynb; see the sketch after this list)
  3. Upload both notebooks to the folder created above
  4. Run Baselines.ipynb to obtain the baseline models' performance
  5. Run Attention-CNN.ipynb to execute the proposed architectures on CIFAR10 and CIFAR100
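
If the notebooks are pointed at Google Drive, mounting the drive and setting the folder path in Colab typically looks like the following sketch; PROJECT_DIR is an illustrative name and may not match the variables used inside the notebooks.

  # Minimal sketch for step 2, assuming the notebooks read files from Google Drive.
  from google.colab import drive

  drive.mount('/content/drive')

  # Point this at your folder if you picked a different name in step 2.
  PROJECT_DIR = '/content/drive/MyDrive/CS591 Final Project - AttentionCNN'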



