
AutoEncoder Models

A collection of autoencoder models (e.g., Vanilla, Stacked, Sparse) implemented in TensorFlow.

How to use?

  • Train a model: `python train.py ae_name train`
  • Generate samples: `python train.py ae_name generate`
  • Generate from a specific image: `python train.py ae_name generate path/to/image`

Note: Generated samples are stored in the `images/{ae_model}/` directory during training.
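For example, a typical session might look like the following. The model name `vanilla_ae` is an assumption for illustration; use whichever names `train.py` actually registers:

```shell
# Train the chosen autoencoder (model name is assumed; check train.py)
python train.py vanilla_ae train

# Generate samples from the trained model
python train.py vanilla_ae generate

# Reconstruct a specific input image
python train.py vanilla_ae generate path/to/image
```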

Autoencoders

Traditional Autoencoders

The following papers are only examples of how the implemented autoencoders can be used; we did not aim to reproduce each paper in detail.

| AutoEncoder  | Loss Function |
|--------------|---------------|
| Vanilla_AE   | (image)       |
| Stacked_AE   | (image)       |
| Conv_AE      | (image)       |
| Denoising_AE | (image)       |
| Sparse_AE    | (image)       |
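All of these models share the same basic recipe: encode the input, decode it back, and minimize a reconstruction loss. As a concrete illustration, here is a minimal NumPy sketch of a vanilla autoencoder trained with an MSE loss on toy data. It is not code from this repository; the layer sizes, tanh encoder, and learning rate are assumptions made for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 flattened "images" of dimension 64, values in [0, 1).
X = rng.random((200, 64))

# Single-hidden-layer (vanilla) autoencoder: 64 -> 16 -> 64.
d_in, d_hid = 64, 16
W1 = rng.normal(0, 0.1, (d_in, d_hid))
b1 = np.zeros(d_hid)
W2 = rng.normal(0, 0.1, (d_hid, d_in))
b2 = np.zeros(d_in)

def forward(X):
    H = np.tanh(X @ W1 + b1)   # encoder: compress to 16 dims
    Y = H @ W2 + b2            # decoder: linear reconstruction
    return H, Y

lr = 0.1
losses = []
for _ in range(300):
    H, Y = forward(X)
    err = Y - X                       # dL/dY for MSE (up to a constant)
    losses.append(np.mean(err ** 2))  # reconstruction loss
    # Backpropagation through decoder and encoder.
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)  # tanh derivative
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The variants listed above change this recipe, not its skeleton: a denoising AE corrupts `X` before encoding, and a sparse AE adds a sparsity penalty on `H` to the loss.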

Results for MNIST

The following results can be reproduced with command:

`python train.py ae_name train`

Note: the 1st and 3rd rows show the ground truth, whereas the 2nd and 4th rows show the generated samples.

| Name         | Epoch 1 | Epoch 15 | Epoch 30 |
|--------------|---------|----------|----------|
| Vanilla_AE   | (image) | (image)  | (image)  |
| Stacked_AE   | (image) | (image)  | (image)  |
| Conv_AE      | (image) | (image)  | (image)  |
| Sparse_AE    | (image) | (image)  | (image)  |
| Denoising_AE | (image) | (image)  | (image)  |

Dependencies

  1. Install Miniconda: https://docs.conda.io/en/latest/miniconda.html
  2. Create an environment: `conda create --name autoencoder`
  3. Activate the environment: `source activate autoencoder`
  4. Install TensorFlow: `conda install -c conda-forge tensorflow`
  5. Install OpenCV: `conda install -c conda-forge opencv`
  6. Install scikit-learn: `conda install -c anaconda scikit-learn`
  7. Install matplotlib: `conda install -c conda-forge matplotlib`

Datasets

If you want to try a new dataset, please structure it as follows:

  • Dataset_main_directory
    • train_data
      • category_1: (image1, image2, ...)
      • category_2: (image1, image2, ...)
      • ...
    • test_data
      • category_1: (image1, image2, ...)
      • category_2: (image1, image2, ...)
      • ...

The `loader.py` file will automatically load all images and their labels (the `category_i` folder names serve as labels).
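A loader for this directory layout can be sketched as follows. This is a hypothetical stand-in illustrating what `loader.py` does, not the repository's actual code; the function name and return format are assumptions:

```python
from pathlib import Path

def load_split(split_dir):
    """Collect (image_path, label_index) pairs from one split directory.

    Each subfolder of split_dir (e.g. category_1, category_2, ...) is
    treated as one category; its index in sorted order is the label.
    """
    split_dir = Path(split_dir)
    # Sorted so label indices are deterministic across runs.
    categories = sorted(p.name for p in split_dir.iterdir() if p.is_dir())
    label_of = {name: i for i, name in enumerate(categories)}
    samples = []
    for name in categories:
        for img in sorted((split_dir / name).iterdir()):
            if img.is_file():
                samples.append((img, label_of[name]))
    return samples, categories
```

Calling `load_split("Dataset_main_directory/train_data")` would then return every image path under the category folders together with its integer label.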

Acknowledgements

This implementation is based on the work of the following great repositories: