A list of papers I used for my thesis about convolutional neural networks and batch normalization
Updated Apr 5, 2016
MNIST classification using a Multi-Layer Perceptron (MLP) with 2 hidden layers. Several weight initializers and batch normalization are implemented.
Using advanced deep learning techniques on the MNIST dataset. Over 98% validation set accuracy.
Code for my master thesis about convolutional neural networks and batch normalization
My solution to the 2nd assignment of UdelaR's Deep Learning course, based on Stanford CS231n.
MXNet Code For Demystifying Neural Style Transfer (IJCAI 2017)
Using the SVHN dataset, a deep CNN is trained in PyTorch with a slightly modified training routine, alongside a more straightforward CNN in Keras.
TensorFlow implementation of real-time style transfer using feed-forward generation. This builds on the original style-transfer algorithm and allows ordinary personal computers to transform images.
A fork of Caffe with some useful layers added; the original Caffe site is https://github.com/BVLC/caffe.
Comparison of various weight and bias initializers
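As a rough illustration of what such a comparison covers, here is a minimal NumPy sketch of two common schemes (the specific initializers compared in that repo are not stated here, so these are representative examples only):

```python
import numpy as np

rng = np.random.default_rng(0)
fan_in, fan_out = 256, 128

# Xavier/Glorot uniform: keeps activation variance roughly stable
# for tanh/sigmoid units by bounding weights to +/- sqrt(6/(fan_in+fan_out))
limit = np.sqrt(6.0 / (fan_in + fan_out))
w_xavier = rng.uniform(-limit, limit, size=(fan_in, fan_out))

# He normal: variance scaled as 2/fan_in, suited to ReLU units
w_he = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

# Biases are commonly just initialized to zero
b = np.zeros(fan_out)
```

Comparing such schemes typically means training the same network with each initializer and plotting loss curves side by side.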
Batch Normalization in standard neural nets using TensorFlow
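For reference, the core batch-normalization transform these projects implement can be sketched in a few lines of NumPy (training-time forward pass only; the learnable `gamma`/`beta` and the epsilon value follow the standard formulation, not any particular repo's code):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch, then scale and shift.

    x:     (batch_size, num_features) activations
    gamma: (num_features,) learnable scale
    beta:  (num_features,) learnable shift
    """
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance per feature
    return gamma * x_hat + beta

# A batch of 4 samples with 3 features each, deliberately off-center
x = np.random.default_rng(1).normal(5.0, 10.0, size=(4, 3))
y = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
# Each column of y now has approximately zero mean and unit variance
```

At inference time, implementations replace the batch statistics with running averages accumulated during training.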
Implementation of different networks for MNIST
Library for building feed-forward neural networks, convolutional networks, linear regression, and logistic regression models.
ImageNet pre-trained models with batch normalization for the Caffe framework
ConvNet in Tensorflow to study the effect of Batch Normalization on the CIFAR-10 dataset
Why Batch Normalization Works so Well (best peer-reviewed project at MLDS, 2017 Spring)
Deep learning models in Python
A Tensorflow implementation of the models described in the paper "Efficient Deep Learning for Stereo Matching"
An image recognition/object detection model that detects handwritten digits and simple math operators. The detected expression (numbers and operators) is then evaluated and solved.