Information Theory in Deep Learning

This repository contains implementations (mostly in PyTorch), relevant resources, and lessons related to the information theory of deep learning. The aim is to collect the material on this topic in one place.

Resources

Videos and Talks

Research Papers

[1] Naftali Tishby and Noga Zaslavsky, "Deep learning and the information bottleneck principle," IEEE Information Theory Workshop (ITW), 2015.

[2] Ravid Schwartz-Ziv and Naftali Tishby, "Opening the Black Box of Deep Neural Networks via Information," ICRI-CI, 2017.

[3] Naftali Tishby, Fernando C. Pereira, and William Bialek, "The information bottleneck method," Allerton Conference on Communication, Control, and Computing, 1999. (The objective this paper introduces is written out after this list.)

[4] Mohamed Ishmael Belghazi, Aristide Baratin, Sai Rajeswar, Sherjil Ozair, Yoshua Bengio, Aaron Courville, and R Devon Hjelm, "Mutual Information Neural Estimation," ICML, 2018. (A minimal PyTorch sketch of the estimator follows this list.)

[5] Ben Poole, Sherjil Ozair, Aaron van den Oord, Alexander A. Alemi, and George Tucker, "On variational lower bounds of mutual information," NeurIPS, 2018.

[6] R Devon Hjelm, Alex Fedorov, Samuel Lavoie-Marchildon, Karan Grewal, Phil Bachman, Adam Trischler, and Yoshua Bengio, "Learning deep representations by mutual information estimation and maximization," arXiv:1808.06670 [stat.ML], 2018.
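
For orientation, the objective that [1] builds on and [3] introduces can be stated compactly: a representation T of the input X is produced by an encoder p(t|x) that compresses X while preserving information about the target Y, with a trade-off parameter beta. A standard form of the information bottleneck Lagrangian is:

```latex
% Information bottleneck Lagrangian: compress X into T while
% keeping T informative about Y; beta sets the trade-off.
\min_{p(t \mid x)} \; I(X; T) \;-\; \beta \, I(T; Y)
```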
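
To make the estimation side concrete, here is a minimal, illustrative PyTorch sketch of the Donsker-Varadhan lower bound that MINE [4] maximizes. The network architecture, sizes, learning rate, and the toy Gaussian data are assumptions chosen for the example, not the paper's exact setup:

```python
# A minimal sketch of the Donsker-Varadhan (DV) bound maximized in MINE [4]:
#   I(X; Y) >= E_p(x,y)[T(x, y)] - log E_p(x)p(y)[exp(T(x, y))]
# Architecture, sizes, and training settings here are illustrative assumptions.
import math
import torch
import torch.nn as nn

class StatisticsNet(nn.Module):
    """Small MLP T(x, y) scoring joint pairs against shuffled (marginal) pairs."""
    def __init__(self, x_dim, y_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=1)).squeeze(1)

def dv_lower_bound(T, x, y):
    joint_term = T(x, y).mean()                  # expectation over joint samples
    y_shuffled = y[torch.randperm(y.size(0))]    # break the pairing -> marginals
    marg_scores = T(x, y_shuffled)
    # log of the empirical mean of exp(T), computed stably with logsumexp
    marg_term = torch.logsumexp(marg_scores, dim=0) - math.log(marg_scores.size(0))
    return joint_term - marg_term

# Toy usage: Y = X + 0.5 * noise, so the true MI is 0.5 * ln(5) ~ 0.80 nats.
x = torch.randn(512, 1)
y = x + 0.5 * torch.randn(512, 1)
T = StatisticsNet(1, 1)
opt = torch.optim.Adam(T.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = -dv_lower_bound(T, x, y)  # maximize the bound = minimize its negative
    loss.backward()
    opt.step()
print(f"DV estimate of I(X; Y): {-loss.item():.3f} nats")
```

Note that the full MINE algorithm additionally corrects the biased gradient of the log term with an exponential moving average; that refinement is omitted here for brevity.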

Blog posts and Articles
