Introduction Image

I love virtual coffee: https://www.paypal.me/MarkSchutera

Deep Learning Lecture

In this repository you will find the Deep Learning Lecture material. Consider the whole lecture a living, vivid thing: an evolving draft with regular over-the-air updates. Feel free to contribute directly, give feedback, report errors, and link additional material (see contact at the bottom). Covered so far:

  • Lecture notes
  • Lecture slides
  • Jupyter notebook tutorials and exercises
  • Flashcards
  • Literature

Lecture notes

Here you will find a draft version of the lecture notes (not available yet) and the lecture slides. Feel free to contribute and fix any errors, typos, and mistakes you might find - thanks. During the lecture, second-screen interaction will be available through sli.do (get the app here: https://www.sli.do/).

  1. Introduction and Deep Learning Foundations
  2. Transfer Learning and Object Detection
  3. Segmentation Networks
  4. Deep Reinforcement Learning
  5. Generative Adversarial Neural Networks

Jupyter notebook tutorials and exercises

  1. Backpropagation and an Introduction to TensorFlow (see the minimal sketch after this list)
  2. Transfer Learning with TensorFlow for Object Classification
  3. Segmentation with U-Net
  4. Deep-Q Reinforcement Learning with the OpenAI gym
  5. Generative Adversarial Neural Networks on MNIST
  6. Recurrent Neural Networks for Language Modelling and Generation
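
As a taste of what the first tutorial covers, here is a minimal sketch of gradient descent with automatic differentiation in TensorFlow 2. It is an illustrative example only, not code taken from the notebooks; the toy linear-regression task and all variable names are assumptions.

    import tensorflow as tf

    # Toy data for y = 3x + 2 with a little noise (hypothetical example data).
    x = tf.random.normal([100, 1])
    y = 3.0 * x + 2.0 + 0.1 * tf.random.normal([100, 1])

    # Trainable parameters of a single linear unit.
    w = tf.Variable(tf.random.normal([1, 1]))
    b = tf.Variable(tf.zeros([1]))
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

    for step in range(200):
        with tf.GradientTape() as tape:
            y_pred = tf.matmul(x, w) + b
            loss = tf.reduce_mean(tf.square(y_pred - y))
        # Backpropagation: gradients of the loss with respect to the parameters.
        grads = tape.gradient(loss, [w, b])
        optimizer.apply_gradients(zip(grads, [w, b]))

    print(w.numpy(), b.numpy())  # should end up close to 3 and 2

tf.GradientTape is TensorFlow's automatic differentiation API; the first notebook covers backpropagation and TensorFlow basics in more depth.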

Flashcards

These flashcards are based on Anki (get the app here: https://apps.ankiweb.net/). They revolve around the fundamentals of deep learning (built based on: http://www.deeplearningbook.org). Once memorized and understood, the content will kick-start you for the course and the examination. The following Anki decks are ready to be imported into your app:

  1. DeepLearningLecture:Introduction
  2. DeepLearningLecture:LinearAlgebra
  3. DeepLearningLecture:ProbabilityAndInformationTheory (tbd)
  4. DeepLearningLecture:NumericalComputation
  5. DeepLearningLecture:MachineLearningBasics (tbd)
  6. DeepLearningLecture:DeepFeedforwardNetworks (tbd)
  7. DeepLearningLecture:RegularizationForDeepLearning (tbd)
  8. DeepLearningLecture:OptimizationForTrainingDeepModels (tbd)
  9. DeepLearningLecture:ConvolutionalNetworks (tbd)
  10. DeepLearningLecture:RecurrentNetworks (tbd)
  11. DeepLearningLecture:PracticalMethodology (tbd)

References and Resources

Below is a set of references to resources in the field. Each entry contains an abstract, a link, and sometimes .pdf or .epub files.

[1] Deep Learning - Ian Goodfellow, Yoshua Bengio, and Aaron Courville
Abstract: The Deep Learning textbook is a resource intended to help students and practitioners enter the field of machine learning in general and deep learning in particular. The online version of the book is now complete and will remain available online for free.
Link: http://www.deeplearningbook.org/

[2] Pattern Recognition and Machine Learning - Christopher M. Bishop
Abstract: This new textbook reflects these recent developments while providing a comprehensive introduction to the fields of pattern recognition and machine learning. It is aimed at advanced undergraduates or first-year PhD students, as well as researchers and practitioners, and assumes no previous knowledge of pattern recognition or machine learning concepts.
Link: http://users.isr.ist.utl.pt/~wurmd/Livros/school/Bishop%20-%20Pattern%20Recognition%20And%20Machine%20Learning%20-%20Springer%20%202006.pdf

[3] Deep Learning and Machine Learning Courses on Coursera - Andrew Ng
Abstract: In these courses, you will learn the foundations of Deep Learning, understand how to build neural networks, and learn how to lead successful machine learning projects. You will learn about Convolutional networks, RNNs, LSTM, Adam, Dropout, BatchNorm, Xavier/He initialization, and more. You will work on case studies from healthcare, autonomous driving, sign language reading, music generation, and natural language processing. You will master not only the theory, but also see how it is applied in industry. You will practice all these ideas in Python and in TensorFlow, which we will teach.
Link: https://www.coursera.org/courses?query=andrew%20ng

[4] CS231n: Convolutional Neural Networks for Visual Recognition - Andrej Karpathy
Abstract: Computer Vision has become ubiquitous in our society, with applications in search, image understanding, apps, mapping, medicine, drones, and self-driving cars. Core to many of these applications are visual recognition tasks such as image classification, localization and detection. Recent developments in neural network (aka “deep learning”) approaches have greatly advanced the performance of these state-of-the-art visual recognition systems. This course is a deep dive into details of the deep learning architectures with a focus on learning end-to-end models for these tasks, particularly image classification. During the 10-week course, students will learn to implement, train and debug their own neural networks and gain a detailed understanding of cutting-edge research in computer vision. The final assignment will involve training a multi-million parameter convolutional neural network and applying it on the largest image classification dataset (ImageNet). We will focus on teaching how to set up the problem of image recognition, the learning algorithms (e.g. backpropagation), practical engineering tricks for training and fine-tuning the networks and guide the students through hands-on assignments and a final course project. Much of the background and materials of this course will be drawn from the ImageNet Challenge.
Link Lecture Series: https://www.youtube.com/watch?v=NfnWJUyUJYU&list=PLkt2uSq6rBVctENoVBg1TpCC7OQi31AlC
Link Lecture Notes: http://cs231n.stanford.edu/

[5] The Missing Semester of Your CS Education
Abstract: Classes teach you all about advanced topics within CS, from operating systems to machine learning, but there’s one critical subject that’s rarely covered, and is instead left to students to figure out on their own: proficiency with their tools. We’ll teach you how to master the command-line, use a powerful text editor, use fancy features of version control systems, and much more! Students spend hundreds of hours using these tools over the course of their education (and thousands over their career), so it makes sense to make the experience as fluid and frictionless as possible. Mastering these tools not only enables you to spend less time on figuring out how to bend your tools to your will, but it also lets you solve problems that would previously seem impossibly complex.
Link Lecture Series: https://missing.csail.mit.edu/

[6] Coffee Table Solutions - full disclosure: I am an author of this one.
Abstract: The neural network was well into training for 42 hours. It all looked good - the gradients were flowing, the weights were updating, and the loss was decreasing. But then came the predictions for validation - all zeroes, no pattern recognized. "What did I do wrong?" I asked my computer, which didn't answer. I noticed the dull feeling of despair and hopelessness rise inside my chest. After some more debugging and wasting more of the precious working hours, I would usually rededicate myself to the books, tutorials, and courses I knew so well by then. Somewhere had to be a hint to what I was missing - there was not. To give myself a boost and to stay well caffeinated, I would usually go and grab a coffee in the coffee kitchen. Standing there at the coffee table with other students, researchers, developers, and practitioners, I would soon find myself indulging in the soft, warm feeling of getting this issue off my chest. And this is where the magic happened. Either someone had already run into a similar issue and knew how to fix it, someone had an idea for narrowing down the root cause, or someone realized there was a conceptual problem in the data set or the model. Most of the time, what was shared at the coffee table were vague hints, interpretations and ideas, heuristics, best practices, and experiences from applying deep learning in research and development. Over time we realized that some pitfalls, issues, and knowledge gaps kept recurring among deep learning novices and were regularly brought to the coffee table while these students developed into deep learning engineers. Shortly after, we started taking notes. Throughout many sessions at the coffee table and hours of debugging, training, and evaluating deep learning approaches, researching and applying state-of-the-art concepts, we polished and enriched these notes with the great ideas that have been around and extensive references for further reading, based on our own experience. The result is this book. We hope it will be of use to you, too.
Link to Kindle Book: https://www.amazon.de/dp/B09QRGWWZP

Acknowledgements

The content of this repo has been used and built up for lectures at Ravensburg-Weingarten University and the Karlsruhe Institute of Technology. Thanks for providing me with the opportunity to do so.

Special Kudos go to ..

.. Hendrik Vogt (KIT), for contributing conceptual and implementation work to the tutorials.

.. Martin Lanz (RWU) and Himanshu Anjaparavanda Kalappa (RWU), for contributing a part of the Anki flashcards.

.. Frank Hafner (ZF), for hinting at great resources and contributing valuable ideas.

Contact

Contact: Mark.Schutera@gmail.com
Google Scholar: https://scholar.google.de/citations?user=jFrk3WoAAAAJ&hl=de
LinkedIn: https://de.linkedin.com/in/schuteramark
