Deep Learning Bootcamp

The objective of the Deep Learning bootcamp is to give participants enough theory and practical experience to build a deep learning solution in computer vision and natural language processing. After the bootcamp, all participants will be familiar with the following key concepts and be able to apply them to a problem.

Key Deep Learning Concepts

  • Theory: DL Motivation, Back-propagation, Activation
  • Paradigms: Supervised Learning
  • Models: Sequential Models, Pre-trained Models (Transfer Learning)
  • Methods: Perceptron, Dense, Convolution, Pooling, Dropout, Recurrent, LSTM, Embedding
  • Process: Setup, Encoding, Training, Serving
  • Tools: python-data-stack, keras, tensorflow
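
To make the mapping from these concepts to code concrete, here is a minimal, illustrative sketch of a Keras Sequential model that wires together Convolution, Pooling, Dropout, and Dense layers. It is not taken from the bootcamp notebooks; the MNIST-style 28x28 grayscale input shape and the 10-class output are assumptions for illustration.

```python
# Minimal sketch, not from the bootcamp notebooks.
# Assumes MNIST-style inputs: 28x28 grayscale images, 10 classes.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),  # Convolution
    layers.MaxPooling2D((2, 2)),                                            # Pooling
    layers.Dropout(0.25),                                                   # Dropout regularisation
    layers.Flatten(),
    layers.Dense(128, activation="relu"),                                   # Dense hidden layer
    layers.Dense(10, activation="softmax"),                                 # 10-class output
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=5)  # Training step; data loading not shown.
```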

Notebooks

Learning Resources by Authors

External References

These are reference materials with excellent explanations - visual, interactive, math- or code-driven, in text, video, app or notebook format - of Machine Learning and Deep Learning. We have found them useful in our own learning journey, and we hope they will help you in yours.

  • Basics: Python, Numpy and Math

  • Basics of Machine Learning

  • Deep Learning Basics

    • Want a visual understanding of Deep Learning? Start with these four videos by @3blue1brown on Neural Networks. (Video, Visual)
    • Want to learn how to create a neural network? Go and play with all the knobs and options to build and train a simple neural network at the TensorFlow Playground. (Website, Interactive)
    • How can neural networks compute any function? Read this visual proof by Michael Nielsen in Chapter 4 of Neural Networks and Deep Learning. (Text, Visual)
    • Why are simple neural networks (like MLPs) hard to train? Here is a good explanation of the concept of vanishing and exploding gradients in Deep Learning - Chapter 5. (Text, Visual & Code)
  • Learning & Optimization

  • Deep Learning for Images

    • What is a Convolution? Get a basic understanding of convolution in this example-driven post - Understanding Convolution. (Text, Visual)
    • How do you build a Convolutional Neural Network? The CS231n course notes on Convolutional Networks are a concise read on this. (Text, Visual)
    • Want to play with convolution filters? Check out the interactive explainer of convolution at the ML4a demos site. (App, Interactive)
    • Need more analogies for Convolutional Neural Nets? Check out this excellent post explaining CNNs through the different lenses of image processing, fluid mechanics, statistics, and information theory: Understanding Convolution in Deep Learning. (Text, Visual)
    • Why are we doing transfer learning? Here is a good way to think about possible approaches to adopt for transfer learning when using CNNs; a minimal code sketch appears at the end of this reference list. (Text)
  • Deep Learning for NLP

    • Confused by all this embedding stuff? Read this post on Representation and NLP to understand why embeddings are so effective in Deep Learning. (Text, Visual)
    • Want to understand word embeddings? Start with this elegant post - A Word is Worth a Thousand Vectors. (Text, Visual)
    • How does this word2vec stuff relate to statistical methods? This article with a click-bait title - Stop using word2vec - will help you place all these methods in a simple, unified framework. (Text, Visual)
    • Need to dive deeper into the math of word embeddings? Start with these four posts by Sebastian Ruder on word embeddings: Part 1 - Basics, Part 2 - Softmax, Part 3 - Word2Vec, Part 5 - Recent Trends. (Text, Math)
    • Why are we using Recurrent Neural Networks? Karpathy's article The Unreasonable Effectiveness of RNNs is a wonderful introduction to the topic, with code to do fun things with them. (Text, Visual & Code)
    • What are LSTMs? Start with this visual unpacking of what happens within an LSTM node - Understanding LSTMs. (Text, Visual)
    • Still confused by all these DL approaches to text? This post by spaCy's creator - Deep Learning Formula for NLP - frames the DL process for NLP as four steps: Embed, Encode, Attend, Predict. (Text, Visual)
    • Want practical steps for using Deep Learning for text classification? Check out how to build a DL model, along with consolidated best-practice advice, in Google's Text Classification Guide. (Text, Visual & Code)
    • Doing more exotic NLP stuff? Then check out this article on the current Best approaches for Deep Learning in NLP. (Text, Math)
  • Visualisation

    • Why do we want to visualise & understand NNs? This post - Visualising Representations - will give you a basic understanding of the process of visualising NNs for human beings. (Text, Visual)
    • Want to visualise networks and learning? Use the TensorBoard callback to start doing that from your notebooks. (App, Interactive)
    • Want to see visualisations of DL layers? Go and check out the demos on the Keras.js website. (App, Visual & Interactive)
    • Want to understand why we use all these dimensionality-reduction approaches? Start by reading the interactive piece by Christopher Olah on Visualising MNIST. (Text, Visual & Interactive)
    • Want to look at your embeddings in 2D/3D? Check out the embedding projector; you can run it on your own data using TensorBoard. (App, Interactive)
    • What is the neural network really learning in images? Check out these articles on Feature Visualisation and The Building Blocks of Interpretability. (Text & Notebooks, Visual & Interactive)
  • Continue (Your) Learning on (Deep) Learning

    • Want to learn using notebooks on Deep Learning? Explore the collection of interactive ML examples at Seedbank.
    • More of a book person? For an applied text, our recommendation is the very practical book by François Chollet (the creator of Keras) - Deep Learning with Python.
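
As mentioned in the transfer-learning bullet under "Deep Learning for Images", here is a minimal, hedged sketch of one common transfer-learning approach in Keras: load a pre-trained convolutional base, freeze it, and train only a new classifier head. It is not from the bootcamp notebooks; the 150x150 RGB input shape and the 5-class output are assumptions for illustration.

```python
# Illustrative transfer-learning sketch, not from the bootcamp notebooks.
# Assumes 150x150 RGB inputs and 5 target classes.
from tensorflow import keras
from tensorflow.keras import layers

# Load a pre-trained convolutional base without its ImageNet classifier head.
base = keras.applications.VGG16(weights="imagenet",
                                include_top=False,
                                input_shape=(150, 150, 3))
base.trainable = False  # Freeze the pre-trained features.

# Stack a new, trainable classifier head on top of the frozen base.
model = keras.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(5, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=5)  # Training data not shown.
```

Fine-tuning (unfreezing the top few convolutional layers once the new head has converged) is the usual next step, and is covered in the transfer-learning reference above.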