This repository contains material relevant to the "Deep Into CNN" project.
- Local Setup (Conda recommended):
  https://jupyter.readthedocs.io/en/latest/install/notebook-classic.html
  https://docs.conda.io/projects/conda/en/latest/user-guide/install/index.html#installation
- Optional, basic Python and libraries:
  https://duchesnay.github.io/pystatsml/index.html#scientific-python
- Optional, for those with very basic ML knowledge (only 2.1-2.7):
  https://www.youtube.com/watch?v=PPLop4L2eGk&list=PLLssT5z_DsK-h9vYZkQkYNWcItqhlRJLN
- Linear Regression (a short sketch follows below):
  https://medium.com/analytics-vidhya/simple-linear-regression-with-example-using-numpy-e7b984f0d15e
- Logistic Regression:
  https://towardsdatascience.com/logistic-regression-detailed-overview-46c4da4303bc
Find in the NeuralNetIntro folder: W2-3.
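To sanity-check the regression material above, here is a minimal sketch of linear regression fit by gradient descent in NumPy. The synthetic data, learning rate, and step count are illustrative assumptions, not taken from the linked articles.

```python
import numpy as np

# Synthetic data (assumed for illustration): y = 2x + 1 plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=100)

w, b = 0.0, 0.0   # parameters to learn
lr = 0.01         # learning rate, hand-picked for this toy problem

for _ in range(5000):
    err = (w * x + b) - y          # prediction error
    w -= lr * np.mean(err * x)     # gradient of 0.5*mean(err^2) w.r.t. w
    b -= lr * np.mean(err)         # gradient w.r.t. b

print(f"learned w={w:.2f}, b={b:.2f}")  # should be close to w=2, b=1
```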
- This one is highly recommended:
  https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi
- Some more material (a bit extensive, so be selective):
  https://youtube.com/playlist?list=PLtBw6njQRU-rwp5__7C0oIVt26ZgjG9NI
- Basic Backprop:
  https://ml-cheatsheet.readthedocs.io/en/latest/backpropagation.html
- Backprop (mathematical version):
  https://mattmazur.com/2015/03/17/a-step-by-step-backpropagation-example/
- Softmax (a minimal softmax/NLL sketch follows this list):
  https://ljvmiranda921.github.io/notebook/2017/08/13/softmax-and-the-negative-log-likelihood/
- PyTorch (skip the CNN part for now if you want):
  https://pytorch.org/tutorials/beginner/basics/intro.html
- Optional guide:
  http://neuralnetworksanddeeplearning.com/chap1.html
- PyTorch Autograd:
  https://www.youtube.com/watch?v=MswxJw-8PvE&list=PL-bzqKhHrboYIKgBwoqzl6-eyCHP3aBYs&index=4
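To connect the softmax and autograd links above, here is a minimal sketch that computes softmax probabilities and the negative log-likelihood by hand and lets PyTorch's autograd differentiate them. The logits and targets are made-up toy values; in practice nn.CrossEntropyLoss does all of this in one call.

```python
import torch

# Toy batch: 2 samples, 3 classes (made-up logits).
logits = torch.tensor([[2.0, 0.5, -1.0],
                       [0.1, 1.5,  0.3]], requires_grad=True)
targets = torch.tensor([0, 1])

probs = torch.softmax(logits, dim=1)                       # logits -> probabilities
nll = -torch.log(probs[torch.arange(2), targets]).mean()   # mean of -log p[target]

nll.backward()        # autograd fills logits.grad with d(nll)/d(logits)
print(nll.item())
print(logits.grad)    # equals (probs - one_hot(targets)) / batch_size
```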
Find in the PyTorch folder: W2-3.
June 1 - June 30:
- Kaggle Tabular Playground Series (June 2021):
  https://www.kaggle.com/c/tabular-playground-series-jun-2021
- Find Sample Submission Here
- These will give you a good intuition:
  https://www.youtube.com/watch?v=py5byOOHZM8
  https://www.youtube.com/watch?v=BFdMrDOx_CM
- Also, do check this blog out:
  https://towardsdatascience.com/a-comprehensive-guide-to-convolutional-neural-networks-the-eli5-way-3bd2b1164a53
- Highly recommended:
  https://cs231n.github.io/convolutional-networks/
- Lecture videos (L2-L11 are enough for understanding the implementation details; a small CNN sketch follows this section):
  https://www.youtube.com/playlist?list=PLkDaE6sCZn6Gl29AoE31iwdVwSG-KnDzF
- PyTorch official guide (try after doing the W3 exercises):
  https://pytorch.org/tutorials/beginner/basics/intro.html
Find in the W3 folder.
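Before the W3 exercises, it may help to see one complete (if toy) CNN in PyTorch. The layer sizes below are arbitrary assumptions for 28x28 grayscale inputs such as MNIST, not a prescribed architecture.

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Toy CNN for 28x28 grayscale images (sizes are illustrative)."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 28x28 -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # 14x14 -> 14x14
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = SmallCNN()
out = model(torch.randn(4, 1, 28, 28))    # batch of 4 fake images
print(out.shape)                          # torch.Size([4, 10])
```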
Choose any 1 of the following:
- AlexNet:
  https://papers.nips.cc/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf
- VGG:
  https://arxiv.org/pdf/1409.1556v6.pdf
- Inception (GoogLeNet)*:
  https://arxiv.org/pdf/1409.4842v1.pdf
- Xception*:
  https://openaccess.thecvf.com/content_cvpr_2017/papers/Chollet_Xception_Deep_Learning_CVPR_2017_paper.pdf
- ResNet*:
  https://arxiv.org/pdf/1512.03385v1.pdf

(* Recommended)
- Inception Module:
  https://towardsdatascience.com/deep-learning-understand-the-inception-module-56146866e652
- Separable Convolutions:
  https://towardsdatascience.com/a-basic-introduction-to-separable-convolutions-b99ec3102728
- Implementation of Xception: use the `groups` argument of Conv2d to separate channels (i.e. for depthwise separable convolution); see the sketch after this list:
  https://pytorch.org/docs/stable/generated/torch.nn.Conv2d.html
- Based on the following dataset:
  https://www.kaggle.com/gpiosenka/100-bird-species
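As a starting point for the Xception assignment, here is a minimal sketch of a depthwise separable convolution built from Conv2d's `groups` argument. The channel counts and kernel size are illustrative, not the Xception values.

```python
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise conv (groups=in_channels) followed by a 1x1 pointwise conv."""
    def __init__(self, in_ch: int, out_ch: int, kernel_size: int = 3):
        super().__init__()
        # Depthwise: one filter per input channel, no cross-channel mixing.
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size,
                                   padding=kernel_size // 2, groups=in_ch)
        # Pointwise: 1x1 conv mixes channels and changes their count.
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.pointwise(self.depthwise(x))

block = DepthwiseSeparableConv(32, 64)
print(block(torch.randn(1, 32, 56, 56)).shape)  # torch.Size([1, 64, 56, 56])
```

Setting `groups=in_channels` gives each input channel its own kxk filter, and the 1x1 pointwise convolution then mixes channels; this split is what cuts the parameter count relative to a full convolution.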
- Visualizing MNIST (casual reading, enjoy the animations):
  http://colah.github.io/posts/2014-10-Visualizing-MNIST/
- Optimizers (only the gradient descent variations, Adam, and RMSProp; a short training-step sketch follows this list):
  https://ruder.io/optimizing-gradient-descent/
- SGD with Momentum (mathematical, for future reference):
  https://distill.pub/2017/momentum/
- Weight Initialization:
  https://towardsdatascience.com/weight-initialization-techniques-in-neural-networks-26c649eb3b78
- Batch Norm:
  https://towardsdatascience.com/batch-normalization-in-3-levels-of-understanding-14c2da90a338
- Overfitting, Regularization, Hyper-parameter tuning:
  http://neuralnetworksanddeeplearning.com/chap3.html#how_to_choose_a_neural_network%27s_hyper-parameters
- Complete reference (videos):
  https://www.youtube.com/playlist?list=PLkDaE6sCZn6Hn0vK8co82zjQtt3T2Nkqc
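To make the optimizer and batch-norm reading concrete, here is a minimal training-step sketch. The model, fake data, and hyper-parameters are placeholder assumptions; Adam can be swapped for torch.optim.SGD(..., momentum=0.9) or torch.optim.RMSprop.

```python
import torch
import torch.nn as nn

# Placeholder model with batch norm (sizes assumed for illustration).
model = nn.Sequential(nn.Linear(20, 64), nn.BatchNorm1d(64),
                      nn.ReLU(), nn.Linear(64, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(32, 20)            # fake batch of 32 samples
y = torch.randint(0, 2, (32,))     # fake binary labels

model.train()                      # BatchNorm uses per-batch statistics
optimizer.zero_grad()              # clear gradients from the previous step
loss = loss_fn(model(x), y)
loss.backward()                    # backprop
optimizer.step()                   # one parameter update

model.eval()                       # BatchNorm switches to running statistics
```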