Course Overview
Day 0: Pre-requisites
As part of the pre-requisites, a 2-hour video will be shared covering the following:
- Installing Python, and other packages/software required for the course
- Introduction to Python
Day 1: Introduction to AI/ML (Monday)
Duration: 4 hours Theory + 4 hours Lab
Objectives of the day:
- Understand elements of statistical learning.
- Build Linear/Non-linear hypotheses.
- Use loss functions such as least squares and cross-entropy for regression and classification problems.
- Find solutions using gradient descent.
- Understand issues with Bias & Variance.
- Understand Regularization concepts.
- In the Lab:
- Perform pre-processing steps on a given dataset
- Build a regression/classification model, with regularization (a minimal sketch follows this list)
- Report the error metrics
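A minimal sketch of the Day 1 lab flow, assuming scikit-learn (one common choice; the course does not name its libraries) and a built-in stand-in dataset in place of the dataset provided in class:

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Stand-in dataset; the actual lab dataset is provided in class.
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Pre-processing: standardize features (fit on the training split only to avoid leakage).
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# L2-regularized linear regression (ridge); alpha controls regularization strength.
model = Ridge(alpha=1.0).fit(X_train, y_train)

# Report error metrics on held-out data.
pred = model.predict(X_test)
print("MAE :", mean_absolute_error(y_test, pred))
print("RMSE:", np.sqrt(mean_squared_error(y_test, pred)))
```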
Day 2: Shallow Learning Fundamentals (Tuesday)
Duration: 4 hours Theory + 4 hours Lab
Objectives of the day:
- Understand Neural Networks (NN) Basics
- Deep dive into Perceptron concepts and limitations
- Get to know about Back Propagation, and how Gradient Descent is used in Back Propagation
- Learn practical ways of building Shallow Networks
- Understand Best Practices and their application to real-world problems.
- Learn how neural nets are used to build word2vec representations.
- In the Lab:
- 10-15 minute simple MCQ-based quiz (ROTe – Recall Only Test) on topics covered on Day 1 of Week 1
- Work on the same dataset as used on Day 1, and achieve better accuracy (a sketch of a shallow network trained with gradient descent follows this list)
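A minimal sketch of a shallow network trained with back propagation and gradient descent, written in plain NumPy on toy data (the lab will use the actual Day 1 dataset):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data (stand-in for the Day 1 dataset).
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 tanh units, sigmoid output.
W1, b1 = rng.normal(scale=0.1, size=(2, 8)), np.zeros((1, 8))
W2, b2 = rng.normal(scale=0.1, size=(8, 1)), np.zeros((1, 1))
lr = 0.5

for epoch in range(500):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Gradient of the cross-entropy loss at the output (sigmoid + CE simplifies to p - y).
    dz2 = (p - y) / len(X)

    # Back-propagate gradients layer by layer.
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0, keepdims=True)
    dh = dz2 @ W2.T
    dz1 = dh * (1 - h ** 2)          # derivative of tanh
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0, keepdims=True)

    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("train accuracy:", ((p > 0.5) == y).mean())
```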
Day 3: Deep Learning - Multi-Layer Perceptron (Wednesday)
Duration: 4 hours Theory + 4 hours Lab
Objectives of the day:
- Get to know issues in deepening the nets and techniques to overcome these issues.
- Learn Deep Learning (DL) Basics.
- Deep dive into Regularization, auto-encoders, ReLU activation, hyper-parameter tuning, and transfer learning.
- In the Lab:
- 10-15 minute simple MCQ-based quiz (ROTe – Recall Only Test) on topics covered on Day 2 of Week 1
- Learn to remove noise in data, use an NN as a feature generator for other models, or use an NN as a predictor
- Unsupervised learning using an NN: take high-dimensional data, reduce its dimensionality, and apply clustering (see the sketch after this list)
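A minimal sketch of the unsupervised exercise, assuming TensorFlow/Keras (one common choice; the course framework is not specified) and random stand-in data; the latent width and cluster count are illustrative:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers
from sklearn.cluster import KMeans

# Toy high-dimensional data (stand-in for the lab dataset), scaled to [0, 1].
X = np.random.rand(1000, 64).astype("float32")

# Simple autoencoder: 64 -> 8 -> 64.
encoder = keras.Sequential([keras.Input(shape=(64,)),
                            layers.Dense(32, activation="relu"),
                            layers.Dense(8, activation="relu")])
decoder = keras.Sequential([keras.Input(shape=(8,)),
                            layers.Dense(32, activation="relu"),
                            layers.Dense(64, activation="sigmoid")])
autoencoder = keras.Sequential([encoder, decoder])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(X, X, epochs=10, batch_size=32, verbose=0)

# Use the trained encoder as a dimensionality reducer, then cluster in the latent space.
Z = encoder.predict(X, verbose=0)
labels = KMeans(n_clusters=5, n_init=10).fit_predict(Z)
print(labels[:20])
```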
Day 4: Convolutional Neural Net - for Images (Thursday)
Duration: 4 hours Theory + 4 hours Lab
Objectives of the Day:
- Start with Architecting a Convolutional Neural Network (CNN)
- Learn the Geometry of CNNs
- Understand practical aspects of building CNNs
- Data augmentation
- Object Localization
- How to visualize a convolutional net
- Discuss limitations of deep neural network architectures
- In the Lab:
- 10-15 minute simple MCQ-based quiz (ROTe – Recall Only Test) on topics covered on Day 3 of Week 1
- Build a CNN step-by-step on the CIFAR dataset, including augmentation and ensembling (a sketch follows this list)
- CNN for text (based on a recent research paper)
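A minimal sketch of the CIFAR exercise, assuming TensorFlow/Keras; augmentation is done with Keras preprocessing layers, and the architecture and hyper-parameters are illustrative only:

```python
from tensorflow import keras
from tensorflow.keras import layers

# CIFAR-10: 32x32 RGB images, 10 classes.
(x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = keras.Sequential([
    keras.Input(shape=(32, 32, 3)),
    # Light data augmentation, applied only during training.
    layers.RandomFlip("horizontal"),
    layers.RandomTranslation(0.1, 0.1),
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=10, batch_size=64,
          validation_data=(x_test, y_test))
```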
Day 5: Recurrent Neural Net - for Text (Friday)
Duration: 4 hours Theory + 4 hours Lab
Objectives of the Day:
- Learn basics of Recurrent Neural Networks (RNN)
- Understand how RNN architectures differ from feed-forward networks
- Deep dive into Long Short-Term Memory (LSTM) nets for text mining and time series
- Learn how to scale up deep neural networks, and related issues
- In the Lab:
- 10-15 minute simple MCQ-based quiz (ROTe – Recall Only Test) on topics covered on Day 4 of Week 1
- Build an RNN for Entity Extraction
- Build an RNN for Sentiment Classification/Analysis (see the sketch after this list)
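A minimal sketch of the sentiment-classification exercise, assuming TensorFlow/Keras and its built-in IMDB dataset as a stand-in for the lab data:

```python
from tensorflow import keras
from tensorflow.keras import layers

# IMDB reviews, encoded as integer word indices; keep the 10,000 most frequent words.
vocab_size, max_len = 10000, 200
(x_train, y_train), (x_test, y_test) = keras.datasets.imdb.load_data(num_words=vocab_size)
x_train = keras.preprocessing.sequence.pad_sequences(x_train, maxlen=max_len)
x_test = keras.preprocessing.sequence.pad_sequences(x_test, maxlen=max_len)

model = keras.Sequential([
    layers.Embedding(vocab_size, 64),        # learn word vectors
    layers.LSTM(64),                         # sequence model over the review
    layers.Dense(1, activation="sigmoid"),   # positive/negative sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, batch_size=64,
          validation_data=(x_test, y_test))
```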
Final Project submission & evaluation (remote work in the 2nd week)
Objective:
- ~30 hours during the Project, with primary focus on:
- Pre-Process
- Build a forecasting or regression model
- Image Captioning using CNN and RNN (an architecture sketch follows this list)
- An online MCQ-based test: 30 questions in 50 minutes, scheduled 5 PM to 6 PM
- Submit final project paper the following Friday
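A minimal sketch of one possible image-captioning architecture (CNN image features merged with an LSTM over the partial caption), assuming TensorFlow/Keras; the feature dimension, vocabulary size, and caption length are illustrative assumptions, not course specifications:

```python
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, max_caption_len, feat_dim = 5000, 20, 2048  # assumed sizes

# Image branch: features from a pretrained CNN (e.g. a pooled ResNet50 output), computed offline.
img_in = keras.Input(shape=(feat_dim,))
img_emb = layers.Dense(256, activation="relu")(img_in)

# Text branch: the partial caption generated so far, as integer word indices.
cap_in = keras.Input(shape=(max_caption_len,))
cap_emb = layers.Embedding(vocab_size, 256, mask_zero=True)(cap_in)
cap_feat = layers.LSTM(256)(cap_emb)

# Merge the two branches and predict the next word of the caption.
merged = layers.add([img_emb, cap_feat])
hidden = layers.Dense(256, activation="relu")(merged)
out = layers.Dense(vocab_size, activation="softmax")(hidden)

model = keras.Model([img_in, cap_in], out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```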
* * * * * Graduation * * * * *
Infrastructure access provided:
- 3 deep learning servers, each with 512 GB of RAM, 4 TB of storage, and 40 vCPUs
- Each server has 4 GeForce GTX 1080 Ti cards
- Each graphics card has 11 Gbps GDDR5X memory and an 11 GB frame buffer
- Users will connect through a VPN tunnel to our network/servers to work on projects