Deep Learning in five steps
Undergraduate course, Wroclaw University, Faculty of Physics and Astronomy, 2023
Lecture 1
- 1.1. Preliminaries
- 1.2. My Work
- 1.3. Some historical remarks
- 1.4. Deep Learning (DL)
- 1.4.1. DL in AI
- 1.4.2. Machine Learning (ML)
- 1.4.3. Deep learning
- 1.4.4. What types of problems DL solves
- 1.5. Linear regression problem
- 1.5.1. Some theory
- 1.5.2. Simple implementation
- 1.6. Keras for linear regression
Lecture 2
- 2.1. Generalities of Keras
- 2.2. Nonlinear regression - Neural Networks
- 2.3. Universal approximation theorem
- 2.4. Propagation of signal and back-propagation of error
- 2.5. The general strategies for Gradient Descent Optimization
- 2.6. The popular gradient descent algorithms
- 2.6.1. Gradient Descent with the momentum
- 2.6.2. AdaGrad
- 2.6.3. RMSProp
- 2.6.4. Adaptive Moment Estimation (Adam)
- 2.7. Predicting Boston House Prices - a nonlinear regression problem
- 2.8. Vanishing gradient problem
Lecture 3
- 3.1. Cross-Entropy for two classes - Binary Cross-Entropy
- 3.2. Binary classification: IMDB data
- 3.3. Multiple independent attributes
- 3.4. More than two mutually exclusive classes
- 3.5. MNIST: a “Hello World” of Deep Learning
- 3.6. Basics of the Convolutional Neural Network (CNN)
- 3.7. General remarks on ConvNet
- 3.8. ConvNet in Keras
- 3.9. DropOut Layer
- 3.10. Batch normalization layer
- 3.11. MNIST from ConvNet
Lecture 4
- 4.1. Getting Data From Kaggle
- 4.2. Preprocessing and data augmentation
- 4.2.1. Defining the Model
- 4.2.2. Data Processing – Generator
- 4.2.3. Data Augmentation
Lecture 5
- 5.1. Pretrained models