Neural Networks

Lectures

1st Lecture 16.02.2021 – Introduction and history of DL+NN
2nd Lecture 23.02.2021 – Base structures, SLP, MLP
3rd Lecture 02.03.2021 – Feed Forward, introduction to training (we apologize for the low quality of the recording)
4th Lecture 09.03.2021 – Optimization of NN, regularization techniques
5th Lecture 16.03.2021 – Optimizers, Optimization, Regularization
6th Lecture 23.03.2021 – Finishing Regularization, Convolutional Neural Networks
7th Lecture 30.03.2021 – SOTA CNNs, CNN architecture modifications
8th Lecture 06.04.2021 – Sequences, RNN, LSTM, Transformer networks
9th Lecture 13.04.2021 – (Midterm) Generative models – VAE, CNN
10th Lecture 20.04.2021 – Generative models – GANs, Reinforcement Learning
11th Lecture 27.04.2021 – Visualization of NN, Interpretability and Explainability
12th Lecture 04.05.2021 – Best practices for DL projects, DL debugging

Tuesday, 14:00 – 15:50    MS Teams

Seminars 2021

Syllabus

Course supervisor: Doc. Ing. Wanda Benešová, PhD
Course lecturers: Ing. Lukáš Hudec
Ing. Igor Jánoš
Ing. Štefan Grivalský
Supervising department: Institute of Computer Engineering and Applied Informatics (UPAI FIIT)

Course objective: After completing the course, the student will understand the basic principles of connectionism (of artificial neural networks), know the basic neural network models, and know how to use them for solving various problems (e.g. pattern recognition, classification, time series prediction, pattern memorization, etc.). Lectures are combined with computer modeling during exercises.
Keywords: deep learning, neural networks
Form of teaching: lecture, seminar, 3 weekly tasks, 3 projects
Course methods: lectures and seminars
Weekly: 2-hour lecture + 2-hour seminar
Time allowance (lecture/seminar): 2/3
Course completion: mode of assessment: credit and examination
Weekly assignments: min 2/7 pts
Semestral projects: min 11/38 pts
Mid-term assessment: min 3/15 pts
Necessary minimum: 16 pts out of 60

Final assessment: final exam, min 5 pts out of 40
Mode of completion and credits: Exam (6 credits)
Type of study: usual
Taught for the form of: full-time, attendance method
Prerequisites for registration: none
Regular assessment: weekly assignments (7%)
semester project (38%)
midterm test (15%)
Final assessment: final test (40%)
Lecture topics:
  1. Introduction to neural networks (NN): History, inspiration from biology, basic concepts of connectionism, learning approaches, nonlinearity.
  2. Structures of neural networks: binary/continuous perceptron, MLP, activation functions, types of architectures.
  3. Optimization algorithms and learning: classification, regression, forward propagation and vectorization, the backpropagation algorithm, SGD. Visualization tools for monitoring NN training.
  4. Optimization algorithms (2): implementation
  5. Optimization and regularization: recommended practices.
  6. Convolutional neural networks.
  7. Generative models and other modifications of CNN.
  8. Networks with memory, recurrent models, models with attention.
  9. Reinforcement learning.
  10. Prerequisites for the use of NN in practice: building intuition and NN mindset, best practices, troubleshooting.
  11. Visualization and Interpretability of decisions: visualization of weights and activations, interpretability, explainability of decisions.
  12. Other architectures and NN use-cases.

TensorFlow, PyTorch, and NumPy are used in exercises.
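To give a flavour of the from-scratch NumPy side of the exercises, here is a minimal sketch (my own illustrative example, not official course material) of a continuous perceptron — a single sigmoid neuron, as covered in the lecture on base structures — trained with vectorized gradient descent on a toy dataset. The logical-AND task, learning rate, and epoch count are arbitrary demonstration choices:

```python
import numpy as np

# Illustrative sketch only: a continuous (sigmoid) perceptron trained by
# batch gradient descent on the logical AND function.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([0.0, 0.0, 0.0, 1.0])                           # AND targets

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=2)   # weights
b = 0.0                             # bias
lr = 0.5                            # learning rate

for _ in range(2000):
    out = sigmoid(X @ w + b)        # forward pass, vectorized over all samples
    err = out - y                   # cross-entropy gradient w.r.t. the pre-activation
    w -= lr * (X.T @ err) / len(X)  # gradient step on the weights
    b -= lr * err.mean()            # gradient step on the bias

pred = (sigmoid(X @ w + b) > 0.5).astype(int)
print(pred.tolist())                # learned AND: [0, 0, 0, 1]
```

The same loop generalizes to an MLP by stacking layers and propagating the error backwards through each one — which is exactly what frameworks like PyTorch and TensorFlow automate.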