2024
Syllabus
Course supervisor: | Prof. Ing. Wanda Benešová, PhD |
Course lecturers: | Ing. Igor Jánoš, Ing. Tibor Sloboda, Ing. Matej Halinkovic |
Supervising department: | |
Course objective: | After completing the course, the student will understand the basic principles of connectionism (artificial neural networks), know the basic neural network models, and be able to use them to solve various problems (e.g. pattern recognition, classification, time series prediction, pattern memorization). Lectures are combined with computer modeling during the exercises. |
Keywords: | deep learning, neural networks |
Form of teaching: | lecture, seminar, 3 weekly tasks, 3 projects |
Course methods: | Form of study: lectures and seminars; weekly: a 2-hour lecture + a 2-hour seminar |
Time allowance (lecture/seminar): | 2/3 |
Course completion: | Mode of assessment and completion: credit and examination. Semester projects: min 15/35 pts; mid-term assessment: min 3/15 pts; necessary minimum: 18 pts out of 60. Final assessment: final exam, min 5 pts out of 40 |
Mode of completion and credits: | Exam (6 credits) |
Type of study: | usual |
Taught for the form of: | full-time, attendance method |
Prerequisites for registration: | none |
Regular assessment: | theoretical questions (3%), weekly assignments (7%), semester project (35%), midterm test (15%) |
Final assessment: | final test (40%) |
- Introduction to neural networks (NN): History, inspiration from biology, basic concepts of connectionism, learning approaches, nonlinearity, back-propagation of errors
- Working with neural networks, machine learning project, training techniques, regularization, optimizers
- Convolutional neural networks, convolutional architectures, siamese networks
- Application of convolutional neural networks, localization, detection
- Recurrent neural networks, neural language processing, attention
- Vision transformers, multi-modal models, foundation models
- Self-supervised pre-training
- Introduction to generative models
- Fun with GANs, advanced generative models
- Diffusion models
- Interpretability, explainability of neural networks
- Neural radiance fields, ChatGPT
TensorFlow, PyTorch, and NumPy are used in the exercises.
Lectures 2023 – lectures are available only in Slovak (SK)
1st Lecture – Course introduction, AI, MLP, Backprop (Lukáš Hudec, Igor Jánoš)
2nd Lecture – Backprop continued + Optimization (Igor Jánoš)
3rd Lecture – Optimizers, Regularization techniques, … (Lukáš Hudec)
4th Lecture – Sequence Processing, Recurrent Networks (Lukáš Hudec)
5th Lecture – Convolutional Neural Networks (CNN) (Lukáš Hudec)
6th Lecture – CNN SOTA (Lukáš Hudec)
7th Lecture – Segmentation, Detection, and Other Applications (Lukáš Hudec)
8th Lecture – Generative Models 1 – Intro (Igor Jánoš)
9th Lecture – Generative Models 2 – Continuation (Igor Jánoš)
10th Lecture – Diffusion Models (Igor Jánoš)
11th Lecture – Explainability, XAI (Lukáš Hudec)
Lectures 2022 – lectures are available only in Slovak (SK)
1st Lecture 15.02.2022 – Introduction, history, and base structures (SLP, MLP) – not recorded
2nd Lecture 22.02.2022 – Feed Forward, Backprop, Introduction to training (Igor Jánoš)
3rd Lecture 01.03.2022 – Optimization of NN, regularization techniques (Igor Jánoš)
4th Lecture 08.03.2022 – Optimizers, Optimization, Regularization
5th Lecture 15.03.2022 – Convolutional Neural Networks – SOTA
6th Lecture 22.03.2022 – CNN Modifications (AE, Siamese, …)
7th Lecture 22.03.2022 – Midterm + CNN Modifications (finishing)
8th Lecture 05.04.2022 – Generative models (VAE + GANs) (Maroš Kollár)
9th Lecture 12.04.2022 – Sequence Data, Recurrent models
10th Lecture 19.04.2022 – Reinforcement Learning (Igor Jánoš)
11th Lecture 26.04.2022 – Interpretability and Explainability
12th Lecture 02.05.2022 – DL Good practices, DL in industrial projects (Lukáš Hudec + guest Marek Šebo)
Lectures 2021
1st Lecture 16.02.2021 – Introduction and history of DL+NN
2nd Lecture 23.02.2021 – Base structures, SLP, MLP
3rd Lecture 02.03.2021 – Feed Forward, introduction to training (we apologize for the low quality of the recording)
4th Lecture 09.03.2021 – Optimization of NN, regularization techniques
5th Lecture 16.03.2021 – Optimizers, Optimization, Regularization
6th Lecture 23.03.2021 – Finishing Regularization, Convolutional Neural Networks
7th Lecture 30.03.2021 – SOTA CNNs, CNN architecture modifications
8th Lecture 06.04.2021 – Sequences, RNN, LSTM, Transformer networks
9th Lecture 13.04.2021 – (Midterm) Generative models – VAE, CNN
10th Lecture 20.04.2021 – Generative models – GANs, Reinforcement Learning
11th Lecture 27.04.2021 – Visualization of NN, Interpretability and Explainability
12th Lecture 04.05.2021 – Best practices for DL projects, DL debugging
The semester is over and all lectures are publicly available here; all that remains is the final exam. Good luck on 4.6.2021.
Seminars 2021
Syllabus
Course supervisor: | Doc. Ing. Wanda Benešová, PhD |
Course lecturers: | Ing. Lukáš Hudec, Ing. Igor Jánoš, Ing. Štefan Grivalský |
Supervising department: | |
Course objective: | After completing the course, the student will understand the basic principles of connectionism (artificial neural networks), know the basic neural network models, and be able to use them to solve various problems (e.g. pattern recognition, classification, time series prediction, pattern memorization). Lectures are combined with computer modeling during the exercises. |
Keywords: | deep learning, neural networks |
Form of teaching: | lecture, seminar, 3 weekly tasks, 3 projects |
Course methods: | Form of study: lectures and seminars; weekly: a 2-hour lecture + a 2-hour seminar |
Time allowance (lecture/seminar): | 2/3 |
Course completion: | Mode of assessment and completion: credit and examination. Weekly assignments: min 2/7 pts; semester projects: min 11/38 pts; mid-term assessment: min 3/15 pts; necessary minimum: 16 pts out of 60. Final assessment: final exam, min 5 pts out of 40 |
Mode of completion and credits: | Exam (6 credits) |
Type of study: | usual |
Taught for the form of: | full-time, attendance method |
Prerequisites for registration: | none |
Regular assessment: | weekly assignments (7%), semester project (38%), midterm test (15%) |
Final assessment: | final test (40%) |
- Introduction to neural networks (NN): history, inspiration from biology, basic concepts of connectionism, learning approaches, nonlinearity.
- Structures of neural networks: binary/continuous perceptron, MLP, activation functions, types of architectures.
- Optimization algorithms and learning: classification, regression, forward propagation and vectorization, the backward error propagation algorithm, SGD. Visualization tools for monitoring NN training.
- Optimization algorithms (2): implementation.
- Optimization and regularization: recommended practices.
- Convolutional neural networks.
- Generative models and other modifications of CNN.
- Networks with memory, recursive models, models with attention.
- Reinforcement learning.
- Prerequisites for the use of NN in practice: building intuition and NN mindset, best practices, troubleshooting.
- Visualization and Interpretability of decisions: visualization of weights and activations, interpretability, explainability of decisions.
- Other architectures and NN use-cases.
TensorFlow, PyTorch, and NumPy are used in the exercises.
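To make the syllabus items "forward propagation and vectorization" and "backward error propagation" concrete, here is a hedged NumPy sketch (an illustration, not course material): a one-hidden-layer network trained on a toy regression task with manually derived, fully vectorized gradients and plain SGD. All sizes, the target function, and the learning rate are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: 64 samples, 3 features, linear target
X = rng.normal(size=(64, 3))
y = (X @ np.array([1.0, -2.0, 0.5]))[:, None]

# Parameters of a one-hidden-layer network (tanh hidden units)
W1 = rng.normal(scale=0.1, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1

def forward(X):
    """Vectorized forward propagation over the whole batch."""
    h = np.tanh(X @ W1 + b1)   # hidden activations, shape (64, 8)
    return h, h @ W2 + b2      # predictions, shape (64, 1)

_, pred = forward(X)
initial_loss = float(np.mean((pred - y) ** 2))

for _ in range(500):
    h, pred = forward(X)
    err = 2 * (pred - y) / len(X)      # dL/dpred for mean squared error
    # Backward error propagation (chain rule), fully vectorized:
    dW2 = h.T @ err
    db2 = err.sum(axis=0)
    dh = err @ W2.T * (1 - h ** 2)     # propagate through tanh
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)
    # Plain SGD update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

_, pred = forward(X)
final_loss = float(np.mean((pred - y) ** 2))
```

Note the design point the lectures emphasize: every gradient is a single matrix expression over the batch, so no per-sample Python loop is needed.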