Course, academic year 2023/2024
Deep Learning - NPFL114
Title: Hluboké učení
Guaranteed by: Institute of Formal and Applied Linguistics (32-UFAL)
Faculty: Faculty of Mathematics and Physics
Actual: from 2023
Semester: summer
E-Credits: 7
Hours per week, examination: summer s.:3/2, C+Ex [HT]
Capacity: unlimited
Min. number of students: unlimited
4EU+: no
Virtual mobility / capacity: no
State of the course: not taught
Language: Czech, English
Teaching methods: full-time
Additional information: http://ufal.mff.cuni.cz/courses/npfl114
Guarantor: RNDr. Milan Straka, Ph.D.
Incompatibility: NPFL138
Interchangeability: NPFL138
Is incompatible with: NPFL138
Is interchangeable with: NPFL138
Annotation -
Last update: doc. Mgr. Barbora Vidová Hladká, Ph.D. (25.01.2019)
In recent years, deep neural networks have been used to solve complex machine learning problems and have achieved state-of-the-art results in many areas. The goal of the course is to introduce deep neural networks, from the basics to the latest advances. The course focuses both on theory and on practical aspects: students will implement and train several deep neural networks capable of achieving state-of-the-art results, for example in image recognition, 3D object recognition, speech recognition, image generation, or playing video games.
Aim of the course -
Last update: RNDr. Milan Straka, Ph.D. (19.01.2018)

The goal of the course is to introduce deep neural networks, from the basics to the latest advances. The course focuses both on theory and on practical aspects.

Course completion requirements -
Last update: RNDr. Milan Straka, Ph.D. (05.06.2018)

Students pass the practicals by submitting a sufficient number of assignments. Assignments are announced regularly throughout the semester and are due within several weeks. Given these rules for completing the practicals, the practicals cannot be retaken. Passing the practicals is not a prerequisite for taking the exam.

Literature -
Last update: T_UFAL (25.04.2016)

Ian Goodfellow, Yoshua Bengio, Aaron Courville: Deep Learning, MIT Press, 2016.

Jürgen Schmidhuber: Deep learning in neural networks: An overview, Neural Networks 61 (2015): 85-117.

Sepp Hochreiter, Jürgen Schmidhuber: Long short-term memory, Neural Computation 9.8 (1997): 1735-1780.

Requirements for the exam -
Last update: RNDr. Milan Straka, Ph.D. (15.06.2020)

The exam is written and consists of questions randomly chosen from a publicly known list. The requirements of the exam correspond to the course syllabus, at the level of detail presented in the lectures.

Syllabus -
Last update: RNDr. Milan Straka, Ph.D. (10.05.2020)

Feedforward deep neural networks

  • Basic architectures and activation functions
  • Optimization algorithms for training deep models
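
The topics above can be illustrated with a minimal NumPy sketch, not taken from the course materials: a two-layer network with a ReLU activation, a mean-squared-error loss, manual backpropagation, and one plain SGD update. All shapes and hyperparameters are illustrative assumptions.

  import numpy as np

  rng = np.random.default_rng(42)

  # Toy regression batch: 8 examples, 4 features, 1 target each.
  X = rng.normal(size=(8, 4))
  y = rng.normal(size=(8, 1))

  # Two-layer feedforward network with a hidden width of 16.
  W1 = rng.normal(scale=0.1, size=(4, 16)); b1 = np.zeros(16)
  W2 = rng.normal(scale=0.1, size=(16, 1)); b2 = np.zeros(1)

  # Forward pass.
  h_pre = X @ W1 + b1
  h = np.maximum(h_pre, 0.0)            # ReLU activation
  y_hat = h @ W2 + b2
  loss = np.mean((y_hat - y) ** 2)      # mean squared error

  # Backward pass (manual backpropagation through this small graph).
  g_y = 2 * (y_hat - y) / len(X)
  g_W2 = h.T @ g_y; g_b2 = g_y.sum(0)
  g_h = (g_y @ W2.T) * (h_pre > 0)      # ReLU derivative
  g_W1 = X.T @ g_h; g_b1 = g_h.sum(0)

  # One step of plain stochastic gradient descent.
  lr = 0.1
  W1 -= lr * g_W1; b1 -= lr * g_b1
  W2 -= lr * g_W2; b2 -= lr * g_b2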

Regularization of deep models

  • Classic regularization using parameter norm penalty
  • Dropout
  • Label smoothing
  • Batch normalization
  • Multi-task learning
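
As a minimal illustration of one regularizer listed above, the following NumPy sketch implements inverted dropout; the dropout helper and the rate of 0.5 are assumptions made for the example, not course code.

  import numpy as np

  rng = np.random.default_rng(0)

  def dropout(h, rate=0.5, training=True):
      # Inverted dropout: during training, zero each unit with probability
      # `rate` and rescale the survivors by 1/(1-rate), so nothing needs
      # to change at inference time.
      if not training or rate == 0.0:
          return h
      mask = rng.random(h.shape) >= rate
      return h * mask / (1.0 - rate)

  h = rng.normal(size=(2, 5))
  print(dropout(h, rate=0.5, training=True))    # roughly half the units zeroed
  print(dropout(h, rate=0.5, training=False))   # unchanged at inference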

Convolutional neural networks

  • Convolutional and pooling layers
  • Architectures suitable for very deep convolutional networks
  • State-of-the-art models for image recognition, object localization and image segmentation
  • Pre-training and finetuning of deep neural networks
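
A small NumPy sketch of the convolutional and pooling layers listed above, assuming a single input channel, a single filter, no padding, and stride 1; the conv2d and max_pool2d helpers and the edge-detecting kernel are illustrative choices, not the course's implementation.

  import numpy as np

  def conv2d(image, kernel):
      # Valid 2D cross-correlation of one single-channel image with one
      # kernel, the core operation of a convolutional layer.
      H, W = image.shape
      kH, kW = kernel.shape
      out = np.empty((H - kH + 1, W - kW + 1))
      for i in range(out.shape[0]):
          for j in range(out.shape[1]):
              out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
      return out

  def max_pool2d(x, size=2):
      # Non-overlapping max pooling with a square window.
      H, W = x.shape
      x = x[:H - H % size, :W - W % size]
      return x.reshape(H // size, size, W // size, size).max(axis=(1, 3))

  image = np.arange(36, dtype=float).reshape(6, 6)
  kernel = np.array([[1.0, -1.0]])      # a simple horizontal edge detector
  print(max_pool2d(conv2d(image, kernel)))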

Recurrent neural networks

  • Basic recurrent network, specifics of training
  • Long short-term memory
  • Gated recurrent units
  • Bidirectional and deep recurrent networks
  • Encoder-decoder sequence-to-sequence architectures
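
The following NumPy sketch performs one step of a standard LSTM cell as listed above; stacking the four gate blocks into a single weight matrix and all of the sizes are illustrative assumptions rather than the course's reference implementation.

  import numpy as np

  def sigmoid(x):
      return 1.0 / (1.0 + np.exp(-x))

  def lstm_step(x, h_prev, c_prev, W, b):
      # One LSTM step. W has shape (input_dim + hidden_dim, 4 * hidden_dim);
      # the four blocks are the input, forget and output gates and the
      # candidate cell state.
      z = np.concatenate([x, h_prev]) @ W + b
      i, f, o, g = np.split(z, 4)
      i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
      g = np.tanh(g)
      c = f * c_prev + i * g            # new cell state (the "memory")
      h = o * np.tanh(c)                # new hidden state
      return h, c

  rng = np.random.default_rng(1)
  input_dim, hidden_dim = 3, 4
  W = rng.normal(scale=0.1, size=(input_dim + hidden_dim, 4 * hidden_dim))
  b = np.zeros(4 * hidden_dim)
  h, c = np.zeros(hidden_dim), np.zeros(hidden_dim)
  for x in rng.normal(size=(5, input_dim)):   # run over a sequence of length 5
      h, c = lstm_step(x, h, c, W, b)
  print(h)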

Practical methodology

  • Choosing suitable architecture
  • Hyperparameter selection
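
A minimal sketch of hyperparameter selection by random search, as mentioned above; the validation_score stand-in and the sampled ranges are purely illustrative assumptions, since in practice the score would come from training a model and evaluating it on validation data.

  import math, random

  random.seed(0)

  def validation_score(lr, batch_size, dropout):
      # Stand-in for "train a model with these hyperparameters and return
      # its validation accuracy"; the formula is arbitrary, for illustration.
      return -(math.log10(lr) + 3) ** 2 - (dropout - 0.3) ** 2 + 0.001 * batch_size

  best = None
  for _ in range(20):
      trial = {
          "lr": 10 ** random.uniform(-5, -1),       # learning rate on a log scale
          "batch_size": random.choice([32, 64, 128]),
          "dropout": random.uniform(0.0, 0.5),
      }
      score = validation_score(**trial)
      if best is None or score > best[0]:
          best = (score, trial)
  print(best)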

Natural language processing

  • Distributed word representations
  • Character-level word embeddings
  • Transformer architecture
  • State-of-the-art POS tagging, named entity recognition, machine translation, image labeling
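
The Transformer architecture listed above is built around scaled dot-product attention, sketched below in NumPy for a single unbatched sequence; the shapes are illustrative, and masking and multiple heads are omitted.

  import numpy as np

  def scaled_dot_product_attention(Q, K, V):
      # softmax(Q K^T / sqrt(d_k)) V, the attention operation at the heart
      # of the Transformer, computed for one unbatched sequence.
      d_k = Q.shape[-1]
      scores = Q @ K.T / np.sqrt(d_k)
      weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
      weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
      return weights @ V

  rng = np.random.default_rng(2)
  seq_len, d_model = 4, 8
  Q, K, V = (rng.normal(size=(seq_len, d_model)) for _ in range(3))
  print(scaled_dot_product_attention(Q, K, V).shape)    # (4, 8)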

Deep generative models

  • Variational autoencoders
  • Generative adversarial networks
  • Speech generation
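
A minimal NumPy sketch of the reparameterization trick and the KL regularization term used in variational autoencoders, one of the models listed above; the example mean and log-variance values are arbitrary illustrations, not course code.

  import numpy as np

  rng = np.random.default_rng(3)

  def reparameterize(mu, log_var):
      # Sample z ~ N(mu, sigma^2) as mu + sigma * eps with eps ~ N(0, I),
      # so the sampling step stays differentiable w.r.t. mu and log_var.
      eps = rng.normal(size=mu.shape)
      return mu + np.exp(0.5 * log_var) * eps

  def kl_to_standard_normal(mu, log_var):
      # KL divergence between N(mu, sigma^2) and N(0, I), summed over
      # dimensions; this is the regularization term of the VAE loss.
      return 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var)

  mu = np.array([0.2, -0.1])
  log_var = np.array([-1.0, 0.3])
  print(reparameterize(mu, log_var), kl_to_standard_normal(mu, log_var))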

Structured prediction

  • CRF layer
  • CTC loss and its application in state-of-the-art speech recognition
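
As a small illustration of the CTC topic above, the sketch below shows only the CTC collapsing rule (merge repeated symbols, then drop blanks) that maps a frame-level labelling to an output; the ctc_collapse helper is an assumption made for the example, and the full CTC loss additionally sums the probabilities of all labellings that collapse to the target.

  def ctc_collapse(frame_labels, blank="-"):
      # Merge consecutive repeated symbols, then remove the blank symbol.
      merged = [frame_labels[0]] if frame_labels else []
      for symbol in frame_labels[1:]:
          if symbol != merged[-1]:
              merged.append(symbol)
      return [s for s in merged if s != blank]

  # "hheel--lll-o" collapses to "hello": the blanks keep the two l's distinct.
  print("".join(ctc_collapse(list("hheel--lll-o"))))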

Introduction to deep reinforcement learning

Neural networks with external memory

Entry requirements -
Last update: doc. RNDr. Vladislav Kuboň, Ph.D. (05.06.2018)

Basic programming skills in Python are required. No previous knowledge of artificial neural networks is needed, but basic understanding of machine learning is advisable.

 