Syllabus (subject to adjustments)
Please note: this syllabus is under construction; as long as this note is visible, changes may occur at any time.
General information:
The course comprises 14 lectures and 7 lab sessions, including 4 examination-relevant assignments, plus an optional written exam.
The lab sessions / assignments are conducted in pairs; exceptions can be made, but they remain exactly that: exceptions!
Attendance at Lab session 1 is compulsory to keep your seat in the course. If you do not attend and cannot present confirmation of a valid reason for your absence, your place may be given to the next student on the waiting list!
Lab sessions 2 and 5 are compulsory in the sense that you must present your solution to the respective assignment(s) to a teacher or teaching assistant during the session. If you cannot participate, notify the course responsible teacher (Elin). It is possible to show solutions in a following session or, if necessary, in an extra session at the end of the course; however, this should be the exception, not the rule! Do not count on extra efforts on our part if you deviate from the schedule without a justification (illness, for example).
The assignments (lab sessions) 3, 4, 6, and 7 form the main examination for the course. You must present your solution to the respective assignment(s) to a teacher or teaching assistant during the session; this can still be done in pairs. In addition, you need to hand in an individually written / produced report according to the respective instructions after the lab session. If you cannot participate, notify the course responsible teacher (Elin). It is possible to show solutions in a following session or, if necessary, in an extra session at the end of the course; however, this should be the exception, not the rule! Do not count on extra efforts on our part if you deviate from the schedule without a justification (illness, for example).
Passing all lab assignments gives you the full 7.5 credits for the course and the grade pass (3). A higher grade on the U/3/4/5 scale can be obtained by successfully participating in the optional written exam; the exam grade then becomes the final grade for the course.
The course has three main teachers: Elin A. Topp (EAT, formally course responsible, Elin_Anna.Topp at cs.lth.se), Pierre Nugues (PN, Pierre.Nugues at cs.lth.se), and Volker Krueger (VK, Volker.Krueger at cs.lth.se). Information about the course assistants can be found together with the information on labs and assignments.
| course week (calendar week) | Lectures | Lab sessions | Teacher(s) | Reading advice |
|---|---|---|---|---|
| 1 (45) | (1) Overview lecture, introduction to ML, ML types and application areas | | EAT / all | Goodfellow chapter 1; Murphy chapter 1 |
| 1 (45) | (2) Fundamentals: concept learning, linear algebra, matrix operations, numpy, jupyter | | EAT | Goodfellow chapter 2; Chollet chapter 2; Mitchell chapter 2 |
| 1 (45) | | (1) Lab 1: Python, Numpy, matrices | PN | |
| 2 (46) | (3) Decision trees, probability and information theory, some outlook on evaluation | | EAT | Goodfellow |
| 2 (46) | (4) Fundamental techniques 2: decision trees / random forests, ensemble methods, a bit of clustering (k-means), instance-based learning (k-NN) | | EAT | Mitchell chapters 3, (6, 8); Géron chapter 7; slides of lectures 2 and 3 from 2018 |
| 2 (46) | | (2) Lab 2: Decision trees | EAT | |
| 3 (47) | (5) Fundamental machine learning techniques 3: linear regression, gradient descent, classification, perceptron, logistic regression. Loss, regularization, cost, evaluation, overfitting. Neural networks, feed-forward networks, backpropagation. Implementation of feed-forward networks with Keras. | | PN | Chollet chapters 3, 4 |
| 3 (47) | (6) Convolutional neural networks, application to vision. Implementation with Keras. | | PN | Chollet chapter 5 |
| 3 (47) | | (3) Assignment 1: Image classification with CNNs | PN | |
| 4 (48) | (7) Recurrent neural networks, application to sequences | | PN | Chollet chapter 6 |
| 4 (48) | (8) Other kinds of recurrent networks: LSTM and GRU. Sequence-to-sequence learning. Application to translation. | | PN | |
| 4 (48) | | (4) Assignment 2: Sequence tagging with RNNs | PN | |
| 5 (49) | (9) Autoencoders and generative learning. Application to image generation. | | PN | Chollet chapter 8 |
| 5 (49) | (10) Probability theory / Bayesian networks and classifiers, NCC / NBC / Gaussian mixture models | | EAT | Slides of lectures 11 and 12 from 2018; Goodfellow chapter 3; Murphy chapter 2; Mitchell chapter 6 |
| 5 (49) | | (5) Lab 3: Sign classification with NBC | EAT | |
| 6 (50) | (11) Bayesian learning, EM and k-means | | EAT | Lecture slides 4 and 5, EDAF70 2018; Murphy chapter 6 |
| 6 (50) | (12) Markov decision processes / intro to reinforcement learning | | VK | see lecture slides under material / lectures |
| 6 (50) | | (6) Assignment 3: Learning an NBC with EM | EAT | |
| 7 (51) | (13) Reinforcement learning | | VK | see lecture slides under material / lectures |
| 7 (51) | (14) Reinforcement learning 2, RL and robotics | | VK | see lecture slides under material / lectures |
| 7 (51) | | (7) Assignment 4: Reinforcement learning | VK | |