Computer Science

Faculty of Engineering, LTH


Lectures and material

On this page you will find a literature list, some old exams (there are of course not that many...), and lecture slides from both the current and previous years.


Course literature:

As stated in the course plan (see link to the right on the main page for the course), we base the course on the following three books, which we assume you will have access to:

  • Kevin P. Murphy: Machine Learning, A Probabilistic Perspective.
    MIT Press, 2012, ISBN: 9780262018029. 

  • Ian Goodfellow, Yoshua Bengio, Aaron Courville: Deep Learning.
    MIT Press, 2016, ISBN: 9780262035613. 

  • François Chollet: Deep Learning with Python.
    Manning, 2018, ISBN: 9781617294433.

Some lecture content will also refer to other books. However, since they are not formally listed in the course plan, we cannot assume that you have access to them in full printed form. That said, much of the content, related material by the authors, or in some cases even the full book is available online (e.g., here, here, here, and here, and probably at other places as well):

  • Aurélien Géron: Hands-On Machine Learning with Scikit-Learn and TensorFlow. Concepts, Tools, and Techniques to Build Intelligent Systems. O'Reilly Media, 2017, ISBN: 9781491962299.

  • Tom Mitchell: Machine Learning. McGraw Hill, 1997, ISBN: 0070428077.

  • David L. Poole, Alan K. Mackworth: Artificial Intelligence - Foundations of Computational Agents (2e). Cambridge University Press, 2017, ISBN: 9781107195394.

  • Richard S. Sutton and Andrew G. Barto: Reinforcement Learning - An Introduction. 
    MIT Press, 2018, ISBN: 9780262039246


In general:

Regardless of whether the exam is closed or open book (please assume CLOSED BOOK unless explicitly informed otherwise), you should bring a calculator. So far all calculations have been possible to do by hand, but a calculator definitely helps!

If you have questions regarding a specific task, please contact the respective teacher; the responsible teacher is indicated for each task!


Example exam #0 can be found here. We estimate that you will need 2.5 to 3 hours to complete it under exam conditions.

Exam #1 from 2019-01-08 can be found here; a (not entirely complete) solution can be found here.

Exam #2 from 2019-05-02 can be found here; a (sketchy) solution can be found here.

The latest exam will be available through Canvas.


Lecture slides and other material will be made available here, most likely after each lecture. General reading advice for each lecture can be found in the syllabus. 

Jupyter notebooks and material from the tutorial session by Dennis Medved and from Lecture 2 (also helpful for Lab 2).


Lecture no (date) | Teacher   | Material
1 (4/11)          | EAT / all | Slides (videos and images removed)
2 (6/11)          | EAT       | Slides (code example: see above)
3 (11/11)         | EAT       | Slides
4 (13/11)         | EAT       | Slides
5 (18/11)         |           |
6 (20/11)         |           |
7 (25/11)         |           |
8 (27/11)         |           |
9 (2/12)          |           |
10 (4/12)         | EAT       | Slides (updated after the lecture, hopefully with cleaned-up notation for distributions vs. specific probabilities)
11 (9/12)         | EAT       | Slides
12 (11/12)        | VK        | Slides; literature: Sutton & Barto, Csaba Szepesvári
13 (16/12)        | VK        | Slides
14 (18/12)        | VK        | Slides

Lectures 2018

"Old" lecture slides from 2018 (note that not all lectures are stil in the syllabus, but for quite some the material is probably still valid)


Lecture no (date) | Teacher   | Material
1 (5/11)          | EAT / all | Slides (PDF) (videos and images removed)
2 (7/11)          | JM        | Slides (PDF)
3 (12/11)         | JM        | Slides (PDF)
4 (14/11)         |           | Decision Trees (slides by Eric Eaton), Ensemble methods (slides by Eric Eaton)
5 (19/11)         |           |
6 (21/11)         |           |
7 (26/11)         |           |
8 (28/11)         |           |
9 (3/12)          |           |
10 (5/12)         |           |
11 (10/12)        | VK        |
12 (12/12)        | VK        |
13 (17/12)        | EAT       | Slides (PDF)
14 (19/12)        | EAT       | Slides (PDF)