Machine Learning (AIS): Module I

General information

Degree: Master of Science in Artificial Intelligence Systems
Period: September - December

Objectives

The course studies the fundamentals of machine learning, covering supervised and unsupervised learning methods. It includes application examples as well as laboratory exercises. By the end of the course, students are expected to have acquired the skills needed to design methods and tools for the analysis of signals and data.

Prerequisites (suggested)

Linear algebra, probability theory, Boolean algebra, and knowledge of a programming language. For a good introduction to linear algebra see: Gilbert Strang, Introduction to Linear Algebra, Wellesley-Cambridge Press, 2016.

Content

The second part of Module I (taught by me) covers the following topics: linear discriminant functions (perceptron, least-squares regression, SVM); kernel machines (kernel methods, kernels, kernels on structures); learning Bayesian networks (parameter learning, structure learning); reinforcement learning; and unsupervised learning.
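
As a rough illustration of the kind of exercise covered in the lab sessions (see the Scikit-learn lab in the material below), the following minimal sketch trains a kernel SVM on a toy dataset with scikit-learn. The dataset, parameter values, and code are only an example, not official course material.

    # Minimal illustrative sketch (not official course material): a kernel SVM
    # trained with scikit-learn on a toy dataset.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Load a small benchmark dataset and split it into training and test sets.
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)

    # An RBF kernel implicitly maps inputs into a high-dimensional feature
    # space, allowing a non-linear decision boundary in the input space.
    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    clf.fit(X_train, y_train)
    print("Test accuracy:", clf.score(X_test, y_test))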

Mode

Teaching for this course is in blended mode.

  • Students enrolled in the first year of the Master of Science in Artificial Intelligence Systems may attend the course in person.
  • All other students should follow remotely. The details of the Zoom meeting are available on the Moodle community.

Course Information (part taught by me)

Instructor: Andrea Passerini
Email:
Teaching assistant: Giovanni Pellegrini
Email: giovanni.pellegrini@unitn.it
Office hours: Arrange by email
Lecture time: Monday 11:30-13:30; Wednesday 9:30-11:30
Communications: Please check the Moodle page of the course for news and updates.
Bibliography: R.O. Duda, P.E. Hart and D.G. Stork, Pattern Classification (2nd edition), Wiley-Interscience, 2001.
D. Koller and N. Friedman, Probabilistic Graphical Models, The MIT Press, 2009.
J. Shawe-Taylor and N. Cristianini, Kernel Methods for Pattern Analysis, Cambridge University Press, 2004.
I. Goodfellow, Y. Bengio and A. Courville, Deep Learning, The MIT Press, 2016 (online version available here).
Slides: Bayesian Networks [slides] [handouts]
Learning BN [slides] [handouts]
Bayesian Network lab [slides] [data] [software]
Linear discriminant functions [slides] [handouts]
Support Vector Machines [slides] [handouts]
Non-linear Support Vector Machines [slides] [handouts]
Kernel Machines [slides] [handouts]
Scikit-learn lab [slides] [material]
Clustering [slides] [handouts]
Reinforcement learning [slides] [handouts]
Videos: Recorded lectures are made available on Moodle (on a weekly basis)
Additional material: Linear algebra [slides] [handouts]
Probability theory [slides] [handouts]

Exams

Modality: Oral examination.