SDS 408: Machine Learning

Course Title

Machine Learning

Course Code

SDS 408

Course Type

Mandatory

Level

Master’s

Year / Semester

1st Semester

Instructor’s Name

Assoc. Prof. Mihalis Nicolaou

ECTS

10

Lectures / week

2

Laboratories / week

1

Course Purpose and Objectives

The aim of this course is to provide a broad introduction to both theoretical and practical concepts in machine learning and pattern recognition. Topics include fundamental machine learning concepts and algorithms, such as supervised learning (parametric and non-parametric algorithms, classification and regression, discriminative and generative learning), unsupervised learning (clustering, dimensionality reduction, data imputation), and learning theory (bias-variance tradeoff, curse of dimensionality). The course also includes practical advice for designing machine learning systems, as well as an overview of modern applications of machine learning.

Learning Outcomes

  • Demonstrate an advanced critical understanding of fundamental concepts in machine learning
  • Explain the properties of a broad set of basic machine learning algorithms
  • Select, implement, and apply the appropriate algorithms for given tasks and datasets
  • Choose the appropriate feature representations for specific types of data, and the appropriate representation learning technique to perform feature extraction
  • Rigorously evaluate the performance of machine learning algorithms on target datasets
  • Use the Python programming language in machine learning applications.

Prerequisites

 

Requirements

SDS 403

Course Content

W1: Introduction to Machine Learning. Review of basic concepts and basic mathematics (e.g., linear algebra, probability theory), types of machine learning, real-world examples.

W2: Learning theory.  Nearest-neighbors, curse of dimensionality, bias-variance tradeoff, overfitting, model capacity, Bayes optimal classifier.  Supervised, Unsupervised, and Reinforcement Learning.
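
The nearest-neighbor classifier covered this week can be sketched in a few lines of NumPy; the toy two-cluster data below is purely illustrative, not course material:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)       # Euclidean distances
    nearest = np.argsort(dists)[:k]                   # indices of k closest
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]                  # majority label

# Two well-separated 2-D clusters
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [3.0, 3.0], [3.1, 2.9], [2.9, 3.2]])
y = np.array([0, 0, 0, 1, 1, 1])

pred = knn_predict(X, y, np.array([2.8, 3.1]), k=3)
```

Because the method stores the entire training set and distances lose meaning in high dimensions, it also motivates the curse-of-dimensionality discussion this week.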

W3-W4: Regression (Supervised Learning). Linear regression and gradient descent for convex optimization.  Non-linear/polynomial regression. Least-squares formulation and normal equations.  Evaluation of machine learning algorithms (confusion matrix, metrics, recall/precision).
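
The least-squares formulation has the closed-form normal-equations solution θ = (XᵀX)⁻¹Xᵀy; a minimal NumPy sketch on synthetic data (all values here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = 2.0 * X[:, 0] + 1.0 + rng.normal(0, 0.1, size=50)  # ground truth: y ≈ 2x + 1

# Design matrix with a bias column, then solve the normal equations
A = np.column_stack([np.ones(len(X)), X])
theta = np.linalg.solve(A.T @ A, A.T @ y)  # [intercept, slope]
```

In practice `np.linalg.solve` is preferred over explicitly inverting XᵀX; gradient descent, covered in the same weeks, becomes attractive when the dataset is too large for the closed form.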

W5-W6: Classification (Supervised Learning). Logistic regression for classification (binary, multi-class). Convex cost function and gradient descent. Linear and non-linear decision boundaries. Softmax regression for multi-class classification. Other classification approaches (e.g., SVM).
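
Logistic regression with batch gradient descent on the convex cross-entropy cost can be sketched as follows (toy one-dimensional data, illustrative only):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy binary problem: label 1 when x > 0
X = np.array([[-2.0], [-1.0], [-0.5], [0.5], [1.0], [2.0]])
y = np.array([0, 0, 0, 1, 1, 1])
A = np.column_stack([np.ones(len(X)), X])   # bias column + feature

theta = np.zeros(2)
lr = 0.5
for _ in range(2000):                        # batch gradient descent
    grad = A.T @ (sigmoid(A @ theta) - y) / len(y)  # cross-entropy gradient
    theta -= lr * grad

probs = sigmoid(A @ theta)                   # predicted P(y = 1 | x)
```

The decision boundary is the point where `sigmoid(A @ theta)` crosses 0.5, i.e. where the linear score is zero.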

W7: Regularization and Model Selection. L1 (Lasso) and L2 (Ridge) regularization, probabilistic view with Laplace and Gaussian priors. Linear optimization with constraints. Regularization in Neural Networks.
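
L2 (Ridge) regularization adds λ‖θ‖² to the least-squares cost, giving the closed form θ = (XᵀX + λI)⁻¹Xᵀy and shrinking the solution towards zero; a small NumPy sketch (the data and λ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 5))
y = X @ np.array([1.0, 0.0, 0.0, 2.0, 0.0]) + rng.normal(0, 0.1, size=30)

lam = 10.0  # regularization strength λ
# Ridge closed form vs. the ordinary least-squares solution
theta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)
theta_ols = np.linalg.solve(X.T @ X, X.T @ y)
```

The ridge solution always has a smaller norm than the unregularized one; λ itself is chosen by model selection, e.g. cross-validation.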

W8: Unsupervised Learning.  Dimensionality Reduction (e.g., Principal Component Analysis), Clustering, Gaussian Mixtures.
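
Principal Component Analysis reduces to a singular value decomposition of the centred data matrix; a short illustrative NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
# 2-D data stretched along the x-axis, so the first principal
# component should be approximately (±1, 0)
X = rng.normal(size=(200, 2)) * np.array([5.0, 0.5])

Xc = X - X.mean(axis=0)              # centre the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Vt[0]                          # direction of maximum variance
Z = Xc @ Vt[:1].T                    # project onto the first component
```

The singular values `S` measure how much variance each component captures, which is the usual criterion for choosing how many components to keep.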

W9: Probabilistic Learning and Statistical Estimation. Maximum Likelihood, Maximum-a-Posteriori, Naïve Bayes.
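
For a Gaussian, the maximum-likelihood estimates are simply the sample mean and the (biased, 1/N) sample variance; this illustrative sketch checks that numerically on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic data from N(mu=4, sigma^2=4)
data = rng.normal(loc=4.0, scale=2.0, size=5000)

# MLE for a Gaussian: mu_hat = sample mean,
# sigma2_hat = (1/N) * sum((x - mu_hat)^2)
mu_hat = data.mean()
sigma2_hat = ((data - mu_hat) ** 2).mean()
```

Maximum-a-posteriori estimation modifies this picture by adding a prior over the parameters, which connects back to the regularization view from W7.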

W10: Introduction to Neural Networks. Backpropagation and gradient descent for optimization.  Activation functions, properties, derivatives.
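
Backpropagation is the chain rule applied layer by layer; the sketch below trains a tiny one-hidden-layer network on XOR with full-batch gradient descent (architecture and hyperparameters are illustrative, not course material):

```python
import numpy as np

rng = np.random.default_rng(4)
# XOR is not linearly separable, so it needs at least one hidden layer
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1.0, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = np.tanh(X @ W1 + b1)         # hidden layer, tanh activation
    return h, sigmoid(h @ W2 + b2)   # output probability

def bce(p, y):                       # binary cross-entropy loss
    return float(-(y * np.log(p) + (1 - y) * np.log(1 - p)).mean())

_, p0 = forward(X)
loss_init = bce(p0, y)

lr = 0.5
for _ in range(10000):               # full-batch gradient descent
    h, p = forward(X)
    dz2 = (p - y) / len(X)           # dL/d(output pre-activation)
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dz1 = (dz2 @ W2.T) * (1 - h**2)  # chain rule through tanh
    dW1 = X.T @ dz1; db1 = dz1.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

_, p = forward(X)
loss_final = bce(p, y)
```

The backward pass simply reuses the quantities computed in the forward pass, which is why backpropagation is efficient.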

W11-12: Introduction to Deep Learning.  Commonly used layers and architectures (e.g., convolutional, attention-based).  Practicals on developing and deploying deep learning models.
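
The convolutional layers mentioned here are built on a simple sliding-window operation; below is a naive, purely illustrative NumPy version of "valid" 2-D cross-correlation (practical models use a deep learning framework instead):

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 'valid' 2-D cross-correlation, the core op of a conv layer."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Dot product of the kernel with the window under it
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

# A horizontal-difference kernel responds to the vertical edge in a step image
img = np.zeros((5, 5)); img[:, 2:] = 1.0
k = np.array([[1.0, -1.0]])
edges = conv2d_valid(img, k)
```

In a real convolutional layer the kernel weights are learned by backpropagation rather than hand-designed as here.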

W13: Advanced topics in Machine Learning (e.g., HPC and ML, state-of-the-art applications and research developments).

W14: Course Synopsis and Revision.

Teaching Methodology

Lectures, Labs

Bibliography

  • C. Bishop, “Pattern Recognition and Machine Learning”, ISBN: 978-0-387-31073-2
  • G. James, D. Witten, T. Hastie and R. Tibshirani, “An Introduction to Statistical Learning”, ISBN-13: 978-1461471370
  • K. Murphy, “Machine Learning: A Probabilistic Perspective”, MIT Press
  • J. Watt, R. Borhani and A. Katsaggelos, “Machine Learning Refined: Foundations, Algorithms and Applications”, Cambridge University Press

Assessment

Coursework / Exam / Project

Language

English
