This is an undergraduate-level introductory course in machine learning (ML). It gives a broad overview of many concepts and algorithms in ML, ranging from supervised learning methods such as support vector machines and decision trees to unsupervised learning (clustering and factor analysis). The goal is to provide students with a deep understanding of the subject matter and the skills to apply these concepts to real-world problems. The course is taught by Erkut Erdem; the teaching assistant is Sibel Kapan.
Lectures: Mondays 11:30-12:30 (D4) and Wednesdays 09:40-11:30 (D1)
Tutorials: Wednesdays 15:40-17:30 (Computer Lab)
Policies: All work on assignments must be done individually unless stated otherwise. You are encouraged to discuss the assignments with your classmates, but these discussions should be kept at an abstract level. That is, discussions of a particular solution to a specific problem (whether in actual code or in pseudocode) will not be tolerated.
In short, turning in someone else's work, in whole or in part, as your own will be considered a violation of academic integrity. Please note that this also applies to material found on the web, as everything on the web has been written by someone else.
The course webpage will be updated regularly throughout the semester with lecture notes, presentations, assignments and important deadlines. All other course-related communication will be carried out through Ed. Please enroll by following the link https://edstem.org/eu/join/KXqbx8 using your departmental account.
AIN311 is a mandatory course for third-year undergraduate students enrolled in the Artificial Intelligence Engineering program. The prerequisites for this course are:
Grading for AIN311 will be based on
Grading for AIN313 will be based on
Date | Topic | Notes |
--- | --- | --- |
Oct 2 | Course outline and logistics, An overview of Machine Learning [slides] | Reading: The Discipline of Machine Learning, Tom Mitchell; Video 1: The Master Algorithm, Pedro Domingos; Video 2: The Thinking Machine; Tutorial: Python/numpy |
Oct 4 | Machine Learning by Examples, Nearest Neighbor Classifier [slides] | Reading: Barber 1, 14.1-14.2; Demo: k-Nearest Neighbors |
Oct 9 | Kernel Regression, Distance Functions, Curse of Dimensionality | |
Oct 11 | Linear Regression, Generalization, Model Complexity, Regularization | Assg1 out |
Oct 16 | Machine Learning Methodology | |
Oct 18 | Learning Theory, Basic Probability Review | |
Oct 23 | Statistical Estimation: MLE | |
Oct 25 | Statistical Estimation: MAP, Naïve Bayes Classifier | Assg1 due |
Oct 30 | Logistic Regression, Discriminant vs. Generative Classification | |
Nov 1 | Linear Discriminant Functions, Perceptron | Assg2 out |
Nov 6 | Multi-layer Perceptron | Course project proposal due |
Nov 8 | Training Neural Networks: Computational Graph, Back-propagation | |
Nov 13 | Introduction to Deep Learning | |
Nov 15 | Deep Convolutional Networks | Assg2 due |
Nov 20 | Support Vector Machines (SVMs) | |
Nov 22 | Soft margin SVM, Multi-class SVM | |
Nov 27 | Midterm review | |
Nov 29 | Midterm exam | Assg3 out |
Dec 4 | Kernels, Kernel Trick for SVMs, Support Vector Regression | |
Dec 6 | Decision Tree Learning | |
Dec 11 | Ensemble Methods: Bagging, Random Forests | |
Dec 13 | Ensemble Methods: Boosting | Assg3 due |
Dec 18 | Clustering: K-Means | Project progress reports due |
Dec 20 | Clustering: Spectral Clustering, Agglomerative Clustering | |
Dec 25 | Dimensionality Reduction: PCA, SVD | |
Dec 27 | Dimensionality Reduction: ICA, Autoencoders | |
Jan 1 | No class (New Year's Day) | |
Jan 3 | Project presentations, Course wrap-up | Final project reports due |