Course Information

About

This is an undergraduate-level introductory course in machine learning (ML) that gives a broad overview of many concepts and algorithms in ML, ranging from supervised learning methods such as support vector machines and decision trees to unsupervised learning (clustering and factor analysis). The goal is to provide students with a deep understanding of the subject matter and the skills to apply these concepts to real-world problems. The course is taught by Erkut Erdem; the teaching assistants are Gorkem Akyildiz and Sevginur Ince.
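For a concrete taste of the kind of supervised-learning workflow the course covers, here is a minimal, illustrative sketch in Python. It is not part of the official course material; the use of scikit-learn and the toy dataset are assumptions for illustration only, not course requirements.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Load a small benchmark dataset and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Train two classifiers of the kind discussed in the course and compare test accuracy.
for model in (KNeighborsClassifier(n_neighbors=5), DecisionTreeClassifier(max_depth=3)):
    model.fit(X_train, y_train)
    print(type(model).__name__, "test accuracy:", model.score(X_test, y_test))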


Time and Location

Lectures: Mondays at 11:40-12:30 (D2) and Tuesdays at 13:40-15:30 (D9)
Tutorials: Wednesdays at 15:40-17:30 (D10)

Reference Books

Policies

All work on assignments must be done individually unless stated otherwise. You are encouraged to discuss the assignments with your classmates, but these discussions should be kept at an abstract level; discussing a particular solution to a specific problem (whether in actual code or in pseudocode) will not be tolerated.

In short, turning in someone else’s work, in whole or in part, as your own will be considered a violation of academic integrity. Note that this also applies to material found on the web, since everything on the web was written by someone else.

Communication

The course webpage will be updated regularly throughout the semester with lecture notes, presentations, assignments, and important deadlines. All other course-related communication will be carried out through Ed. Please enroll by following the link https://edstem.org/eu/join/nSWkR5 using your departmental account.

Pre-requisites

AIN311 is a mandatory course for third-year undergraduate students enrolled in the Artificial Intelligence Engineering program. The prerequisites for this course are:

Course Requirements and Grading

Grading for AIN311 will be based on

Grading for AIN313 will be based on

Schedule

Date Topic Notes
Sep 23 Course outline and logistics, An overview of Machine Learning [slides] Reading: The Discipline of Machine Learning, Tom Mitchell
Video 1: The Master Algorithm, Pedro Domingos
Video 2: The Thinking Machine
Sep 24 Machine Learning by Examples, Nearest Neighbor Classifier [slides] Reading: Barber 1, 14.1-14.2
Demo: k-Nearest Neighbors
Sep 30 Kernel Regression, Distance Functions, Curse of Dimensionality [slides]
Oct 1 Linear Regression, Generalization, Model Complexity, Regularization [slides] Reading: Bishop 1.1, 3.1, Stanford CS229 note
Demo: Curve fitting
Oct 7 Machine Learning Methodology [slides] Reading: P. Domingos, A few useful things to know about machine learning
Oct 8 Learning Theory, Basic Probability Review [slides] Assg1 out
Reading: Daume III 12, Barber 1.1-1.4, CIS 520 note
E. Simoncelli, A Geometric Review of Linear Algebra
Video: Probability Primer
Demo: Seeing Theory: A visual introduction to probability and statistics
Oct 14 Statistical Estimation: MLE [slides] Reading: Murphy 2.1-2.3.2
Video: Daphne Koller, MLE Lecture, MAP Lecture
Oct 15 Statistical Estimation: MAP, Naïve Bayes Classifier [slides] Reading: Daume III 7, Naïve Bayes, Tom M. Mitchell
Optional Reading: Learning to Decode Cognitive States from Brain Images, Tom M. Mitchell et al.
Demo: Bayes Theorem
Oct 21 Logistic Regression, Discriminant vs. Generative Classification [slides] Reading: SLP3 5
Optional Reading: On Discriminative vs. Generative classifiers: A comparison of logistic regression and naive Bayes, Andrew Y. Ng, Michael I. Jordan
Oct 22 Linear Discriminant Functions, Perceptron [slides] Assg1 due, Assg2 out
Reading: Bishop 4.1.1-4.1.2, 4.5, Daume III 3
Oct 28 National Holiday (Republic Day)
Oct 29 National Holiday (Republic Day) Course project proposal due
Nov 4 Multi-layer Perceptron [slides] Reading: Bishop Ch. 5.1
Nov 5 Training Neural Networks: Computational Graph, Back-propagation [slides] Assg2 due
Reading: CS 231 Backpropagation notes
Demo: A Neural Network Playground
Nov 11 Introduction to Deep Learning [slides] Reading: Deep Learning, Yann LeCun, Yoshua Bengio, Geoffrey Hinton
Nov 12 Deep Convolutional Networks [slides] Reading: Conv Nets: A Modular Perspective, Understanding Convolutions, Christopher Olah
Nov 18 Midterm review
Nov 19 Midterm exam Assg3 out
Nov 25 Support Vector Machines (SVMs) [slides] Reading: Alpaydin 13.1-13.2
Video: Patrick Winston, Support Vector Machines
Nov 26 Soft margin SVM, Multi-class SVM [slides] Reading: Alpaydin 13.3, 13.9, M.A. Hearst, Support Vector Machines, CS229 Notes 3.7
Demo: Multi-class SVM demo
Dec 2 Kernels, Kernel Trick for SVMs, Support Vector Regression
Dec 3 Decision Tree Learning Assg3 due
Dec 9 Ensemble Methods: Bagging, Random Forests
Dec 10 Ensemble Methods: Boosting Project progress reports due
Dec 16 Clustering: K-Means, Spectral Clustering, Agglomerative Clustering
Dec 17 Dimensionality Reduction: PCA, SVD, ICA, Autoencoders
Dec 23 Project presentations
Dec 24 Project presentations, Course wrap-up Final project reports due

Resources

Related Conferences

  • Advances in Neural Information Processing Systems (NeurIPS)
  • International Conference on Machine Learning (ICML)
  • The Conference on Uncertainty in Artificial Intelligence (UAI)
  • International Conference on Artificial Intelligence and Statistics (AISTATS)
  • IEEE International Conference on Data Mining (ICDM)

Related Journals

  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Journal of Machine Learning Research
  • Data Mining and Knowledge Discovery
  • IEEE Transactions on Neural Networks

Python Resources

Linear Algebra

Resources for scientific writing and talks