Course Information


This is an undergraduate-level introductory course in machine learning (ML) that gives a broad overview of many concepts and algorithms in ML, ranging from supervised learning methods such as support vector machines and decision trees to unsupervised learning (clustering and factor analysis). The goal is to provide students with a deep understanding of the subject matter and the skills to apply these concepts to real-world problems. The course is taught by Aykut Erdem. There will be two teaching assistants: Levent Karacan and Tugba Gurgen Erdogan.


Time and Location

Lectures: Mondays at 10:00-11:50 and Thursdays 11:00-11:50 (Room D9)
Tutorials: Fridays at 13:00-15:00 (Room D8)

Reference Books

Policies

All work on assignments must be done individually unless stated otherwise. You are encouraged to discuss the assignments with your classmates, but these discussions should be kept at an abstract level; that is, discussing a particular solution to a specific problem (whether in actual code or in pseudocode) will not be tolerated.

In short, turning in someone else's work, in whole or in part, as your own will be considered a violation of academic integrity. Please note that this also applies to material found on the web, as everything on the web has been written by someone else.


The course webpage will be updated regularly throughout the semester with lecture notes, presentations, assignments, and important deadlines. All other course-related communication will be carried out through Piazza. Please enroll by following the link.


BBM406 is open to third/fourth-year undergraduate and M.Sc. students. Non-CENG graduate students should ask the course instructor for approval before the add/drop period. The prerequisites for this course are:

Course Requirements and Grading

Grading for BBM406 will be based on

Grading for BBM409 will be based on


Date Topic Notes
Sep 25 Course outline and logistics, An overview of Machine Learning [slides] Reading: The Discipline of Machine Learning, Tom Mitchell
Video 1: The Master Algorithm, Pedro Domingos
Video 2: The Thinking Machine
Sep 28 Nearest Neighbor Classifier [slides] Reading: Barber 1, 14.1-14.2
Demo: k-Nearest Neighbors
Tutorial: Python/numpy
Oct 2 Kernel Regression, Distance Functions, Curse of Dimensionality, Introduction to Linear Regression [slides] Reading: Bishop 1.4, 2.5
Oct 5 Linear Regression [slides] Assg1 out
Reading: Bishop 1.1, 3.1, Stanford CS229 note
Demo: Linear regression
Tutorial: Linear algebra (notebook)
Oct 9 Generalization, Model Complexity, Regularization (cont'd), Machine Learning Methodology, Learning Theory [slides] Reading: P. Domingos, A few useful things to know about machine learning
Reading: Daume III 12
Oct 12 Basic Probability Review [slides] Reading: Barber 1.1-1.4, CIS 520 note
E. Simoncelli, A Geometric Review of Linear Algebra
Video: Probability Primer
Tutorial: Linear Regression, Cross-Validation
Demo: Seeing Theory: A visual introduction to probability and statistics
Tutorial: kNN and Linear Regression (notebook)
Oct 16 Statistical Estimation: MLE and MAP [slides] Reading: Murphy 2.1-2.3.2
Video: Daphne Koller, Probabilistic Graphical Models, MLE Lecture, MAP Lecture
Oct 19 Naïve Bayes Classifier [slides] Assg1 due, Assg2 out
Reading: Daume III 7, Naïve Bayes, Tom M. Mitchell
Optional Reading: Learning to Decode Cognitive States from Brain Images, Tom M. Mitchell et al.
Demo: Bayes Theorem
Tutorial: Naive Bayes
Oct 23 Logistic Regression, Discriminant vs. Generative Classification, Linear Discriminant Functions [slides] Reading: Barber 17.4, Bishop 4.1.1-4.1.2, 4.5
Optional Reading: On Discriminative vs. Generative classifiers: A comparison of logistic regression and naive Bayes, Andrew Y. Ng, Michael I. Jordan
Oct 26 Perceptron [slides] Reading: Daume III 3
Tutorial: Logistic Regression (notebook)
Oct 30 Multi-layer Perceptron [slides] Course project proposal due
Video: Neural Networks, Andrew Ng
Demo: A Neural Network Playground
Nov 2 Training Neural Networks: Computational Graph, Back-propagation [slides] Assg2 due
Reading: CS 231 Backpropagation notes
Tutorial: Perceptron
Nov 6 Midterm exam
Nov 9 Project discussion Assg3 out
Nov 13 Introduction to Deep Learning, Deep Convolutional Neural Networks [slides] Reading: Deep Learning, Yann LeCun, Yoshua Bengio, Geoffrey Hinton, Conv Nets: A Modular Perspective, Understanding Convolutions, Christopher Olah
Nov 16 Deep Convolutional Networks (cont'd.) [slides]
Nov 20 Support Vector Machines (SVMs), Soft margin SVM [slides] Assg3 due
Reading: Alpaydin 13.1-13.3
Video: Patrick Winston, Support Vector Machines
Demo: Andrea Vedaldi's SVM MATLAB demo
Nov 23 Multi-class SVM, Kernels [slides] Reading: Alpaydin 13.5-13.7, 13.9, M.A. Hearst, Support Vector Machines, CS229 Notes 3.7
Demo: Multi-class SVM demo
Nov 27 Kernel Trick for SVMs, Support Vector Regression [slides]
Nov 30 Decision Tree Learning [slides] Reading: Mitchell 3, Bishop 14.4
Demo: A Visual Introduction to Machine Learning
Dec 4 Ensemble Methods: Bagging, Random Forests, Boosting [slides] Project progress reports due
Reading: Bishop 14.1-14.3, Understanding the Bias-Variance Tradeoff, Scott Fortmann-Roe, Random Forests, Leo Breiman and Adele Cutler
Optional Reading: Real-Time Human Pose Recognition in Parts from Single Depth Images, Jamie Shotton et al.
Demo: Bootstrapping
Dec 7 Ensemble Methods: Boosting (cont'd.) [slides] Optional Reading: Rapid Object Detection using a Boosted Cascade of Simple Features, Paul Viola and Michael Jones
Video: A Boosting Tutorial, Robert Schapire
Dec 11 Clustering: K-Means [slides] Reading: Bishop 9.1
Cluster Analysis: Basic Concepts and Algorithms, Pang-Ning Tan, Michael Steinbach and Vipin Kumar
Demo: Visualizing K-Means equilibria
Dec 14 Clustering: Spectral Clustering, Agglomerative Clustering [slides]
Dec 18 Dimensionality Reduction: Principal Component Analysis, Singular Value Decomposition [slides] Reading: Barber 15.1-15.3, 15.7, Stanford CS229 note
Video: PCA, Andrew Ng
Demo: Principal Component Analysis Explained Visually
Dec 21 Dimensionality Reduction: Autoencoders, Independent Component Analysis [slides]
Dec 25 Project presentations
Dec 28 Course wrap-up Final project reports due


Related Conferences

  • Advances in Neural Information Processing Systems (NIPS)
  • International Conference on Machine Learning (ICML)
  • The Conference on Uncertainty in Artificial Intelligence (UAI)
  • International Conference on Artificial Intelligence and Statistics (AISTATS)
  • IEEE International Conference on Data Mining (ICDM)

Related Journals

  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Journal of Machine Learning Research
  • Data Mining and Knowledge Discovery
  • IEEE Transactions on Neural Networks

Python Resources

Linear Algebra

Resources for scientific writing and talks