This is an undergraduate-level introductory course in machine learning (ML). It gives a broad overview of core concepts and algorithms, ranging from supervised learning methods such as support vector machines and decision trees to unsupervised learning (clustering and factor analysis). The goal is to provide students with a solid understanding of the subject and the skills to apply these methods to real-world problems. The course is taught by Aykut Erdem; the teaching assistant is Necva Bolucu.
Lectures: Thursdays at 16:00-16:50 and Fridays at 09:00-10:50 (Room D9)
Tutorials: Fridays at 13:30-15:30 (Room D8)
Policies: All work on assignments must be done individually unless stated otherwise. You are encouraged to discuss the assignments with your classmates, but these discussions must remain at an abstract level. That is, discussing a particular solution to a specific problem (whether in actual code or in pseudocode) will not be tolerated.
In short, turning in someone else’s work, in whole or in part, as your own will be considered a violation of academic integrity. Note that this also applies to material found on the web, since everything on the web was written by someone else.
The course webpage will be updated regularly throughout the semester with lecture notes, presentations, assignments, and important deadlines. All other course-related communication will be carried out through Piazza. Please enroll by following the link https://piazza.com/hacettepe.edu.tr/fall2018/bbm406.
BBM406 is open to third- and fourth-year undergraduate and M.Sc. students. Non-CENG graduate students should ask the course instructor for approval before the add/drop period. The prerequisites for this course are:
Grading for BBM406 will be based on
Grading for BBM409 will be based on
| Date | Topic | Notes |
| --- | --- | --- |
Oct 11 | Course outline and logistics, An overview of Machine Learning [slides] | Reading: The Discipline of Machine Learning, Tom Mitchell Video 1: The Master Algorithm, Pedro Domingos Video 2: The Thinking Machine |
Oct 12 | Machine Learning by Examples, Nearest Neighbor Classifier [slides] | Reading: Barber 1,14.1-14.2 Demo: k-Nearest Neighbors Tutorial: Python/numpy |
Oct 18 | Kernel Regression, Distance Functions, Curse of Dimensionality [slides] | Reading: Bishop 1.4, 2.5 |
Oct 19 | Linear Regression, Generalization, Model Complexity, Regularization | Assg1 out Reading: Bishop 1.1, 3.1, Stanford CS229 note Demo: Linear regression |
Oct 25 | Machine Learning Methodology | Reading: P. Domingos, A few useful things to know about machine learning |
Oct 26 | Learning Theory, Basic Probability Review | Reading: Daume III 12, Barber 1.1-1.4, CIS 520 note E. Simoncelli, A Geometric Review of Linear Algebra Video: Probability Primer Demo: Seeing Theory: A visual introduction to probability and statistics |
Nov 1 | Statistical Estimation: MLE and MAP | Reading: Murphy 2.1-2.3.2 Video: Daphne Koller, Probabilistic Graphical Models, MLE Lecture, MAP Lecture |
Nov 2 | Statistical Estimation: MLE and MAP (cont'd.), Naïve Bayes Classifier | Assg1 due Reading: Daume III 7, Naïve Bayes, Tom M. Mitchell Optional Reading: Learning to Decode Cognitive States from Brain Images, Tom M. Mitchell et al. Demo: Bayes Theorem |
Nov 8 | Logistic Regression, Discriminant vs. Generative Classification | Reading: Barber 17.4, Bishop 4.1.1-4.1.2 Optional Reading: On Discriminative vs. Generative classifiers: A comparison of logistic regression and naive Bayes, Andrew Y. Ng, Michael I. Jordan |
Nov 9 | Linear Discriminant Functions, Perceptron | Assg2 out Reading: Bishop 4.5, Daume III 3 |
Nov 15 | Multi-layer Perceptron | |
Nov 16 | Training Neural Networks: Computational Graph, Back-propagation | Course project proposal due Reading: CS 231 Backpropagation notes Demo: A Neural Network Playground |
Nov 22 | Introduction to Deep Learning | Reading: Deep Learning, Yann LeCun, Yoshua Bengio, Geoffrey Hinton |
Nov 23 | Deep Convolutional Networks | Assg2 due Reading: Conv Nets: A Modular Perspective, Understanding Convolutions, Christopher Olah |
Nov 29 | Midterm review | |
Nov 30 | Midterm exam | Assg3 out |
Dec 6 | Support Vector Machines (SVMs) | Reading: Alpaydin 13.1-13.2 Video: Patrick Winston, Support Vector Machines Demo: Andrea Vedaldi's SVM MATLAB demo |
Dec 7 | Soft margin SVM, Multi-class SVM | Reading: Alpaydin 13.3, 13.9, M.A. Hearst, Support Vector Machines, CS229 Notes 3.7 Demo: Multi-class SVM demo |
Dec 13 | Kernels, Kernel Trick for SVMs, Support Vector Regression | Reading: Alpaydin 13.5-13.7, 13.10 |
Dec 14 | Decision Tree Learning | Assg3 due Reading: Mitchell 3, Bishop 14.4 Demo: A Visual Introduction to Machine Learning |
Dec 20 | Ensemble Methods: Bagging, Random Forests | Reading: Bishop 14.1-14.2, Understanding the Bias-Variance Tradeoff, Scott Fortmann-Roe, Random Forests, Leo Breiman and Adele Cutler Optional Reading: Real-Time Human Pose Recognition in Parts from Single Depth Images, Jamie Shotton et al. Demo: Bootstrapping |
Dec 21 | Ensemble Methods: Boosting | Project progress reports due Reading: Bishop 14.3, Optional Reading: Rapid Object Detection using a Boosted Cascade of Simple Features, Paul Viola and Michael Jones Video: A Boosting Tutorial, Robert Schapire |
Dec 27 | Clustering: K-Means | Reading: Bishop 9.1 Cluster Analysis: Basic Concepts and Algorithms, Pang-Ning Tan, Michael Steinbach and Vipin Kumar Demo: Visualizing K-Means equilibria |
Dec 28 | Clustering: Spectral Clustering, Agglomerative Clustering | |
Jan 3 | Dimensionality Reduction: Principal Component Analysis, Singular Value Decomposition | Reading: Barber 15.1-15.3, 15.7, Stanford CS229 note Video: PCA, Andrew Ng Demo: Principal Component Analysis Explained Visually |
Jan 4 | Dimensionality Reduction: Autoencoders, Independent Component Analysis | |
Jan 10 | Project presentations | |
Jan 11 | Project presentations (cont'd.), Course wrap-up | Final project reports due |
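As a taste of the course's first hands-on topic (the nearest neighbor classifier from the Oct 12 lecture and its Python/numpy tutorial), here is a minimal k-NN sketch in numpy. The function name and toy data are illustrative only, not part of the course material:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test point by majority vote among its k nearest
    training points, using Euclidean distance."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)   # distance to every training point
        nearest = np.argsort(dists)[:k]               # indices of the k closest points
        votes = y_train[nearest]
        preds.append(np.bincount(votes).argmax())     # majority class label
    return np.array(preds)

# Toy data: two well-separated clusters with labels 0 and 1.
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                    [3.0, 3.0], [3.1, 2.9], [2.9, 3.1]])
y_train = np.array([0, 0, 0, 1, 1, 1])
X_test = np.array([[0.05, 0.1], [3.05, 3.0]])

print(knn_predict(X_train, y_train, X_test, k=3))  # → [0 1]
```

With k=1 this reduces to the plain nearest neighbor classifier; larger k smooths the decision boundary at the cost of blurring small clusters.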