203.4770 (3770)
Semester A
Meeting Times: Monday 9–12,
Room
Instruction Hour: Wednesday 11:00–12:00, Room 410 (Jacobs)
Machine learning is concerned with the development of computer algorithms that are able to learn to solve tasks given a set of examples of those tasks and some prior knowledge about them. Machine learning has a wide spectrum of applications, including handwriting and speech recognition, image classification, medical diagnosis, stock market analysis, bioinformatics, etc. The goal of this course is to present the main concepts of modern machine learning methods, including some theoretical background.
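As a minimal illustration of this idea (not part of the course materials), here is a sketch of a nearest-neighbour classifier in plain Python: the program is given only labelled examples of the task, with no task-specific rules programmed in, and the data below are made up for illustration.

```python
# A toy nearest-neighbour classifier: it "learns" a task purely from
# labelled examples. All data here are invented for illustration.

def nearest_neighbor(train, query):
    """Return the label of the training example closest to `query`.

    `train` is a list of ((x, y), label) pairs; distance is squared Euclidean.
    """
    def sq_dist(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

    _, label = min(train, key=lambda ex: sq_dist(ex[0], query))
    return label

# Examples of the task: 2-D points labelled by the cluster they came from.
examples = [((0.0, 0.1), "A"), ((0.2, 0.0), "A"),
            ((5.0, 5.1), "B"), ((4.8, 5.0), "B")]

print(nearest_neighbor(examples, (0.1, 0.2)))  # a point near cluster A -> "A"
print(nearest_neighbor(examples, (5.0, 4.9)))  # a point near cluster B -> "B"
```

The same pattern, on a larger scale and with better models, underlies most of the methods covered in the course.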
Recommended Prerequisites
The course assumes some basic knowledge of probability theory and linear algebra.
Tutorials on the above topics are linked below.
Problems, Concepts, Methods, and Tools within the
Course
The list is partial and can be changed.
Problems
Concepts
Models and Methods
Tools
The course will furthermore use several real-life applications to illustrate the interest of statistical machine learning.
Requirements
1) Home assignments: 0–20% of the final grade (may be done
in pairs, but the pairs must be the same for all assignments).
2) Final exam: 80–100% of the final grade.
Announcements
§ NEW: You are allowed to bring your notes, limited to one double-sided A4 page,
handwritten or in font size 10 or larger.
§ NEW: Assignment 2 was distributed
via email. Due 19/02/2017. If you have not received it, send me an email.
§ Home Assignment 1 was distributed via email. Due 05/01/2017. If
you have not received it, send me an email.
Lecture Notes

31.10  Overview; Introduction to Classification; started probability tutorial (see below)

7.11   Introduction to Probability; Bayesian Decision Theory; ML and MAP classifiers
       Reading: D.H.S., "Pattern Classification", Sections 2.1–2.3

14.11  Bayesian Decision Theory; ML and MAP classifiers (cont.); Bayesian Decision tutorial

21.11  Normal variables and their discriminant functions; parametric density estimation: maximum likelihood estimation
       Reading: D.H.S., "Pattern Classification", Sections 2.4–2.6 and Sections 3.2–3.4, 3.5 (3.5.1 only)

28.11  MLE tutorial; parametric density estimation: Bayesian estimation; Naïve Bayes
       Reading: D.H.S., "Pattern Classification", Sections 3.2–3.4, 3.5 (3.5.1 only)

5.12   No class

12.12  Nonparametric density estimation: histogram, Parzen window, KNN
       Reading: D.H.S., "Pattern Classification", Sections 4.1–4.3.4, 4.4–4.5, 4.5.4, 4.5.5, 4.6

19.12  PCA, FDA, MDA
       Reading: D.H.S., "Pattern Classification", Sections 3.7–3.8

26.12  LDF, MSE
       Reading: D.H.S., "Pattern Classification", Sections 5.2–5.4, 5.5.1, 5.7, 5.8, 5.8.1, 5.8.4, 5.11

2.01   SVM; multiclass classification
       Reading: D.H.S., "Pattern Classification", Section 5.11

9.01   Decision trees; linear regression
       Lecture notes: PDF (additional)
       Reading: Hastie, Tibshirani, Friedman, "The Elements of Statistical Learning", Sections 3.1, 3.2, 3.3, 3.4.3, 3.4.5

16.01  Linear regression (cont.); Boosting
       Reading: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.56.9855

23.01  Computational learning theory: complexity, VC dimension

25.01  Clustering
Home Assignments: General Instructions
· We will have 2–3 assignments this semester.
· You should submit a PDF file of the report and your
implementation (running code) in digital form. Zip them together and send them to me by
email.
· Identical (or very similar) solutions are not allowed!
Probability tutorials:
http://www-stat.stanford.edu/~susan/courses/s116/
Linear Algebra tutorial:
MATLAB resources:
MATLAB is installed in the computer labs in the Jacobs building.
For a student license see:
http://www.haifa.ac.il/index.php/he/20151119071650
Introductory Tutorial
MATLAB tutorial
from Carnegie Mellon University
Slightly more advanced Tutorial
More complete references/tutorials/FAQs