Machine Learning
CS 697AB
Fall 2017

Instructor : Kaushik Sinha
Class Schedule : TR 8:00-9:15 (JB 226)
Office Hours : TR 9:45-10:45 (JB 243) or by appointment.
Email:
Grader : Ali Behfarnia
Grader's office hours: Thursday 5:00-6:00, WH 309
Grader's email: axbehfarnia@shockers.wichita.edu


Course Description
The goal of the field of machine learning is to build computer systems that learn from experience and are capable of adapting to their environments. Learning techniques and methods developed by researchers in this field have been successfully applied to a variety of learning tasks in a broad range of areas, including, for example, text classification, recommender systems, gene discovery, financial forecasting, credit card fraud detection, speech recognition, and bioinformatics.

This introductory machine learning course will give an overview of many models and algorithms used in modern machine learning, covering both supervised and unsupervised learning. Supervised learning will deal with classification and regression problems and will include topics such as: k-nearest neighbors, decision trees, linear regression and ridge regression, linear discriminant analysis, perceptron, support vector machines, kernel methods, the naive Bayes classifier, logistic regression, boosting, and bagging. Unsupervised learning will include topics such as: k-means clustering, spectral clustering, hierarchical clustering, mixtures of Gaussians and the EM algorithm, and linear and non-linear dimension reduction techniques. The course will also cover topics such as the bias-variance trade-off, model selection, generative vs. discriminative models, and parametric vs. non-parametric learning.

The course will give students the basic ideas and intuition behind these methods, as well as a more formal understanding of how and why they work. Students will have an opportunity to experiment with machine learning techniques and apply them to a selected problem in the context of a class project.
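As a taste of the supervised methods listed above, here is a minimal sketch of a k-nearest-neighbors classifier. The course itself uses Matlab; Python is used here purely for illustration, and the toy data and choice of k are made up:

```python
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    # Squared Euclidean distance from x to every training point
    dists = [sum((a - b) ** 2 for a, b in zip(p, x)) for p in train_X]
    # Indices of the k closest training points
    nearest = sorted(range(len(train_X)), key=lambda i: dists[i])[:k]
    # Majority label among those neighbors
    return Counter(train_y[i] for i in nearest).most_common(1)[0][0]

# Toy 2-D data: class 0 near the origin, class 1 near (5, 5)
X = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
y = [0, 0, 0, 1, 1, 1]
print(knn_predict(X, y, (0.5, 0.5)))  # -> 0
print(knn_predict(X, y, (5.5, 5.5)))  # -> 1
```

Note that k-NN has no training phase at all; the "model" is just the stored training set, which is why it is covered under nonparametric methods later in the schedule.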


Prerequisite
CS 560, MATH 511, and either STAT 460 or IME 254, each with a grade of C or better.

Matlab
The course will make extensive use of Matlab. An excellent guide to Matlab can be found in the MATLAB Tutorial.
Matlab is available on all the Linux lab computers. There is an icon for Matlab under "Applications/Programming". If you can't find the icon, you can open a terminal and type "matlab".

Matlab is installed on all the student Linux servers, e.g., kira.cs.wichita.edu. You can remotely connect to these machines using SSH with X11 forwarding. Please see the instructions below.
Instructions for logging into EECS Linux server.

Books
Introduction to Machine Learning (3rd edition) by Ethem Alpaydin, The MIT Press.

Data for Homework assignments
Digit data (HW1)
Spam training, Spam testing, Training labels, Test labels (HW2)
Housing data (HW 3)
Digit training, Digit testing (HW 4)
README_LIBSVM (HW 5)
Project
Final project Handout
Cluster comparison methods (Please read section 5.)
Dataset (To download, right click and use 'save link as' option)
NMI (This is a Matlab function (nmi.m) to compute NMI. To download, right click and use 'save link as' option)
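The provided nmi.m computes normalized mutual information in Matlab. For intuition, the sketch below shows one common NMI convention in Python: mutual information I(A;B) normalized by the average of the two cluster-label entropies. The exact normalization used by nmi.m may differ, so treat this only as an illustration of the quantity:

```python
from collections import Counter
from math import log

def nmi(labels_a, labels_b):
    """Normalized mutual information between two clusterings of the
    same n points, using I(A;B) / ((H(A) + H(B)) / 2)."""
    n = len(labels_a)
    pa = Counter(labels_a)                 # cluster sizes in A
    pb = Counter(labels_b)                 # cluster sizes in B
    pab = Counter(zip(labels_a, labels_b)) # joint cluster-pair counts
    # Mutual information I(A;B)
    mi = sum((c / n) * log((c * n) / (pa[a] * pb[b]))
             for (a, b), c in pab.items())
    # Entropies H(A) and H(B)
    ha = -sum((c / n) * log(c / n) for c in pa.values())
    hb = -sum((c / n) * log(c / n) for c in pb.values())
    return mi / ((ha + hb) / 2) if ha and hb else 0.0

# Identical partitions (up to relabeling) score 1; independent ones score 0
print(nmi([0, 0, 1, 1], [1, 1, 0, 0]))
print(nmi([0, 1, 0, 1], [0, 0, 1, 1]))
```

Because NMI is invariant to relabeling the clusters, it is a natural way to compare a clustering output against ground-truth labels in the project.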
Grading
Unless otherwise specified, grades will be based on the following components: 6-7 homework assignments (which will include programming assignments), two midterm exams, and a project.

Item | Weight
Homework assignments (including programming assignments) | 45%
Midterm 1 | 15%
Midterm 2 | 15%
Project | 20%
Class participation | 5%
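The final score is just the weighted sum of the component scores. A small worked example with the weights above (the component scores here are hypothetical):

```python
# Weights from the grading table above
weights = {"homework": 0.45, "midterm1": 0.15, "midterm2": 0.15,
           "project": 0.20, "participation": 0.05}

# Hypothetical component scores, each out of 100
scores = {"homework": 90, "midterm1": 80, "midterm2": 85,
          "project": 95, "participation": 100}

# Weighted final score (here 89.25 out of 100)
final = sum(weights[k] * scores[k] for k in weights)
print(final)
```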

Schedule
Week | Lecture | Date   | Topic | Reading | Slides | Homework
1    | 1       | Aug 22 | Class logistics, overview of ML | -- | Overview | --
1    | 2       | Aug 24 | Introduction to ML | Chapter 1 | Chapter 1 | --
2    | 3       | Aug 29 | Supervised learning | Chapter 2 | Chapter 2 | --
2    | 4       | Aug 31 | Basic probability, linear algebra review, MATLAB demo | -- | -- | HW1
3    | 5       | Sep 05 | Bayesian decision theory, naive Bayes classifier | Chapter 3, Appendix A | Chapter 3 | --
3    | 6       | Sep 07 | Bayesian decision theory, naive Bayes classifier contd. | -- | -- | HW2
4    | 7       | Sep 12 | Bayesian classification, parametric methods | Chapter 4 | Chapter 4 | --
4    | 8       | Sep 14 | Parametric methods contd. | -- | -- | --
5    | 9       | Sep 19 | Parametric methods, multivariate methods | Chapter 5 | Chapter 5 | --
5    | 10      | Sep 21 | Parametric methods contd. | -- | -- | HW3
6    | 11      | Sep 26 | Nonparametric methods | Chapter 8 | Chapter 8 | --
6    | 12      | Sep 28 | Nonparametric methods contd., decision trees | Chapter 9.1-9.4 | Chapter 9 | HW4
7    | 13      | Oct 03 | Decision trees contd. | -- | -- | --
7    | 14      | Oct 05 | Linear discriminant methods | Chapter 10 | Chapter 10 | --
8    | 15      | Oct 10 | Review | -- | -- | --
8    | 16      | Oct 12 | Midterm I | -- | -- | --
9    | --      | Oct 17 | Fall Break, no class | -- | -- | --
9    | 17      | Oct 19 | Linear discriminant methods contd. | -- | -- | --
10   | 18      | Oct 24 | SVM | Chapter 13.2, 13.3 | SVM, Optimization basics | --
10   | 19      | Oct 26 | Kernel methods | Chapter 13.5-13.8 | Kernel methods | HW5
11   | 20      | Oct 31 | Kernel methods | -- | -- | --
11   | 21      | Nov 02 | Clustering | Chapter 7.1-7.4, 7.8, 7.9 | Chapter 7 | HW6
12   | 22      | Nov 07 | Clustering contd. | -- | -- | --
12   | 23      | Nov 09 | Dimension reduction | Chapter 6.1-6.4, 6.6, 6.8, 6.9 | Chapter 6, PCA, LDA | --
13   | 24      | Nov 14 | -- | -- | -- | --
13   | 25      | Nov 16 | Review | -- | -- | --
14   | 26      | Nov 21 | Midterm II | -- | -- | --
14   | --      | Nov 23 | Thanksgiving Holiday, no class | -- | -- | --
15   | 27      | Nov 28 | Project | -- | -- | --
15   | 28      | Nov 30 | Project | -- | -- | --
16   | 29      | Dec 05 | Project | -- | -- | --
16   | 30      | Dec 07 | -- | -- | -- | --

Linear Algebra Review