CSI 5v93: Machine Learning, Spring 2006

Announcements

Mon Jan 9 10:58:04 CST 2006
Welcome to the course! The first assignment is posted.

Objectives

This is a course in machine learning, a subfield of artificial intelligence. Machine learning is a large, interesting, and fast-growing field. The central problem we address in this class is how to use computers to build models that can learn, or make inferences, from data. We then want to use the learned models to make predictions about unknowns.
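To make the fit-then-predict idea concrete, here is a minimal sketch (illustrative only; the data and code are not drawn from the course materials or assignments) that fits a straight line to a few made-up points by ordinary least squares and then predicts the output at a new input:

    # Minimal sketch: learn a model from data, then predict an unknown.
    # Fits y = w0 + w1*x by ordinary least squares on made-up data.
    import numpy as np

    # Observed (training) data: inputs x and noisy outputs y.
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([1.1, 2.9, 5.2, 7.1, 8.8])

    # Design matrix with a column of ones for the intercept term.
    X = np.column_stack([np.ones_like(x), x])

    # Solve the least-squares problem for the weights [w0, w1].
    w, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Use the learned model to predict the output at an unseen input.
    x_new = 5.0
    y_pred = w[0] + w[1] * x_new
    print(f"weights: {w}, prediction at x={x_new}: {y_pred:.2f}")

The same pattern, fitting a model to observed data and then predicting on unknowns, underlies the more sophisticated methods covered later in the course.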

This course covers, among other topics: linear models for regression and classification, support vector machines, Bayesian learning, and unsupervised learning (see the schedule below).

This list of topics is optimistic. Be prepared to invest the time necessary to understand the concepts and to do the programming projects. My best advice is to attend the lectures, read the book, ask questions, and start projects early.

Practical information

Lectures are from 11:00 AM to 11:50 AM in Rogers 210 on Mondays, Wednesdays, and Fridays.

My office hours are listed on my home page. I am glad to talk to students during and outside of office hours. If you can't come to my office hour, please email me to make an appointment at another time.

Schedule

Here is an aggressive schedule of the material we will cover:

Week  Dates          New topics                      Reading              Notes (Mon/Wed/Fri)
1     Jan 9-13       Introduction, linear models     Ch. 1, 2             Homework 1
2     Jan 16-20                                                           Mon: MLK holiday
3     Jan 23-27      Linear regression methods       Ch. 3
4     Jan 30-Feb 3                                                        Homework 2
5     Feb 6-10       Linear classification methods   Ch. 4
6     Feb 13-17
7     Feb 20-24      Support vector machines         Ch. 4.5, 12.1-12.3
8     Feb 27-Mar 3                                                        Homework 3
9     Mar 6-10       Bayesian learning               Mitchell Ch. 6       Midterm
-     Mar 13-17      Spring break
10    Mar 20-24
11    Mar 27-31      Unsupervised learning           Ch. 14
12    Apr 3-7
13    Apr 10-14                                                           Mon: ICPC; Wed: ICPC; Fri: Easter holiday
14    Apr 17-21      Student presentations                                Mon: Easter holiday; Wed: Josh - HMMs; Fri: Sean - SVM probabilities
15    Apr 24-28      Student presentations                                Mon: Greg - Decision trees; Wed: Noah - Boosting; Fri: Joe - Feature selection
16    May 1-5        Student presentations                                Mon: Bharat - ?; Study days

Chapter numbers refer to the required text by Hastie, Tibshirani, and Friedman, except where Mitchell is noted.

The final exam is on Tuesday, May 9th, from 9 to 11 AM. The latest university finals information is available here.

Textbooks & resources

Required text: We will use The Elements of Statistical Learning: Data Mining, Inference, and Prediction by Trevor Hastie, Robert Tibshirani, and Jerome Friedman. You can purchase this book from the Baylor bookstore or Amazon, among other places.

Optional text: Machine Learning by Tom Mitchell.

Further online resources:

Grading

Grades will be assigned based on this breakdown:

Here is a tentative grading scale:
A: 90-100, B+: 88-89, B: 80-87, C+: 78-79, C: 70-77, D: 60-69, F: 0-59

Some projects may be worth more than others. Exams are closed-book. The final will be comprehensive.

Policies

Academic honesty

I take academic honesty very seriously.

Many studies, including one by Sheilah Maramark and Mindi Barth Maline, have suggested that "some students cheat because of ignorance, uncertainty, or confusion regarding what behaviors constitute dishonesty" (Maramark and Maline, Issues in Education: Academic Dishonesty Among College Students, U.S. Department of Education, Office of Research, August 1993, page 5). In an effort to reduce misunderstandings in this course, a minimal list of activities that will be considered cheating is given below.


Copyright © 2006 Greg Hamerly, with some content taken from a syllabus by Jeff Donahoo.
Computer Science Department
Baylor University
