CSI 5325: Introduction to Machine Learning, Spring 2010

Objectives

This is a course in machine learning, a broad, interesting, and fast-growing field. The central problem we address in this class is how to use computers to build models that can learn, make inferences, or improve their behavior based on observations about the world. Further, we would like to use the learned models to make predictions about unknowns.
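
As a tiny, concrete illustration of this learn-then-predict loop (a sketch only, not part of the course materials), the Python fragment below "learns" from a handful of labeled observations simply by storing them, and then predicts the label of a new point from its nearest stored example. All data and names here are invented for illustration.

    # Minimal learn-then-predict sketch: one-nearest-neighbor classification.
    # The observations and labels below are invented for illustration only.

    def predict(train, query):
        """Label `query` with the label of its nearest training example."""
        def dist2(p, q):
            return sum((a - b) ** 2 for a, b in zip(p, q))
        _, label = min(train, key=lambda example: dist2(example[0], query))
        return label

    # "Observations about the world": (feature vector, label) pairs.
    train = [((1.0, 1.0), "spam"), ((1.2, 0.8), "spam"),
             ((4.0, 5.0), "ham"),  ((4.2, 4.8), "ham")]

    # Predictions about unknowns:
    print(predict(train, (1.1, 0.9)))  # -> spam
    print(predict(train, (4.1, 5.1)))  # -> ham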

Machine learning is related to artificial intelligence, but it also draws heavily on computer science, statistics, logic, probability, information theory, geometry, linear algebra, calculus, optimization theory, and related areas. It would be good to brush up on these topics if they are rusty.

Practical information

Lectures are from 2:30 to 3:45 PM in Rogers 106 on Wednesdays and Fridays.

My office hours are listed on my home page. I am glad to talk to students during and outside of office hours. If you can't come to my office hour, please email me to make an appointment at another time.

Schedule

Here is a schedule of the material we will cover, which is subject to change (chapter numbers in the Reading column refer to Mitchell):

Week | Dates        | Topic                          | Assignment   | Reading                                           | Wednesday               | Friday
1    | Jan 11-17    | Introduction; concept learning | Assignment 1 | Ch. 1, 2                                          | Introduction            | Concept learning
2    | Jan 18-24    | Decision trees                 |              | Ch. 2, 3                                          | Concept learning        | Decision trees
3    | Jan 25-31    |                                | Assignment 2 | Ch. 3                                             | Decision trees          | Decision trees
4    | Feb 1-7      | Neural networks                |              | Ch. 4                                             | Neural networks         | Neural networks
5    | Feb 8-14     | Evaluating hypotheses          | Assignment 3 | Ch. 4, 5                                          | Neural networks         | Evaluating hypotheses
6    | Feb 15-21    | Bayesian learning              |              | Ch. 5, 6                                          | Evaluating hypotheses   | Bayesian learning
7    | Feb 22-28    |                                |              | Ch. 6                                             | Bayesian learning       | Bayesian learning
8    | Mar 1-7      | Learning theory                |              | Ch. 7                                             | Learning theory         | MIDTERM
9    | Mar 8-14     | Spring break                   |              |                                                   | Spring break            | Spring break
10   | Mar 15-21    |                                | Assignment 4 | Ch. 7                                             | Learning theory         | Learning theory
11   | Mar 22-28    | Instance-based learning        |              | Ch. 8                                             | Instance-based learning | Instance-based learning
12   | Mar 29-Apr 4 | Support vector machines        |              | Burges paper; Alpaydin chapter; Schölkopf chapter | Instance-based learning | Support vector machines
13   | Apr 5-11     |                                | Assignment 5 | Burges paper; Alpaydin chapter; Schölkopf chapter | Support vector machines | Support vector machines
14   | Apr 12-18    | Unsupervised learning          |              | Duda et al. chapter 10                            | Unsupervised learning   | Unsupervised learning
15   | Apr 19-25    |                                |              |                                                   | Boosting                | Hidden Markov models
16   | Apr 26-May 2 |                                |              | Freund and Schapire (boosting)                    | Hidden Markov models    | Project presentation

The final exam will be on Tuesday, May 11 at 11:30 AM. For the latest information, check the university's final exam schedule.

Paper reading/presentation component

As a part of the assignments, each student will present one paper on a topic we are studying. Here is the list of topics, papers, and people who have signed up.

Everyone in the class is expected to read each paper that is presented so that we can have a fruitful discussion.

Textbooks & resources

Required text: we will be using Machine Learning by Tom Mitchell.

We will also be reading from some papers in the research literature.

Optional texts (I may draw some material from these):

Further online resources:

Grading

Grades will be assigned based on this breakdown:

Here is a tentative grading scale:
A: 90-100, B+: 88-89, B: 80-87, C+: 78-79, C: 70-77, D: 60-69, F: 0-59

Some projects may be worth more than others. Exams are closed-book. The final will be comprehensive.

Policies

Academic honesty

I take academic honesty very seriously.

Many studies, including one by Sheilah Maramark and Mindi Barth Maline, have suggested that "some students cheat because of ignorance, uncertainty, or confusion regarding what behaviors constitute dishonesty" (Maramark and Maline, Issues in Education: Academic Dishonesty Among College Students, U.S. Department of Education, Office of Research, August 1993, page 5). In an effort to reduce misunderstandings in this course, a minimal list of activities that will be considered cheating is given below.


Copyright © 2010 Greg Hamerly, with some content taken from a syllabus by Jeff Donahoo.
Computer Science Department
Baylor University
