ANNOUNCEMENTS:
- 4/22: Project presentations are scheduled for May 6 & 8. Check Blackboard for the list of projects: the first half will present on the 6th and the rest on the 8th.
- 4/22: HW5 is posted on Blackboard, due May 6.
- 4/10: HW4 is posted on Blackboard, due April 22.
- 3/24: HW3 is posted on Blackboard, due April 8.
- 3/4: Final exam is on Tue May 20, 11:15am - 1:45pm (150 minutes), in the classroom.
- 3/4: Midterm exam is on Mar 13, in class during regular class time.
- 3/4: HW2 is posted on Blackboard, due Mar 11.
- 2/18: Due to the class cancellation, the HW1 due date is pushed to Thur Feb 27. This affects the due dates of subsequent homeworks as well; please check under Assignments.
- 2/1: TA information has been updated.
- 2/1: You can find the course lectures posted on Blackboard.
- 1/1: Welcome to the class! Hope you will enjoy it :)
PEOPLE:
Instructor: Leman Akoglu
- Office: 1425 Computer Science
- Office hours: Tue 2:30PM - 3:30PM
- Email: invert (cs.stonybrook.edu @ leman)
Teaching Assistant: Syed Masum Billah
- Office: 1205 Computer Science
- Office hours: Thur 4:00PM - 5:00PM (Note: office hours will be held at 2110 CS)
- Email: invert (cs.stonybrook.edu @ sbillah)
CLASS MEETS:
Time: Tue & Thu 5:30PM - 6:50PM
Place: FREY HALL Rm 105
COURSE DESCRIPTION:
We are drowning in information and starving for knowledge. — John Naisbitt
Machine learning centers on automated methods that improve their own performance by learning patterns in data, and then use the uncovered patterns to predict future data and make decisions.
Examples include document, image, and handwriting classification, spam filtering, face and speech recognition, medical decision making, and robot navigation.
See this for an extended introduction.
This course covers the theory and practical algorithms for machine learning from a variety of perspectives.
The topics include Bayesian networks, decision tree learning, Support Vector Machines, statistical learning methods and unsupervised learning, as well as
theoretical concepts such as the PAC learning framework, margin-based learning, and VC dimension.
Short programming assignments include hands-on experiments with various learning algorithms, and a larger course project gives students a chance to dig into an area of their choice.
See the syllabus for more.
This course is designed to give a graduate-level student a thorough grounding in the methodologies, technologies, mathematics and algorithms currently needed by people who do research in machine learning.
RECOMMENDED TEXTBOOKS:
There is no official textbook for the course. I will post all the lecture notes and several readings on the course website.
Below is a list of recommended reading; we will mostly follow the first book.
- Christopher M. Bishop,
"Pattern Recognition and Machine Learning," Springer, 2011.
- Kevin P. Murphy,
"Machine Learning: a Probabilistic Perspective," The MIT Press, 2012. (optional)
- Tom Mitchell,
"Machine Learning," McGraw Hill, 1997. (optional)
- Ethem Alpaydin,
"Introduction to Machine Learning," The MIT Press, 2004. (optional)
- Trevor Hastie, Robert Tibshirani, Jerome Friedman,
"The Elements of Statistical Learning: Data Mining, Inference, and Prediction," Springer. FREE! (optional)
BULLETIN BOARD AND OTHER INFO:
MISC - FUN:
Fake (ML) protest