- Series
- SIAM Student Seminar
- Time
- Friday, March 6, 2009 - 12:30pm for 2 hours
- Location
- Skiles 269
- Speaker
- Kai Ni – School of Mathematics, Georgia Tech
- Organizer
- Linwei Xin
In this talk, I will briefly introduce some basics of mathematical learning theory. Two basic methods, the perceptron algorithm and the support vector machine, will be explained for the separable classification case. Sub-Gaussian random variables and Hoeffding's inequality will also be mentioned in order to provide an upper bound on the deviation of the empirical risk. If time permits, Vapnik–Chervonenkis combinatorics will be used to obtain sharper bounds on this deviation.
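As a rough illustration of the first of these methods, below is a minimal sketch of the perceptron algorithm in the separable case; the toy data, function name, and parameters are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def perceptron(X, y, max_epochs=100):
    """Return a weight vector w with sign(w . x_i) == y_i for all i,
    assuming the labeled points (X, y in {-1, +1}) are linearly separable."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) <= 0:   # point misclassified (or on the boundary)
                w += yi * xi              # update rule: move w toward yi * xi
                mistakes += 1
        if mistakes == 0:                 # every point classified correctly: stop
            break
    return w

# Toy separable data; a constant 1 is appended to each point so the bias is part of w.
X = np.array([[1.0, 2.0, 1.0], [2.0, 3.0, 1.0], [-1.0, -1.5, 1.0], [-2.0, -1.0, 1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
print(w, np.sign(X @ w))   # predicted signs agree with y
```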
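For reference, the form of Hoeffding's inequality commonly used to bound this deviation, assuming a loss taking values in $[0,1]$, $n$ i.i.d. samples, risk $R(f)$, and empirical risk $\hat{R}_n(f)$, is:

```latex
\[
  \Pr\bigl( \lvert \hat{R}_n(f) - R(f) \rvert \ge \varepsilon \bigr)
  \;\le\; 2\exp\!\bigl(-2 n \varepsilon^2\bigr).
\]
```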