SIAM Student Seminar
Friday, March 6, 2009 - 12:30pm
In this talk, I will briefly introduce some basics of mathematical learning theory. Two basic methods, the perceptron algorithm and the support vector machine, will be explained for the separable classification case. Subgaussian random variables and Hoeffding's inequality will also be discussed in order to provide an upper bound on the deviation of the empirical risk. If time permits, Vapnik-Chervonenkis combinatorics will be used to obtain sharper bounds on this deviation.
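As background for the topics named above (not part of the talk itself), Hoeffding's inequality in its standard form reads: for i.i.d. random variables Z_1, ..., Z_n taking values in [0,1] with mean mu,

\[
\Pr\!\left( \left| \frac{1}{n}\sum_{i=1}^{n} Z_i - \mu \right| \ge \varepsilon \right) \le 2 \exp\!\left(-2 n \varepsilon^{2}\right),
\]

which, applied to the 0-1 losses of a fixed classifier, controls how far the empirical risk can drift from the true risk.

Likewise, a minimal sketch of the perceptron algorithm for the linearly separable case may help fix ideas; the data, function name, and parameters below are illustrative assumptions, not material from the talk.

    # Illustrative sketch: classical perceptron updates for linearly
    # separable data with labels in {-1, +1}. Data here are hypothetical.
    import numpy as np

    def perceptron(X, y, max_epochs=100):
        """Learn a separating hyperplane w.x + b = 0."""
        n, d = X.shape
        w, b = np.zeros(d), 0.0
        for _ in range(max_epochs):
            mistakes = 0
            for i in range(n):
                # A point is misclassified when y_i (w.x_i + b) <= 0.
                if y[i] * (X[i] @ w + b) <= 0:
                    w += y[i] * X[i]   # rotate the hyperplane toward the point
                    b += y[i]
                    mistakes += 1
            if mistakes == 0:          # converged: every point classified correctly
                break
        return w, b

    # Toy separable example (made-up data).
    X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
    y = np.array([1, 1, -1, -1])
    print(perceptron(X, y))

For separable data the loop above stops after finitely many updates; the classical convergence bound depends only on the margin and the norm of the data points.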