Fast Rates for Regularized Objectives
Karthik Sridharan, Nathan Srebro, Shai Shalev-Shwartz
TTI-Chicago (Poster W40)

Main result: empirical minimization of the SVM objective converges at a "fast" rate of 1/n to the infinite-data limit (the population optimum).
· No "low noise" or any other assumptions
· General result for any "generalized linear" strongly convex stochastic objective
· Concentration result analogous to the (log n)/n average online regret of [HazanKalaiKaleAgarwal06] for online strongly convex optimization
· The 1/n rate on the SVM objective is necessary for obtaining simple oracle inequalities on the expected loss of the SVM
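
To make the main claim concrete, here is a rough sketch of the setting in standard SVM notation (the symbols $\ell$, $\lambda$, $\hat{w}$, $w^\star$ and the exact constants are assumptions for illustration, not taken from the listing above; precise conditions are in the paper):

\[
F(w) \;=\; \mathbb{E}_{(x,y)}\big[\ell(\langle w, x\rangle, y)\big] + \frac{\lambda}{2}\,\|w\|^2,
\qquad
\hat{F}_n(w) \;=\; \frac{1}{n}\sum_{i=1}^{n} \ell(\langle w, x_i\rangle, y_i) + \frac{\lambda}{2}\,\|w\|^2 .
\]

With $\hat{w} = \arg\min_w \hat{F}_n(w)$ the empirical minimizer and $w^\star = \arg\min_w F(w)$ the population optimum, a fast-rate guarantee of the advertised form reads: with probability at least $1-\delta$,

\[
F(\hat{w}) - F(w^\star) \;\le\; O\!\left(\frac{\log(1/\delta)}{\lambda\, n}\right),
\]

in contrast to the standard $O(1/\sqrt{n})$ rate, with constants depending on the Lipschitz constant of the loss and the norm of the data.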