
## PEGASOS: Primal Estimated sub-GrAdient SOlver for SVM

Shai Shalev-Shwartz, Yoram Singer, Nati Srebro (The Hebrew University, Jerusalem, Israel). YASSO = Yet Another Svm SOlver.

## Support Vector Machines

SVM training is a quadratic program (QP):

min_w (1/2)||w||^2  subject to  y_i ⟨w, x_i⟩ ≥ 1 for all i.

A more natural form trades a regularization term against the empirical loss:

f(w) = (λ/2)||w||^2 + (1/m) Σ_{(x,y)∈S} max{0, 1 − y ⟨w, x⟩}.

## Outline

- Previous work
- The Pegasos algorithm
- Analysis: faster convergence rates
- Experiments: outperforms state-of-the-art
- Extensions: kernels, complex prediction problems, the bias term
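The two-term objective (regularization plus average hinge loss) can be sketched directly in code. A minimal NumPy sketch; the function names are mine, not from the paper:

```python
import numpy as np

def hinge_loss(w, X, y):
    """Average hinge loss: (1/m) * sum_i max(0, 1 - y_i <w, x_i>)."""
    margins = y * (X @ w)
    return np.mean(np.maximum(0.0, 1.0 - margins))

def primal_objective(w, X, y, lam):
    """Regularized SVM objective: (lam/2)||w||^2 + empirical hinge loss."""
    return 0.5 * lam * np.dot(w, w) + hinge_loss(w, X, y)
```

At w = 0 every margin is zero, so the hinge loss is 1 on each example and the objective equals exactly 1, which is a handy sanity check for any solver built on this objective.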


## Previous Work

- Dual-based methods:
  - Interior point methods: memory m^2, time m^3 log(log(1/ε))
  - Decomposition methods: memory m, time super-linear in m
- Online learning and stochastic gradient:
  - Memory O(1), time 1/ε^2 (linear kernel)
  - Memory 1/ε^2, time 1/ε^4 (non-linear kernel)

Typically, online learning algorithms do not converge to the optimal solution of the SVM. Better rates are known for finite-dimensional instances (Murata, Bottou).

## Pegasos

Each iteration draws a subset A_t ⊆ S, takes a subgradient step on the objective restricted to A_t, then projects onto a ball:

- A_t = S recovers the subgradient method.
- |A_t| = 1 recovers stochastic gradient.

## Run-Time of Pegasos

Choosing |A_t| = 1 and a linear kernel over R^n, the run-time required for Pegasos to find an ε-accurate solution with probability at least 1 − δ is Õ(n/(λδε)). The run-time does not depend on the number of examples; it depends on the difficulty of the problem (λ and ε).
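The iteration just described can be written out as a short sketch, assuming step size η_t = 1/(λt), a mini-batch A_t of size k, and projection onto the ball of radius 1/√λ; the function name and signature are illustrative:

```python
import numpy as np

def pegasos(X, y, lam, T, k=1, rng=None):
    """Mini-batch Pegasos sketch: at step t, sample A_t of size k, take a
    subgradient step with eta_t = 1/(lam*t), then project w back onto the
    ball of radius 1/sqrt(lam)."""
    rng = np.random.default_rng(rng)
    m, n = X.shape
    w = np.zeros(n)
    for t in range(1, T + 1):
        idx = rng.choice(m, size=k, replace=False)   # A_t, a subset of S
        Xb, yb = X[idx], y[idx]
        viol = yb * (Xb @ w) < 1.0                   # examples with nonzero hinge loss
        eta = 1.0 / (lam * t)
        grad = lam * w - (Xb[viol].T @ yb[viol]) / k  # subgradient of f restricted to A_t
        w = w - eta * grad
        norm = np.linalg.norm(w)
        radius = 1.0 / np.sqrt(lam)
        if norm > radius:                            # projection step
            w = w * (radius / norm)
    return w
```

With k = 1 this is the stochastic-gradient instantiation; with k = m (sampling everything) it becomes the deterministic subgradient method, matching the two extremes listed above.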

## Formal Properties

Definition: w is ε-accurate if f(w) ≤ min_{w′} f(w′) + ε.

Theorem 1: Pegasos finds an ε-accurate solution with probability at least 1 − δ after at most Õ(1/(λδε)) iterations.

Theorem 2: Pegasos finds log(1/δ) candidate solutions such that, with probability at least 1 − δ, at least one of them is ε-accurate after Õ(1/(λε)) iterations.

## Proof Sketch

- A second look at the update step: the subgradient step followed by the projection can be analyzed as an online convex programming (OCP) update.
- Lemma (free projection): projecting onto the ball containing the optimum does not increase the distance to the optimum, so the projection comes for free in the analysis.
- Logarithmic regret for OCP (Hazan et al. '06) bounds the averaged objective gap.
- Take expectation over the random choices of A_t.
- Since f(w_r) − f(w*) ≥ 0, Markov's inequality gives the guarantee with probability at least 1 − δ.
- Amplify the confidence by repeating the run log(1/δ) times.
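The chain from the regret bound to the high-probability statement can be sketched as follows; this is my reconstruction of the slide's dropped formulas, with c standing in for the constant (and hidden log factors) of the regret bound:

```latex
% Logarithmic regret for online convex programming (Hazan et al. '06),
% applied to the lambda-strongly-convex instantaneous objectives:
\frac{1}{T}\sum_{t=1}^{T} f(w_t) \;-\; f(w^\star) \;\le\; \frac{c \log T}{\lambda T}
% Taking expectation over the random draws of A_t preserves the bound.
% Since f(w_r) - f(w^\star) \ge 0, Markov's inequality then gives,
% with probability at least 1 - \delta:
f(w_r) \;-\; f(w^\star) \;\le\; \frac{c \log T}{\delta \, \lambda T}
```

The 1/δ factor from Markov is what Theorem 2 removes: running log(1/δ) independent copies and keeping the best amplifies the constant-probability guarantee to 1 − δ.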

## Experiments

Three datasets (provided by Joachims):

- Reuters CCAT (800k examples, 47k features)
- Physics ArXiv (62k examples, 100k features)
- Covertype (581k examples, 54 features)

Four competing algorithms:

- SVM-light (Joachims)
- SVM-Perf (Joachims '06)
- Norma (Kivinen, Smola, Williamson '02)
- Zhang '04 (stochastic gradient descent)

Source code is available online.

[Figures: training time in seconds per dataset; comparison to Norma on Physics, objective value and test error.]

[Figure: comparison to Zhang '04 on Physics, objective value.] But tuning Zhang's step-size parameter is more expensive than the learning itself.

[Figures: effect of k = |A_t| on the objective when T is fixed, and when kT is fixed.]

## I Want My Kernels!

Pegasos can seamlessly be adapted to employ non-linear kernels while working solely on the primal objective function; there is no need to switch to the dual problem. The number of support vectors is bounded by the number of iterations.

## Complex Decision Problems

Pegasos works whenever we know how to calculate subgradients of the loss function ℓ(w; (x, y)). Example: structured output prediction, where the subgradient is φ(x, y′) − φ(x, y) and y′ is the maximizer in the definition of ℓ.

## Bias Term

- Popular approach: increase the dimension of x. Con: you pay for b in the regularization term.
- Calculate subgradients with respect to both w and b. Con: the convergence rate degrades to 1/ε^2.
- Define a modified objective. Con: A_t needs to be large.
- Search for b in an outer loop. Con: each evaluation of the objective costs 1/ε^2.
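A kernelized run can be sketched by bookkeeping, for each example, how many times it was sampled with a margin violation; the implicit weight vector is then a kernel expansion over those examples, which is why the support-vector count is bounded by the number of updates. A hedged sketch assuming |A_t| = 1 and a precomputed Gram matrix K (names are illustrative):

```python
import numpy as np

def kernel_pegasos(K, y, lam, T, rng=None):
    """Kernelized Pegasos sketch with |A_t| = 1. alpha[j] counts how many
    times example j was sampled while violating the margin; the implicit
    iterate is w_t = (1/(lam*t)) * sum_j alpha[j] * y[j] * phi(x_j)."""
    rng = np.random.default_rng(rng)
    m = len(y)
    alpha = np.zeros(m)
    for t in range(1, T + 1):
        i = rng.integers(m)                              # sample one example
        margin = (y[i] / (lam * t)) * np.sum(alpha * y * K[:, i])
        if margin < 1.0:                                 # hinge loss is nonzero
            alpha[i] += 1.0
    return alpha

def decision_values(K, y, alpha, lam, T):
    """Decision values of the final iterate on the training points."""
    return (K @ (alpha * y)) / (lam * T)
```

Everything is expressed through inner products K[i, j] = ⟨φ(x_i), φ(x_j)⟩, so the dual problem is never formed, and only examples with alpha[j] > 0 (the support vectors) contribute to predictions.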

## Discussion

Pegasos is a simple and efficient solver for SVM.

Sample vs. computational complexity: sample complexity asks how many examples are needed as a function of the VC-dimension, the accuracy ε, and the confidence δ. With Pegasos, we aim to analyze computational complexity based on λ, ε, and δ (as also in Bottou & Bousquet).

Finding the argmin vs. calculating the min: it seems that Pegasos finds the argmin more easily than it can calculate the min value.

