Saturday, December 15, 2007

The Fastest SVM

The quest for the fastest SVM learning algorithm continues.

Leon Bottou reported his surprising finding: a classic optimization method, Stochastic Gradient Descent, is amazingly fast for training linear SVMs or CRFs. His program svmsgd runs much faster than SVMperf and LIBLINEAR on very large datasets such as RCV1-v2.
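To get a feel for why SGD is so fast here, a minimal sketch of the idea: process one example at a time, taking a subgradient step on the L2-regularized hinge loss with a decreasing step size. This is an illustrative toy (the function names, dataset, and hyperparameters are made up for this sketch), not Bottou's actual svmsgd code.

```python
# Minimal sketch of SGD for a linear SVM (hinge loss + L2 regularization).
# Illustrative only: hyperparameters and names here are invented for the example.
import random

def train_sgd_svm(data, lam=0.01, epochs=50):
    """data: list of (features, label) pairs with label in {-1, +1}."""
    dim = len(data[0][0])
    w = [0.0] * dim
    t = 1
    for _ in range(epochs):
        random.shuffle(data)
        for x, y in data:
            eta = 1.0 / (lam * t)  # decreasing step size
            margin = y * sum(wi * xi for wi, xi in zip(w, x))
            # Regularization shrinks w every step; the loss term
            # y*x is added only when the margin is violated.
            w = [(1 - eta * lam) * wi for wi in w]
            if margin < 1:
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
            t += 1
    return w

def predict(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1
```

Each update touches one example, so the cost per pass is linear in the data size, which is exactly what makes the approach attractive on collections like RCV1-v2.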

Edward Chang's team has released the code for PSVM, a parallel implementation of SVM that achieves an almost linear reduction in both memory use and training time as machines are added.

2 comments:

Anonymous said...

Liblinear now has a very fast sequential dual method that is even faster than SGD. Check it out.

Anonymous said...

agree. liblinear is faster.