Pattern Recognition with Support Vector Machines - First International Workshop, SVM 2002, Niagara Falls, Canada, August 10, 2002
With their introduction in 1995, Support Vector Machines (SVMs) marked the beginning of a new era in the learning from examples paradigm. Rooted in the Statistical Learning Theory developed by Vladimir Vapnik at AT&T, SVMs quickly gained attention from the pattern recognition community due to a number of theoretical and computational merits. These include, for example, the simple geometrical interpretation of the margin, uniqueness of the solution, statistical robustness of the loss function, modularity of the kernel function, and overfit control through the choice of a single regularization parameter. Like all really good and far-reaching ideas, SVMs raised a number of interesting problems for both theoreticians and practitioners. New approaches to Statistical Learning Theory are under development, and new and more efficient methods for computing SVMs with a large number of examples are being studied. Being interested in the development of trainable systems ourselves, we decided to organize an international workshop as a satellite event of the 16th International Conference on Pattern Recognition, emphasizing the practical impact and relevance of SVMs for pattern recognition.
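For readers new to the subject, the merits listed above can be illustrated by a minimal sketch of the standard soft-margin formulation (not part of the original preface; the symbols $w$, $b$, $\xi_i$, $C$, and $k$ follow common textbook notation rather than any particular paper in this volume):

\begin{align*}
\min_{w,\, b,\, \xi} \quad & \tfrac{1}{2}\|w\|^2 + C \sum_{i=1}^{n} \xi_i \\
\text{s.t.} \quad & y_i\bigl(\langle w, \phi(x_i)\rangle + b\bigr) \ge 1 - \xi_i, \quad \xi_i \ge 0, \quad i = 1,\dots,n.
\end{align*}

Here $\tfrac{2}{\|w\|}$ is the geometric margin being maximized, the slack terms $\xi_i$ correspond to the hinge loss, $C$ is the single regularization parameter controlling overfitting, and the feature map $\phi$ enters the dual problem only through the kernel $k(x_i, x_j) = \langle \phi(x_i), \phi(x_j)\rangle$, which is what makes the kernel choice modular.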
By March 2002, a total of 57 full papers had been submitted from 21 countries. To ensure the high quality of the workshop and proceedings, the program committee selected and accepted 30 of them after a thorough review process. Of these papers, 16 were presented in 4 oral sessions and 14 in a poster session. The papers span a variety of topics in pattern recognition with SVMs, from computational theories to their implementations. In addition to these excellent presentations, there were two invited papers by Sayan Mukherjee, MIT, and Yoshua Bengio, University of Montreal.