Friday, May 14, 2010

Scaling data attributes before using SVM!

2.2 Scaling
Scaling before applying SVM is very important. Part 2 of Sarle's Neural Networks
FAQ Sarle (1997) explains the importance of this, and most of the considerations also
apply to SVM. The main advantage of scaling is to avoid attributes in greater numeric
ranges dominating those in smaller numeric ranges. Another advantage is to avoid
numerical difficulties during the calculation. Because kernel values usually depend on
the inner products of feature vectors, e.g. the linear kernel and the polynomial kernel,
large attribute values might cause numerical problems. We recommend linearly
scaling each attribute to the range [-1, +1] or [0, 1].
Of course we have to use the same method to scale both training and testing
data. For example, suppose that we scaled the first attribute of training data from
[-10, +10] to [-1, +1]. If the first attribute of testing data lies in the range [-11, +8],
we must scale the testing data to [-1.1, +0.8]. See Appendix B for some real examples.
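The point about reusing the training ranges can be sketched in a few lines of Python. This is not the guide's svm-scale tool, just a minimal illustration, with helper names (`fit_scaler`, `apply_scaler`) made up for the example: the minimum and maximum of each attribute are computed from the training data only, and the same linear map is then applied to the test data, so test values outside the training range fall outside [-1, +1].

```python
import numpy as np

def fit_scaler(X_train, lo=-1.0, hi=1.0):
    """Record per-attribute min/max from the TRAINING data only."""
    mins = X_train.min(axis=0)
    maxs = X_train.max(axis=0)
    return mins, maxs, lo, hi

def apply_scaler(X, scaler):
    """Apply the training-derived linear map to any data (train or test)."""
    mins, maxs, lo, hi = scaler
    span = np.where(maxs > mins, maxs - mins, 1.0)  # guard constant attributes
    return lo + (X - mins) * (hi - lo) / span

# Training attribute spans [-10, +10] and is mapped to [-1, +1];
# test values in [-11, +8] then land in [-1.1, +0.8], as in the text.
X_train = np.array([[-10.0], [10.0]])
scaler = fit_scaler(X_train)
X_test = np.array([[-11.0], [8.0]])
print(apply_scaler(X_test, scaler))  # [[-1.1], [0.8]]
```

Fitting the scaler on the combined train and test data instead would leak information and silently shift the mapping, which is exactly the mistake this paragraph warns against.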
