# SVM with linear features

This post derives a result from the 2010 PAMI paper by Ashraf, Lucey and Chen, *Reinterpreting the Application of Gabor Filters as a Manipulation of the Margin in Linear Support Vector Machines*. Thanks to Mark Cox for discussion.

The support vector machine optimisation problem is

$$\min_{w} \; \tfrac{1}{2} \|w\|^2 + C \sum_i \max(0,\, 1 - y_i w^\top \phi(x_i)).$$

Here we consider the case where the feature transform is linear, $\phi(x) = A x$, such that $w^\top \phi(x_i) = (A^\top w)^\top x_i$. This can also be expressed as the constrained optimisation problem

$$\min_{w,\, v} \; \tfrac{1}{2} \|w\|^2 + C \sum_i \max(0,\, 1 - y_i v^\top x_i) \quad \text{subject to} \quad v = A^\top w.$$
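As a quick numerical sanity check of this substitution (a sketch in numpy; the random matrix $A$ and the vectors are arbitrary stand-ins), the score computed in feature space agrees with $v = A^\top w$ acting directly on $x$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 8, 5
A = rng.standard_normal((m, n))   # linear feature transform: phi(x) = A @ x
w = rng.standard_normal(m)        # weight vector in feature space
x = rng.standard_normal(n)        # an example

v = A.T @ w                       # the substituted variable
lhs = w @ (A @ x)                 # score via the feature transform
rhs = v @ x                       # score acting directly on x
print(np.allclose(lhs, rhs))      # True
```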

We now look to remove $w$ from this problem. There are a few different cases depending on the shape and rank of the matrix $A$.

### Higher dimension, full rank

Let’s assume that $A$ is $m \times n$ where $m > n$ and the rank of $A$ is $n$ (the maximum possible). This is where $\phi$ is increasing the dimension of the examples without losing any information. The constraint implies that

$$v = A^\top w \in \operatorname{range}(A^\top) = \mathbb{R}^n,$$

which importantly means that $v$ alone does not need to be constrained. However, for a given $v$ there will be many $w$ that satisfy the constraint.

To resolve the ambiguity of $w$, we examine the regularisation term. First, let’s decompose $A = U S V^\top$ into its “thin” SVD, where $U$ is $m \times n$, $V$ is $n \times n$ and $S$ is $n \times n$ and diagonal. We have $U^\top U = I$ and $V^\top V = V V^\top = I$. Then

$$A^\top = V S U^\top, \qquad A^\top A = V S^2 V^\top.$$

All solutions for $w$ can be parametrised

$$w = U S^{-1} V^\top v + B z$$

for arbitrary $z \in \mathbb{R}^{m-n}$, where $B$ is $m \times (m-n)$ and the basis for $\operatorname{null}(A^\top)$. It satisfies $B^\top B = I$ and $U^\top B = 0$. Note that $U S^{-1} V^\top = (A^\top)^{+}$ is the pseudo-inverse.
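This parametrisation can be checked numerically. A sketch in numpy, with an arbitrary random tall matrix: the constraint $A^\top w = v$ holds for any $z$, and the leading factor is indeed the pseudo-inverse of $A^\top$:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 8, 5                        # tall A, full rank with probability 1
A = rng.standard_normal((m, n))

U, s, Vt = np.linalg.svd(A, full_matrices=True)
Un = U[:, :n]                      # "thin" left factor, m x n
B = U[:, n:]                       # orthonormal basis for null(A.T), m x (m - n)

v = rng.standard_normal(n)
z = rng.standard_normal(m - n)
w = Un @ ((Vt @ v) / s) + B @ z    # w = (A.T)^+ v + B z

print(np.allclose(A.T @ w, v))     # True: constraint holds regardless of z
print(np.allclose(Un @ np.diag(1 / s) @ Vt,
                  np.linalg.pinv(A.T)))  # True: U S^{-1} V' is pinv(A.T)
```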

Introducing this result into the regularisation term, we see

$$\|w\|^2 = v^\top V S^{-2} V^\top v + \|z\|^2 = v^\top (A^\top A)^{-1} v + \|z\|^2,$$

and then the objective becomes separable

$$\min_{v,\, z} \; \tfrac{1}{2} v^\top (A^\top A)^{-1} v + \tfrac{1}{2} \|z\|^2 + C \sum_i \max(0,\, 1 - y_i v^\top x_i),$$

whose optimal $z$ is simply zero.

Thus the solution is the same as the original SVM, except the margin is measured in a weighted space, with $\|w\|^2$ replaced by $v^\top (A^\top A)^{-1} v$.
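The separation of the regulariser into a $v$ term and a $z$ term can also be verified numerically (again a sketch with arbitrary random data):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 8, 5
A = rng.standard_normal((m, n))

U, _, _ = np.linalg.svd(A, full_matrices=True)
B = U[:, n:]                            # orthonormal basis for null(A.T)

v = rng.standard_normal(n)
z = rng.standard_normal(m - n)
w = np.linalg.pinv(A.T) @ v + B @ z     # general solution of A.T w = v

lhs = w @ w                             # ||w||^2
rhs = v @ np.linalg.solve(A.T @ A, v) + z @ z  # v'(A'A)^{-1}v + ||z||^2
print(np.allclose(lhs, rhs))            # True: regulariser separates in v and z
```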

### Lower dimension, full rank

When $m < n$, things are not so simple. The constraint will restrict

$$v = A^\top w \in \operatorname{range}(A^\top),$$

an $m$-dimensional subspace of $\mathbb{R}^n$.

It is still possible to make the substitution $w = (A^\top)^{+} v$; however, it is no longer the case that $A^\top A$ is invertible, and instead we must use the norm weighted by the pseudo-inverse in the constrained optimisation problem

$$\min_{v} \; \tfrac{1}{2} v^\top (A^\top A)^{+} v + C \sum_i \max(0,\, 1 - y_i v^\top x_i) \quad \text{subject to} \quad v \in \operatorname{range}(A^\top).$$

This might be considered a weighted margin where some of the eigenvalues are infinite, confining the solution to a subspace.
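A numpy sketch of this case (arbitrary random wide matrix): any feasible $v$ determines $w$ uniquely through the pseudo-inverse, and $\|w\|^2$ equals the $(A^\top A)^{+}$-weighted norm of $v$:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 5, 8                         # wide A: m < n, full rank m
A = rng.standard_normal((m, n))

w = rng.standard_normal(m)
v = A.T @ w                         # v is confined to range(A.T)

w_rec = np.linalg.pinv(A.T) @ v     # the unique w for this v
print(np.allclose(w_rec, w))        # True

lhs = w @ w                         # ||w||^2
rhs = v @ np.linalg.pinv(A.T @ A) @ v   # norm weighted by the pseudo-inverse
print(np.allclose(lhs, rhs))        # True
```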