Solved – How do you make linear discriminant analysis reduce dimensions to the number of dimensions you are looking for

dimensionality-reduction, discriminant-analysis

Let's say I have an $m \times n$ matrix where $m$ is the number of points and $n$ is the number of dimensions. I would like to give a target dimension parameter, say $d$, where $d$ can take a value like $\{2,4,6,\ldots,n\}$. I would then approach the problem using Fisher's linear discriminant analysis to give me an output matrix of size $m \times d$. Is this possible to do? I still don't understand how LDA reduces dimensions from, say, 10,000 to 1,000.

Best Answer

LDA does dimension reduction with the same methodology as PCA. You maximize the Fisher criterion $J(W)$ (the optimal $W$ minimizes the within-class scatter $S_w$ while maximizing the between-class scatter $S_b$), which leads to the eigenproblem of the matrix

    W = inv(Sw)*Sb

Once you have W, you get the projected result with:

    V = eigenvectors of W
    final_result = V' * your_data

Before that step, the trick comes into play. Just as in PCA, you compute the eigenvectors and eigenvalues of W and discard the eigenvectors with the smallest corresponding eigenvalues. Multiplying your data by the reduced eigenvector matrix V then gives you the data in a lower dimension.
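The steps above can be sketched in NumPy; this is a minimal illustration, not a production implementation (the function name `lda_project` is my own). One caveat worth knowing: $S_b$ has rank at most $C-1$ for $C$ classes, so LDA yields at most $C-1$ meaningful directions, which is why reducing 10,000 features to 1,000 dimensions requires a correspondingly large number of classes.

```python
import numpy as np

def lda_project(X, y, d):
    """Project X (m x n) onto the top-d discriminant directions.

    X : (m, n) data matrix, one point per row
    y : (m,) class labels
    d : target dimension (at most C-1 directions are meaningful)
    """
    classes = np.unique(y)
    n = X.shape[1]
    mean_all = X.mean(axis=0)

    Sw = np.zeros((n, n))  # within-class scatter matrix
    Sb = np.zeros((n, n))  # between-class scatter matrix
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        Sw += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)

    # Eigendecompose W = inv(Sw) @ Sb; the product is not symmetric,
    # so use the general eig solver and take real parts.
    eigvals, eigvecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]  # largest eigenvalues first
    V = eigvecs[:, order[:d]].real          # keep the top-d eigenvectors

    return X @ V                            # (m, d) projected data
```

In practice libraries solve the generalized eigenproblem $S_b v = \lambda S_w v$ instead of inverting $S_w$ explicitly, since $S_w$ can be singular when $n > m$.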