Improved Nystrom Kernel Low-rank Approximation

version 1.0 (181 KB) by Kai Zhang

Efficient, self-contained MATLAB implementation of the improved Nystrom low-rank approximation


MATLAB Package for Improved Nystrom Low-rank Approximation
Kai Zhang (zk1980@hotmail.com)

Introduction

This package is a MATLAB implementation of the improved Nystrom low-rank approximation, which is widely used in large-scale machine learning and data mining problems. The package is self-contained and requires no additional toolbox or external library.

The improved Nystrom method uses k-means cluster centers as the landmark points, which can greatly improve the approximation quality of the kernel matrix, so the efficiency of the k-means step has to be taken into account. Since MATLAB's built-in kmeans function is quite inefficient, I wrote a faster one (eff_kmeans.m); the number of k-means iterations should be a small number, e.g. 5, as in the sketch below.
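For illustration, here is a minimal Lloyd-iteration sketch of that idea; simple_kmeans is a hypothetical name, sqdist is assumed to be the package's squared-distance helper, and the shipped eff_kmeans.m may differ in detail:

function center = simple_kmeans(data, m, maxiter)
% Hypothetical sketch of k-means landmark selection; not the shipped eff_kmeans.m.
% data: n-by-d sample matrix, m: number of landmarks, maxiter: small, e.g. 5
n = size(data, 1);
perm = randperm(n);
center = data(perm(1:m), :);                        % random initial centers
for it = 1:maxiter
    [~, idx] = min(sqdist(center', data'), [], 1);  % nearest center for each sample
    for j = 1:m
        members = data(idx == j, :);
        if ~isempty(members)
            center(j, :) = mean(members, 1);        % move each center to its cluster mean
        end
    end
end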

The main features of this package include:
1. Improved Nystrom low-rank approximation of the kernel matrix (RBF, linear, or polynomial kernel); see the sketch after this list
2. Large-scale KPCA, Laplacian eigenmap / spectral clustering, and MDS via the improved Nystrom low-rank approximation
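In outline, the method evaluates the kernel between all n samples and the m landmarks and factors the full kernel matrix through that block. The sketch below illustrates the RBF case, reusing the variable names exposed by INys.m (Ve, pidx, inVa, E, center, kernel.para, per the comments further down); the eff_kmeans signature is an assumption, and the shipped code may differ:

m = 100;                                           % number of landmark points
center = eff_kmeans(data, m, 5);                   % landmarks = k-means centers (assumed signature)
W = exp(-sqdist(center', center') / kernel.para);  % m-by-m kernel among the landmarks
E = exp(-sqdist(data', center') / kernel.para);    % n-by-m kernel between data and landmarks
[Ve, Va] = eig(W);                                 % eigendecomposition of the landmark kernel
va = diag(Va);
pidx = find(va > 1e-6);                            % keep only numerically positive eigenvalues
inVa = diag(va(pidx) .^ (-0.5));                   % inverse square root of the kept spectrum
G = E * Ve(:, pidx) * inVa;                        % Nystrom factor: Ktilde = G * G' approximates K

The rank of the approximation is at most m, so storage and computation drop from O(n^2) to O(nm) when m << n.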
If you have any suggestions or find any bugs, please email zk1980@hotmail.com. Thank you!

Citations

Kai Zhang, Ivor W. Tsang, and James T. Kwok. Improved Nyström Low-Rank Approximation and Error Analysis. In Proceedings of the 25th International Conference on Machine Learning (ICML 2008), Helsinki, Finland, June 2008.

Kai Zhang and James T. Kwok. Clustered Nyström Method for Large Scale Manifold Learning and Dimension Reduction. IEEE Transactions on Neural Networks, 21(10):1576-1587, 2010.

Copyright: All rights reserved by the author.

How to Use
Simply run demo.m to see the results.

Comments and Ratings (2)

Douglas

Hello Kai.

Could you please check whether the function INys_KPCA() really implements Proposition 1?

I think the second line of the code sample below is missing the L^(-1/2) factor, so that line should be V = G * U * L^(-1/2). Am I wrong?

1- [U, L] = eig(G' * G);
2- V = G * U;
3- [va, dex] = sort(diag(L), 'descend');
4- V = V(:, dex);
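If so, the corrected sample would read as below (L is the diagonal eigenvalue matrix, so L^(-1/2) is just its elementwise inverse square root); with this normalization V' * V = I and (G * G') * V = V * L, which is what KPCA needs:

1- [U, L] = eig(G' * G);
2- V = G * U * L^(-1/2);                 % normalize the eigenvectors
3- [va, dex] = sort(diag(L), 'descend');
4- V = V(:, dex);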

Thank you for the code and attention.

Joao Henriques

Works as advertised! To train a linear SVM in the Nystrom feature space, replace the computation of G and Ktilde at the end of INys() with:

M = Ve(:,pidx) * inVa;   % m-by-k map from landmark kernel values to Nystrom features
Mdata = E * M;           % all training samples mapped into the new feature space

then train the SVM on Mdata (which holds all training samples in the new feature space). To compute the same features for a test vector z, use:

Mz = exp(-sqdist(z', center')/kernel.para) * M;   % RBF kernel between z and the landmarks, then the same map

and classify with the same linear SVM.
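For example, with the LIBLINEAR MATLAB interface (just one possible trainer; y and ytest are hypothetical label vectors, and any linear SVM package would work the same way):

model = train(y, sparse(Mdata), '-s 2');    % L2-regularized linear SVM on the Nystrom features
yhat = predict(ytest, sparse(Mz), model);   % classify the mapped test vector with the same model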

MATLAB Release
MATLAB 6.5 (R13)
