
5.0 | 1 rating · 19 downloads (last 30 days) · File size: 181 KB · File ID: #38422 · Version: 1.0

Improved Nystrom Kernel Low-rank Approximation


Kai Zhang


An efficient, self-contained implementation of the improved Nyström low-rank approximation.


File Information

MATLAB Package for Improved Nyström Low-rank Approximation
Kai Zhang


This package is a MATLAB implementation of the improved Nyström low-rank approximation, which is widely used in large-scale machine learning and data mining problems. The package requires no toolboxes or external libraries.

The improved Nyström method uses k-means cluster centers as the landmark points, which greatly improves the approximation quality of the kernel matrix, so the efficiency of the k-means step must be taken into account. Because the built-in kmeans function in MATLAB is quite slow, I wrote a faster one (eff_kmeans.m); the number of k-means iterations should be kept small, e.g. 5.
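To illustrate the idea, here is a minimal sketch of a Nyström approximation with k-means cluster centers as landmarks. All variable names and the inline k-means loop are assumptions made for illustration; they are not the package's actual API.

```matlab
% Illustrative sketch of Nystrom approximation with k-means landmarks.
% Names (X, center, E, W, Ktilde) are assumptions, not the package's API.
n = 1000; d = 10; m = 50;
X = rand(n, d);                        % data matrix, one sample per row
sigma2 = 1;                            % RBF kernel width parameter

% Pairwise squared Euclidean distances between rows of A and rows of B
sq = @(A, B) bsxfun(@plus, sum(A.^2, 2), sum(B.^2, 2)') - 2 * (A * B');

% A few k-means iterations to pick landmark points (cluster centers)
center = X(randperm(n, m), :);         % random initialization
for it = 1:5
    [ignore, lab] = min(sq(X, center), [], 2);   % assign points to centers
    for j = 1:m
        if any(lab == j)
            center(j, :) = mean(X(lab == j, :), 1);
        end
    end
end

% Rank-m Nystrom approximation: Ktilde = E * pinv(W) * E' ~ K
E = exp(-sq(X, center) / sigma2);      % n x m cross kernel
W = exp(-sq(center, center) / sigma2); % m x m landmark kernel
Ktilde = E * pinv(W) * E';
```

In practice one would keep a low-rank factor G with Ktilde = G*G' (e.g. G = E times a square root of pinv(W)) rather than forming the n x n matrix Ktilde explicitly.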

The main features of this package include:
1. Improved Nystrom low-rank approximation of kernel matrix (RBF, linear, or polynomial kernel)
2. Large scale KPCA, Laplacian Eigenmap / spectral clustering, and MDS via improved Nystrom low-rank approximation
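As an illustration of how a low-rank factor enables large-scale KPCA, the top eigenpairs of the n x n matrix Ktilde = G*G' can be recovered from the small m x m matrix G'*G. The sketch below uses assumed variable names and is not the package's actual code.

```matlab
% Sketch: eigenpairs of Ktilde = G*G' (n x n) recovered from the small
% m x m eigenproblem on G'*G. Names here are illustrative assumptions.
n = 1000; m = 50;
G = randn(n, m);                       % low-rank factor, e.g. from Nystrom

[U, L] = eig(G' * G);                  % m x m eigenproblem
[lam, idx] = sort(diag(L), 'descend');
U = U(:, idx);

% If G'*G*u = lam*u, then G*G'*(G*u) = lam*(G*u) and ||G*u|| = sqrt(lam),
% so scaling by L^(-1/2) yields orthonormal eigenvectors of Ktilde:
V = G * U * diag(1 ./ sqrt(lam));

% KPCA embedding onto the top k principal components
k = 10;
Y = V(:, 1:k) * diag(sqrt(lam(1:k)));
```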
If you have any suggestions or find bugs, please email me. Thank you!


Kai Zhang, Ivor W. Tsang, and James T. Kwok. Improved Nyström Low-Rank Approximation and Error Analysis. In Proceedings of the 25th International Conference on Machine Learning (ICML 2008), Helsinki, Finland, June 2008.

Kai Zhang and James T. Kwok. Clustered Nyström Method for Large Scale Manifold Learning and Dimension Reduction. IEEE Transactions on Neural Networks, 21(10):1576-1587, 2010.

Copyright: All rights reserved by the author.

How to Use
Usage: simply run demo.m to see the results.

MATLAB release: MATLAB 6.5 (R13)
Other requirements: none
Comments and Ratings (2)
10 Jul 2015 Douglas

Hello Kai.

Please, could you check whether in function INys_KPCA() you are really implementing Proposition 1?

I think the second line of the code sample below is missing the L^(-1/2) factor, so that line should read V = G * U * L^(-1/2). Am I wrong?

[U, L] = eig(G'*G);
V = G * U;
[va, dex] = sort(diag(L), 'descend');
V = V(:, dex);

Thank you for the code and attention.

10 Dec 2012 Joao Henriques


Works as advertised! To train a linear SVM in the Nystrom feature-space, replace the computation of G and Ktilde at the end of INys() with:

M = Ve(:,pidx) * inVa;
Mdata = E * M;

then train SVM with Mdata (which has all training samples in the new feature space). To compute the same features for a test vector z, use:

Mz = exp(-sqdist(z', center')/kernel.para) * M;

and classify with the same linear SVM.
