Code covered by the BSD License  


File Size: 181 KB | File ID: #38422

Improved Nystrom Kernel Low-rank Approximation

by Kai Zhang

An efficient, self-contained implementation of the improved Nystrom low-rank approximation


File Information
Description

MATLAB Package for Improved Nystrom Low-rank Approximation
Kai Zhang {zk1980@hotmail.com}

Introduction

This package is a MATLAB implementation of the improved Nystrom low-rank approximation, which is widely used in large-scale machine learning and data mining. The package is self-contained and does not require any toolbox or external library.

The improved Nystrom method uses K-means cluster centers as the landmark points, which greatly improves the approximation quality of the kernel matrix; the efficiency of the K-means step therefore matters. Because MATLAB's built-in kmeans function is quite slow, the package includes a faster implementation (eff_kmeans.m); a small number of K-means iterations, e.g. 5, is sufficient.
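The package itself is MATLAB, but the idea is easy to sketch in NumPy (an illustrative sketch with my own function names, not the package's code): run a few Lloyd iterations to pick landmark centers, then form the Nystrom approximation K ≈ E W⁺ Eᵀ, where E is the n×m cross-kernel between the data and the landmarks and W is the m×m kernel among the landmarks:

```python
import numpy as np

def rbf(A, B, sigma2):
    # RBF kernel: exp(-||a - b||^2 / sigma2) for all row pairs
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / sigma2)

def nystrom_kmeans(X, m, sigma2, iters=5, seed=0):
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), m, replace=False)].copy()
    for _ in range(iters):            # a handful of Lloyd iterations suffices
        lab = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(m):
            if np.any(lab == j):      # keep the old center if a cluster empties
                C[j] = X[lab == j].mean(0)
    E = rbf(X, C, sigma2)             # n x m cross-kernel
    W = rbf(C, C, sigma2)             # m x m landmark kernel
    return E @ np.linalg.pinv(W) @ E.T   # rank-m approximation of K
```

With m = n distinct landmarks the approximation is exact (since K K⁺ K = K); with far fewer landmarks placed at cluster centers it stays close on clustered data, which is the point of the improved method.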

The main features of this package include:
1. Improved Nystrom low-rank approximation of kernel matrix (RBF, linear, or polynomial kernel)
2. Large scale KPCA, Laplacian Eigenmap / spectral clustering, and MDS via improved Nystrom low-rank approximation
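The point behind feature 2 is that the top eigenpairs of the full n×n kernel matrix can be recovered from the two small Nystrom factors alone, which is what makes large-scale KPCA and spectral embedding feasible. Here is a NumPy sketch of this standard eigen-extension (the function name is my own; the package's MATLAB code realizes the same idea):

```python
import numpy as np

def nystrom_eig(E, W, k):
    """Approximate top-k eigenpairs of K ~= E pinv(W) E^T without forming K.
    E: n x m cross-kernel with the landmarks, W: m x m landmark kernel (PSD)."""
    lam, U = np.linalg.eigh(W)
    keep = lam > 1e-10 * lam.max()              # drop numerically null directions
    G = E @ (U[:, keep] / np.sqrt(lam[keep]))   # K ~= G @ G.T
    S, V = np.linalg.eigh(G.T @ G)              # small m x m eigenproblem
    order = np.argsort(S)[::-1][:k]
    vals = S[order]
    vecs = G @ (V[:, order] / np.sqrt(vals))    # orthonormal eigenvectors of G @ G.T
    return vals, vecs
```

Only m×m eigendecompositions and n×m products are needed, so the cost is linear in n.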
If you have any suggestions or find a bug, please email zk1980@hotmail.com. Thank you!

Citations

Kai Zhang, Ivor W. Tsang, and James T. Kwok. Improved Nyström Low-Rank Approximation and Error Analysis. In Proceedings of the 25th International Conference on Machine Learning (ICML 2008), Helsinki, Finland, June 2008.

Kai Zhang and James T. Kwok. Clustered Nyström Method for Large Scale Manifold Learning and Dimension Reduction. IEEE Transactions on Neural Networks, 21(10):1576-1587, 2010.

Copyright: All rights reserved by the author.

How to Use
Usage: simply run demo.m to see the results.

MATLAB release: MATLAB 6.5 (R13)
Other requirements: none
Comments and Ratings (1)
10 Dec 2012 Joao Henriques

Works as advertised! To train a linear SVM in the Nystrom feature-space, replace the computation of G and Ktilde at the end of INys() with:

M = Ve(:,pidx) * inVa;  % project landmark-kernel values into the Nystrom feature space
Mdata = E * M;          % all training samples in the new feature space

then train SVM with Mdata (which has all training samples in the new feature space). To compute the same features for a test vector z, use:

Mz = exp(-sqdist(z', center')/kernel.para) * M;  % RBF kernel between z and the landmarks, then the same projection

and classify with the same linear SVM.
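In NumPy terms, the construction in this comment amounts to the following (a hedged sketch with hypothetical names; M plays the role of Ve(:,pidx) * inVa inside INys()):

```python
import numpy as np

def nystrom_feature_map(W):
    """Return M such that features g(z) = k(z, centers) @ M satisfy
    g(x) @ g(y) ~= the Nystrom-approximated kernel value between x and y."""
    lam, U = np.linalg.eigh(W)              # W: m x m landmark kernel
    keep = lam > 1e-10 * lam.max()          # guard against a singular W
    return U[:, keep] / np.sqrt(lam[keep])  # analogue of Ve(:,pidx) * inVa
```

Training features are then E @ M (the Mdata above), and a test point's features are its kernel row against the centers times M, exactly as in the Mz line; inner products of these features reproduce the Nystrom-approximated kernel, so a linear SVM on them approximates the kernel SVM.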
