The Gaussian kernel can be replaced with any desired kernel, although such a change will not dramatically improve results. This is a variant of ridge regression using the kernel trick (Mercer's theorem).
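In rough terms, the technique works as follows (a minimal Python/NumPy sketch of the same idea, not this submission's MATLAB code; all names are illustrative): fit dual coefficients alpha by solving (K + lambda*I) alpha = y on the training kernel matrix, then predict with the cross-kernel between new and training points.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2 * A @ B.T)
    return np.exp(-np.maximum(d2, 0) / (2 * sigma**2))

def fit_krr(X, y, lam=1e-3, sigma=0.5):
    # Solve (K + lam*I) alpha = y for the dual coefficients.
    K = rbf_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict_krr(X_train, alpha, X_new, sigma=0.5):
    # Prediction is the cross-kernel times the dual coefficients.
    return rbf_kernel(X_new, X_train, sigma) @ alpha

# Toy usage: fit a smooth 1-D function on a grid.
X = np.linspace(-1, 1, 50)[:, None]
y = np.sin(3 * X[:, 0])
alpha = fit_krr(X, y)
pred = predict_krr(X, alpha, X)
```

With a small regularizer and a smooth target, the training predictions should track y closely.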
Ambarish Jash (2020). Kernel Ridge Regression (https://www.mathworks.com/matlabcentral/fileexchange/27248-kernel-ridge-regression), MATLAB Central File Exchange. Retrieved .
If you normalized ahead of this, x_in will be slightly different from what you've seen before.
Thanks, it was a very good starting point for KRR.
I used it and it has worked well so far, but it is not very efficient for large datasets.
1) Why not define something like
so that the loop on lines 40-42 can simply be removed? The calculation of the kernel matrix on lines 33-38 only needs a slight modification to leave the diagonal untouched, since we know it is going to be 1 anyway.
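The point about the diagonal can be seen directly: for a Gaussian kernel, k(x, x) = exp(0) = 1, so a vectorized construction gets the diagonal right for free and the diagonal-fixing loop becomes unnecessary. A NumPy sketch of the idea (names are illustrative, not from the original file):

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma=1.0):
    # Pairwise squared distances via |a-b|^2 = |a|^2 + |b|^2 - 2 a.b.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    d2 = np.maximum(d2, 0)  # guard against tiny negative values from round-off
    K = np.exp(-d2 / (2 * sigma**2))
    # The diagonal is exp(0) = 1 by construction -- no separate loop needed.
    return K

X = np.random.default_rng(1).normal(size=(6, 2))
K = gaussian_kernel_matrix(X)
print(np.allclose(np.diag(K), 1.0))  # True
```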
2) Line 50 obtains the alphas with the inv() function, which is really slow and, according to MathWorks, should basically be avoided as much as possible: http://blogs.mathworks.com/loren/2007/05/16/purpose-of-inv/
Using something like
Klam = x_in + lamda*eye(size(x_in));
alpha = Klam\out_data;
makes it much faster.
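The NumPy analogue of the same substitution, for comparison: np.linalg.solve (the counterpart of MATLAB's backslash) factorizes the regularized kernel matrix once instead of forming an explicit inverse, which is faster and numerically better behaved. A sketch with illustrative names:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
G = rng.normal(size=(n, n))
K = G @ G.T + 1e-3 * np.eye(n)   # symmetric positive definite, like Klam above
y = rng.normal(size=n)

alpha_inv = np.linalg.inv(K) @ y      # what the inv()-based code computes
alpha_solve = np.linalg.solve(K, y)   # the backslash-style alternative

# Both give the same coefficients, but solve() avoids the explicit inverse.
same = np.allclose(alpha_inv, alpha_solve, rtol=1e-6, atol=1e-8)
```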
3) It is clearly possible to merge the multiple loops that calculate the kernel matrix with those that calculate final_ans, so that only one loop is used. I came up with one possibility; I am really not sure it is the best (which is why I do not post it here), but it works for what I do. It lets me handle, for example, in_data of size 2x10000 on my laptop, which takes a really long time in the original implementation (because of both the multiple loops and the use of inv()).
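One way to collapse the prediction loops as well (a sketch of the general idea, not the commenter's actual solution; names are illustrative): compute all pairwise squared distances between test and training points with broadcasting, so the predictions reduce to a single matrix-vector product.

```python
import numpy as np

def cross_kernel(Xtest, Xtrain, sigma=1.0):
    # All pairwise squared distances at once, no explicit loops.
    d2 = (np.sum(Xtest**2, axis=1)[:, None]
          + np.sum(Xtrain**2, axis=1)[None, :]
          - 2 * Xtest @ Xtrain.T)
    return np.exp(-np.maximum(d2, 0) / (2 * sigma**2))

rng = np.random.default_rng(3)
Xtrain = rng.normal(size=(100, 2))
alpha = rng.normal(size=100)
Xtest = rng.normal(size=(5, 2))

# Vectorized predictions: one cross-kernel, one matrix-vector product.
fast = cross_kernel(Xtest, Xtrain) @ alpha

# Loop version for comparison (what the original nested loops compute).
slow = np.array([
    sum(alpha[j] * np.exp(-np.sum((xt - Xtrain[j])**2) / 2) for j in range(100))
    for xt in Xtest
])
print(np.allclose(fast, slow))  # True
```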
Thanks again for the function.