Finding corresponding features in a pair of images is the basis of many optic flow, stereo vision and image registration algorithms. One straightforward approach to finding a match is to take a small patch of one image, compute its sliding cross-correlation with the other image, and find a peak. This submission supplies a class which implements this method.
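The basic idea can be sketched with the Image Processing Toolbox's normxcorr2 (a hedged illustration only, not the package's own code; the patch size and variable names are assumptions):

```matlab
% Sketch: match one patch by normalised cross-correlation.
% im1, im2 are greyscale images; (r, c) is a feature location in im1.
patch = im1(r-7:r+7, c-7:c+7);          % 15x15 patch around the feature
xc = normxcorr2(double(patch), double(im2));
[~, imax] = max(xc(:));                 % location of the correlation peak
[rpeak, cpeak] = ind2sub(size(xc), imax);
% normxcorr2 returns the full correlation, so subtract the patch offset
% to recover the centre of the matched patch in im2
rmatch = rpeak - 7;  cmatch = cpeak - 7;
```

The package automates this over a whole set of features rather than one patch at a time.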
There are many other ways of finding correspondences, but normalised cross-correlation is relatively easy to understand, and fairly effective if a sparse set of matches is sufficient and the change in viewpoint is not too large. This submission is intended largely as a learning aid, though it may be usable for some applications. A demonstration script is therefore included.
The algorithm extends the basic idea in two ways. First, a reverse match may be done on each feature pair to test for consistency; this eliminates many incorrect matches. Second, the correlation computation may be greatly speeded up using the SVD trick implemented in convolve2, at the cost of some accuracy and density of matches.
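The SVD trick rests on the fact that a rank-1 kernel separates into two 1-D convolutions, which are much cheaper than one 2-D convolution. A minimal sketch of the idea (the tolerance test and variable names are assumptions; see convolve2 itself for the full treatment, which also handles higher ranks):

```matlab
% Sketch of the SVD speed-up: if the correlation mask is close to rank 1,
% apply it as a column convolution followed by a row convolution.
[u, s, v] = svd(mask);
if s(2,2) / s(1,1) < tol                       % effectively rank 1?
    y = conv2(conv2(x, u(:,1)*s(1,1), 'same'), v(:,1)', 'same');
else
    y = conv2(x, mask, 'same');                % fall back to full 2-D convolution
end
```

For a P-by-P mask this cuts the per-pixel cost from O(P^2) to O(2P), at the price of a small approximation error when the mask is not exactly low rank.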
The core of the submission is object-oriented to facilitate its use with image sequences and to allow efficient reuse of parts of the computation.
WARNING -- this package contains a file called 'findpeaks.m'. If you have the Signal Processing Toolbox, this will conflict with MathWorks' own findpeaks.m. This caused me a bit of a headache to diagnose, and I couldn't find any other solutions online. Anyway, I got around it by renaming the file in this package to 'findpeakscc.m' and searching the other files for any instances of findpeaks. The only one I found was in varPeaks.m, so if you change the function name, the file name, and the name of the called function in varPeaks.m, everything should be back in working order.
Is there any way to keep track of which points are lost when a match is not found? For example, it seems that by default, if you're tracking five points in a 2x5 table and one is lost, the table shrinks to 2x4. If the lost point was in the middle, all of the later columns shift left to fill the gap. For my application, though, I need to know where in the table that point dropped out, so I can re-select its location manually and resume tracking it. Is there a way within the code to make this happen? What would your approach to the problem be?
Thanks! Regardless, even if this issue can't easily be resolved, this is a very well done and extraordinarily useful package!
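[Editor's note: one possible approach, not part of the package — keep a fixed-width array and mark lost features with NaN. This sketch assumes a 2-by-M matrix of surviving matches and an index vector 'kept' saying which original features they correspond to:]

```matlab
% Sketch: preserve column positions for lost features using NaN markers.
tracked = nan(2, nFeatures);        % one column per original feature
tracked(:, kept) = matches;         % matches is 2-by-numel(kept)
lost = find(any(isnan(tracked)));   % columns to re-select manually
```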
sush: Maybe you have a file called edge.m on your path that is hiding the toolbox edge function. What is the output of the command "which edge"? If the result isn't in the Image Processing Toolbox, you could try renaming your edge.m file.
It's showing the error "attempt to call edge as a script function" - why?
farah: Yes, you can use another set of images. You'll see that the images are read from disk at the start of the demo, and you can change those lines to read different images. However, it will be better in the end to write your own code to call correlCorresp rather than relying on the demo. If you are not sure how to proceed, you may need to work through "Getting Started with Matlab" in the documentation.
MD JAYED Hussan: The corresponding coordinates are held in the corresps property of a correlCorresp object, after a call to the findCorresps method. You can see how the vectors are computed by inspecting the code for correspDisplay. It's all in the documentation, by the way.
The code is working great. Can you point me to where I need to modify it to get the length of the blue lines, so that I can calculate the velocity of shifting between two slides when I have the frame rate?
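[Editor's note: a sketch of one way to do this, assuming the corresps rows are [x1; y1; x2; y2] as used in the demo code elsewhere in this thread, and that frameRate is in frames per second:]

```matlab
% Sketch: displacement line lengths and per-frame velocity.
d = hypot(cc.corresps(3,:) - cc.corresps(1,:), ...
          cc.corresps(4,:) - cc.corresps(2,:));   % lengths in pixels
v = d * frameRate;                                % pixels per second
```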
Can't I use another set of images? If yes, then what change is required, and in which part of the code?
katia - sorry not to get back sooner. I hope you've solved the problem by now. It's easy - just look at the code that does the display for the demo and modify it to do what you want.
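[Editor's note: a minimal sketch of the kind of modification meant here, assuming corresps rows are [x1; y1; x2; y2] as in the demo; these are the matched features, shown on each image separately before drawing the correspondence lines:]

```matlab
% Sketch: display matched feature points on each image separately.
figure; imshow(image1); hold on
plot(cc.corresps(1,:), cc.corresps(2,:), 'g+');   % features in image 1
figure; imshow(image2); hold on
plot(cc.corresps(3,:), cc.corresps(4,:), 'g+');   % matched positions in image 2
```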
Is there any way to display the feature points in each of the two images before showing the correspondences between them?
vamsi - It is possible to get an intermediate view using these correspondences, but how successful it will be depends on the complexity of the flow field. Extending the demo in the FEX package, you can do something like this:
x1 = cc.corresps(1,:);
y1 = cc.corresps(2,:);
x2 = cc.corresps(3,:);
y2 = cc.corresps(4,:);
input_points = [x1' y1'];
base_points = [(x1+x2)' (y1+y2)']/2; % halfway between matching features
% Generate transform from image1 to half-way position
tform = cp2tform(input_points, base_points, 'piecewise linear');
% Apply it to image 1
interp = imtransform(image1, tform, 'Xdata', [1 size(image1,2)], 'Ydata', [1 size(image1,1)]);
to get an image intermediate between the two images. I've found that for complex flow fields a piecewise linear transform can't be found, but the lwm option can work. I can't make a recommendation though - you have to experiment with your own data set.
david - I am trying to get an intermediate view between these two images without a disparity map, so I need to know the amount by which each pixel has moved. Can I do image morphing with this available sparse set? In that case, which morphing technique would you suggest?
vamsi - I'm glad it's helpful. I am sorry, but I don't know the best way to make a dense set - that's a difficult problem, and there's a lot of research literature to look at. Interpolation is not good because the flow field is discontinuous, and so it's necessary to go back to the original images to do segmentation. Sorry not to be more help, but it's a big question.
Hi david. Thank you so much for this work. It really helped me a lot. After getting the correspondences, how can I make a dense set from the sparse set obtained? I tried some interpolation techniques, but they did not work. Please help me.
Marco - It's necessary to convert rgb images to greyscale (using rgb2gray).
David - So what was the resolution? I'm getting the same errors that Michael did. I am running 7.9.0 (R2009b).
Note to potential users: the issue identified by Michael was resolved between us - there isn't a problem with the code.
That's puzzling! What does "which conv2 -all" print?
Forward matches: ??? Undefined function or method 'conv2' for input arguments of
and attributes 'full 3d real'.
Error in ==> convolve2>doconv at 95
y = conv2(conv2(x, u(:,1)*s(1), shape), vp(1,:), shape);
Error in ==> convolve2 at 71
y = doconv(x, m, shape, tol);
Error in ==> patch_var at 29
a = convolve2(x, m, shape);
Error in ==> varPeaks at 17
vars = patch_var(im, patchsize);
Error in ==> correlCorresp.correlCorresp>correlCorresp.findFeatures at 466
[r, c] = varPeaks(cc.im1, cc.fPS, cc.rT);
Error in ==> correlCorresp.correlCorresp>correlCorresp.findCorrespsFwd at 545
cc = cc.findFeatures;
Error in ==> correlCorresp.correlCorresp>correlCorresp.findCorresps at 489
cc = cc.findCorrespsFwd;
Error in ==> correspDemo_1 at 38
cc = cc.findCorresps;
??? Error using ==> conv2
Not enough input arguments.
So I see conv2 but it does not seem to like it, might you know why?
Ah, silly me. 2010a, right above. I will give that a go and see if that fixes it.
I'm not sure what the problem is, but I wonder if it's a version problem - what version of Matlab are you using?
You don't need to put the files in a special "@" directory - they just go in a folder on your path as normal. But I don't think that in itself would cause the error you observe.
So I am a little new to class definitions in Matlab, but I placed correlCorresp.m in a folder @correlCorresp and modified the demo to include 2 of my .png files.
??? Error: File: /mnt/qfs4/mcoughlin/IM/daily/2010_05_01/plots/crossCorr/@correlCorresp/correlCorresp.m Line: 190 Column: 28
Undefined function or variable 'private'.
Error in ==> correspDemo_1 at 24
cc = correlCorresp('image1', image1, 'image2', image2, 'printProgress', 100);
So that means it is complaining about this line:
properties (Dependent, SetAccess = private)
Do you happen to know what I am doing incorrectly?
Thank you for your help,
Thanks, Ulrich. I don't have a publication on this code, so no real reference, though the SVD trick that speeds up the correlations is described here: 'Computer and Robot Vision' Vol I, by R.M. Haralick and L.G. Shapiro (Addison-Wesley 1992), pp. 298-299.
Subpixel displacements: yes, I've been thinking about this, but I haven't done it yet.
Wow, this is great. I've been trying to get similar results for landslide displacements using DEMs for quite some time now, and this is faster, more reliable, and gave me results straight away. Thanks for this. Now I have to modify it a bit for subpixel displacements. The reverse matching nicely sorts out mismatches in my case. Is there any reference I can use when I include this in my thesis?
Minor updates to comments. Update to method used by findpeaks, which now calls imregionalmax.
Revised the properties that control user-specified feature locations and propagation of features in image sequences.
Code tidied; demo extended.