# X, Y and Z co-ordinates from 3D point cloud are not accurate. Showing considerable error in X co-ordinates.

11 views (last 30 days)
Naseeb Gill on 2 Feb 2016
Edited: Dima Lisin on 15 Feb 2016
I reconstructed a 3D scene using MATLAB R2014b, and from the point cloud I have to find the X, Y and Z coordinates of a white object in the scene. To do this, I find the centroids of the white objects in image 1 (I1) and use those values (which are in pixels, of course) to extract the X, Y and Z values. But when I compared these extracted values with the actual values (which I measured with a scale from the left camera, i.e. camera 1), there was a huge difference.

Then I noticed that my actual image I1 is 480x640 while my rectified image J1 is 442x604. Just as a trial, I found the centroid of the white object in J1 (using impixelinfo) and extracted the values. These values are far better than the ones I got using I1, but there is still an average error of 7.31 cm in X, 0.5 cm in Y and 4.0 cm in Z. I can tolerate 0.5 cm of error, so the Y coordinate is not a problem, but I want more accuracy in the X and Z coordinates (preferably using image I1, because my code finds centroids in I1, not J1).

I have calibrated my stereo camera many times and still get the same problem. One more thing I noticed: as the distance along the X-axis from the left camera increases, my accuracy increases. I can't understand why. I'm attaching my stereoParams and Excel files showing the error.
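For reference, my centroid step is along these lines (a sketch only; the threshold value and variable names here are illustrative, not my exact code):

```matlab
% Illustrative sketch of the centroid extraction (threshold is an assumption)
bw = im2bw(rgb2gray(I1), 0.8);          % segment the bright white object
stats = regionprops(bw, 'Centroid');    % centroids in pixel coordinates
c = round(stats(1).Centroid);           % [x y], where x = column, y = row
```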

Dima Lisin on 2 Feb 2016
Hi Naseeb,
First, based on what you wrote, I would guess that you may be confusing camera 1 and camera 2. According to your stereoParams object, your cameras are 9.1 cm apart, so if you are measuring object locations relative to camera 2 instead of camera 1, that would explain most of the error in X.
Second, please keep in mind that the world coordinate system (assuming you are using disparity() and reconstructScene()) has its origin at the principal point of camera 1, which is located somewhere inside the camera casing and is therefore hard to measure from. A better test of reconstruction accuracy would be to measure the distance between two points in the world.
So, please check that you don't have camera 1 and camera 2 mixed up. If that does not help, could you please post a sample pair of images? Also, please clarify exactly how you are doing the reconstruction. Are you calling rectifyStereoImages, followed by disparity, followed by reconstructScene? If so, then you should indeed be using the rectified images to look up the 3D world coordinates.
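In sketch form, that pipeline looks like this (image and variable names are placeholders, not your exact code):

```matlab
% Standard stereo reconstruction pipeline in MATLAB R2014b (sketch)
[J1, J2] = rectifyStereoImages(I1, I2, stereoParams);
disparityMap = disparity(rgb2gray(J1), rgb2gray(J2));
xyzPoints = reconstructScene(disparityMap, stereoParams); % units of the calibration

% Look up the 3D location at a centroid found in the RECTIFIED image J1:
c = round(centroidInJ1);                  % [x y] pixel coordinates in J1, not I1
XYZ = squeeze(xyzPoints(c(2), c(1), :));  % [X; Y; Z] relative to camera 1
```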
Dima Lisin on 15 Feb 2016
Hi Naseeb,
I tried running your code, and I plotted your TransformedPoints using plot3. Here's what they look like from the "top":
And here's what they look like from the "side":
In other words, you have a nice reconstruction in X and Y, but errors of 2-3 cm in Z. So what you are doing is correct, but your reconstruction has errors.
If these errors are unacceptable, there are a few things you can do. One is to use a higher-resolution camera. Another thing that may help is to measure the size of the checkerboard square more precisely, using a caliper. If you can measure it to an accuracy of 0.1 mm rather than 1 mm, that can improve the accuracy of the reconstruction.
You can also try improving the calibration accuracy by using even more calibration images. I would also try using a larger checkerboard, which may improve corner localization accuracy.
Moving the cameras a little farther apart can generally help, but probably not in this case, because your object of interest is very close to the cameras.
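To get a rough sense of why resolution, baseline, and distance matter: with focal length $f$ (in pixels), baseline $B$, and disparity uncertainty $\Delta d$, depth and its uncertainty behave as

$$Z = \frac{fB}{d}, \qquad \Delta Z \approx \frac{Z^2}{fB}\,\Delta d,$$

so the depth error grows quadratically with depth. With your 9.1 cm baseline, and assuming for illustration $f \approx 800$ px (a typical value for a 640x480 sensor, not taken from your stereoParams) and $\Delta d = 0.5$ px, this gives roughly 0.7 cm of depth uncertainty at 1 m and about 2.7 cm at 2 m, which is in the range of the Z errors above.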
Also, I do not know what your stereo camera rig looks like. You absolutely have to make sure that the cameras do not move relative to each other. Even a slight shake will cause noticeable reconstruction errors.
Another thought: your checkerboard looks a bit wavy. Try making it as flat as you can. For example, glue it to a flat piece of plastic.