# Bad Image Rectification after Stereo Calibration and Image Rectification (From 2D to 3D)

Jan Bienstman on 11 Apr 2022
Hi,
I'm not new to MATLAB (> 20 years of experience), but I am new to 3D vision and the stereo calibration (app).
Here is what I have: a setup with two cameras (FLIR) at about 60 to 70 cm above a plate where a sample is photographed.
Setup:
Pictures:
I have taken pictures of the checkerboard pattern following the guideline. Here are some samples together with the checkerboard pattern:
Problem:
Using the "Stereo Camera Calibrator", I can feed it with good quality pictures from the checkerboard pattern.
The program can nicely identify the control points:
However, when I show the rectified view, it shows the following:
... which at first sight looked OK to me (the projection of both images is OK with respect to the horizontal lines),
but one picture is shown completely on the left and the other completely on the right.
Which results in (using stereoAnaglyph)
What am I doing wrong?
I thought this setup was quite controlled, with fixed camera positions (and the ability to measure distances and angles). Is there a way to feed the "Stereo Camera Calibration" algorithm with more inputs (these known distances?) and do the optimization using this preset?
Looking forward to your suggestions,
Thanks,
Jan
##### 2 Comments
Jan Bienstman on 11 Apr 2022
Hi Kalyan,
Thanks!
I'm digging into the documentation. I can set fixed intrinsics
(intrinsics = cameraIntrinsics(focalLength,principalPoint,imageSize)) because I know them (camera properties, etc.).
However, no improvement yet.
I tried with estimateCameraParameters
Same results ...
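Roughly what I tried (a sketch; all numbers below are placeholders, not the real FLIR values):

```matlab
% Fixed intrinsics from known camera properties (placeholder numbers).
% Focal length in pixels = focal length in mm / pixel pitch in mm.
focalLength    = [4000 4000];   % [fx fy] in pixels
principalPoint = [640 512];     % image center [cx cy] in pixels
imageSize      = [1024 1280];   % [mrows ncols]
intrinsics = cameraIntrinsics(focalLength, principalPoint, imageSize);
```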
I fear it has to do with a "too good" setup (meaning the two checkerboard images look too identical), so that the algorithm creates ill-defined matrices ... though I believe that a stereo system with two cameras 25 cm apart and an object distance of 60 to 70 cm should be reasonable (focal length = 20 mm). (Compare with the eye.)
I guess/fear I had better compute the parameters directly from my known geometry. But what is the value of the calibration app then...
OK, still very eager to learn how to do better ;-)
BR.
Jan


### Answers (3)

Benjamin Thompson on 11 Apr 2022
If you are using this function, note its description in the documentation:
[J1,J2] = rectifyStereoImages(I1,I2,stereoParams) returns undistorted and rectified versions of I1 and I2 input images using the stereo parameters stored in the stereoParams object.
Stereo image rectification projects images onto a common image plane in such a way that the corresponding points have the same row coordinates. This image projection makes the image appear as though the two cameras are parallel. Use the disparityBM or disparitySGM functions to compute a disparity map from the rectified images for 3-D scene reconstruction.
So if the cameras are far apart and not pointing in the same direction, you should expect a greater amount of adjustment in the images. If you then use one of the disparity functions the results may make more sense.
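As a sketch of that workflow (the file names are placeholders, and stereoParams is assumed to be exported from the Stereo Camera Calibrator):

```matlab
% Rectify a stereo pair and inspect the result (placeholder file names).
I1 = imread('left01.png');
I2 = imread('right01.png');
[J1, J2] = rectifyStereoImages(I1, I2, stereoParams);
imshow(stereoAnaglyph(J1, J2))           % red-cyan composite of the rectified pair

% Disparity is only meaningful on the rectified pair.
disparityMap = disparitySGM(im2gray(J1), im2gray(J2));
figure, imshow(disparityMap, [0 128]), colormap jet, colorbar
```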
Also, doing calibrations with a smaller checkerboard that is moved to different regions of the shared camera viewing space gives better calibration results. The app requires multiple images to even start calibration, and those images should show the board in different spots, sampled at the same time by both cameras.
##### 1 Comment
Jan Bienstman on 11 Apr 2022
Good suggestions. Thanks!


Giridharan Kumaravelu on 15 Apr 2022
Edited: Giridharan Kumaravelu on 15 Apr 2022
I agree with Benjamin here on the number of calibration images used in the calibrator app. Two image pairs with similar orientations are not enough for calibrating this stereo system.
Try capturing at least 10 image pairs where the orientations of the checkerboards differ, as described here: Prepare Camera and Capture Images.


Jan Bienstman on 20 Apr 2022
Hi,
I made three new series of pictures of the checkerboard pattern, each with about 7 to 8 pictures, now with the checkerboard in different orientations as suggested (Prepare Camera and Capture Images). The size of the checkerboard differs per series. See Figure 1 with an (assembled) snapshot of the left-camera pictures.
Figure 1: three series of calibration images
However, the result is the same (see Figure 2): no overlap in "Show rectified" and in "stereoAnaglyph".
The y-coordinates match, but the x-coordinates are completely "exclusive".
Figure 2: Result after calibration & Show rectified: Corresponding points have same y-coordinate, but wrong x-coordinate (no overlap between images, which results in a non-overlapping stereoAnaglyph).
What I notice is that the computed camera positions look relatively OK (they match the different patterns in one plane), but the distance between the cameras and the checkerboard patterns is too low: computed about 350 mm, in reality about 700 mm. (See Figure 3.)
Figure 3: computed camera positions (camera-centric). Compare with the first image of the real setup at the beginning of the thread.
The only thing that differs from Prepare Camera and Capture Images is that all my pictures are in the same plane.
Could that be a reason?
Alternatively: is there another method to compute the stereoParameters directly, just using known distances, angles, and mathematics? Is there a more in-depth description of the model(s) used inside?
Thanks in advance,
Jan
##### 1 Comment
Giridharan Kumaravelu on 20 Apr 2022
Hello Jan,
Here are a few suggestions based on your description:
1. For the series of calibration images that you used in Figure 2, you could try removing image pair 1, as both images produce high reprojection errors (> 1 pixel). You could right-click that image thumbnail and select "Remove and Recalibrate" to see if that improves the result.
2. You are correct, capturing all images of the calibration pattern in the same plane could be one of the reasons for this poor result.
3. For wide baseline stereo systems like your setup, it is sometimes best to calibrate the two cameras individually to produce the cameraParameters objects and then use these as fixed intrinsics and estimate baseline. That is,
• Use the camera calibrator app, to calibrate the two cameras in two sessions and export the parameters to workspace as cameraParamsLeft and cameraParamsRight.
• In the Stereo Camera Calibrator app, after loading your images you can choose "Use Fixed Intrinsics" on the toolbar and load these intrinsics from the workspace that were exported in the previous step. Calibrate the stereo camera with these loaded intrinsics.
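Programmatically, the same two-step idea might look like this (a sketch: the image folders and the checkerboard square size are placeholders, and cameraParamsLeft/cameraParamsRight are assumed to have been exported from the single-camera sessions):

```matlab
% Detect the checkerboard in matched left/right image pairs
% (placeholder folder names).
leftImages  = imageDatastore('calib/left');
rightImages = imageDatastore('calib/right');
[imagePoints, boardSize] = detectCheckerboardPoints( ...
    leftImages.Files, rightImages.Files);
worldPoints = generateCheckerboardPoints(boardSize, 25);  % 25 mm squares (placeholder)

% Keep the single-camera intrinsics fixed and estimate only the extrinsics.
stereoParams = estimateStereoBaseline(imagePoints, worldPoints, ...
    cameraParamsLeft.Intrinsics, cameraParamsRight.Intrinsics);
```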
To answer your question about the alternative workflow:
1. You can create the stereoParameters object with known distances and angles from the MATLAB commandline using second syntax shown here: https://www.mathworks.com/help/vision/ref/stereoparameters.html?s_tid=doc_ta#d123e126627
2. The following page explains the pinhole model and the distortion model used inside: https://www.mathworks.com/help/vision/ug/camera-calibration.html
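For example, a stereoParameters object built purely from measured geometry might look like this (a sketch using the R2022a-era syntax; every number is a placeholder for the actual measured value):

```matlab
% Intrinsics from the known focal length / pixel pitch (placeholder numbers).
fx = 4000; fy = 4000; cx = 640; cy = 512;
K = [fx 0 0; 0 fy 0; cx cy 1];     % pre-R2022b "IntrinsicMatrix" convention
camParams = cameraParameters('IntrinsicMatrix', K);

% Extrinsics of camera 2 relative to camera 1, from the measured setup.
rotationOfCamera2    = eye(3);     % placeholder: assume parallel cameras
translationOfCamera2 = [250 0 0];  % placeholder: 250 mm baseline along x

stereoParams = stereoParameters(camParams, camParams, ...
    rotationOfCamera2, translationOfCamera2);
```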
Hope this helps,
Giridharan


R2022a
