Bad Image Rectification after Stereo Calibration and Image Rectification (From 2D to 3D)

Hi,
I'm not new to MATLAB (> 20 years of experience), but I am new to 3D vision and the Stereo Camera Calibrator app.
Here is what I have: a setup with two FLIR cameras about 60-70 cm above a plate on which a sample is photographed.
Setup:
Pictures:
I have taken pictures of the checkerboard pattern following the guideline. Here are some samples together with the checkerboard pattern.
Problem:
Using the "Stereo Camera Calibrator", I can feed it with good quality pictures from the checkerboard pattern.
the program nicely identifies the control points:
However, when I show the rectified view, it shows the following:
... which at first sight looked OK to me (the projection of both images is consistent with respect to the horizontal lines),
but one picture is shown completely on the left and the other completely on the right.
Which results in the following (using stereoAnaglyph):
What am I doing wrong?
I thought this setup is quite controlled, with fixed camera positions (and the ability to measure distances and angles). Is there a way to feed the Stereo Camera Calibrator algorithm more inputs (these known distances?) and run the optimization with those presets?
Looking forward to your suggestions,
Thanks,
Jan
  2 Comments
KALYAN ACHARJYA on 11 Apr 2022
A very good, quite detailed question, but it is more conceptual/technical than MATLAB-specific; I hope respected members can offer advice.
Jan Bienstman on 11 Apr 2022
Hi Kalyan,
Thanks!
I'm digging into the documentation. I can set fixed intrinsics
(intrinsics = cameraIntrinsics(focalLength,principalPoint,imageSize)) because I know them (camera properties, etc.).
However, no improvement yet.
I tried with estimateCameraParameters
Same results ...
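A minimal sketch of that attempt (the numeric values here are placeholders, not my actual camera data):

```matlab
% Build fixed intrinsics from known camera properties (placeholder values)
focalLength    = [3500 3500];   % [fx fy] in pixels, derived from the 20 mm lens
principalPoint = [1024 768];    % optical center, in pixels
imageSize      = [1536 2048];   % [mrows ncols]
intrinsics = cameraIntrinsics(focalLength, principalPoint, imageSize);
```

These can then be loaded in the Stereo Camera Calibrator via "Use Fixed Intrinsics".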
I fear it has to do with a "too good" setup (meaning the two checkerboard images look too identical), so that the algorithm creates ill-defined matrices ... though I believe that a stereo system with two cameras 25 cm apart and an object distance of 60-70 cm would be reasonable (focal length = 20 mm; compare with the eye).
I guess/fear I had better compute the parameters directly from my known geometry. But what is the value of the calibration app then ...
OK, still very eager to learn how to do better ;-)
BR.
Jan


Answers (4)

hongliang on 22 Feb 2024
Hello, you don't have to worry at all; the calibration you got is fairly good. The reason for the separation is that your system's disparity is too large. I ran into the same problem recently and solved it.
This is my calibration result:
My results (using stereoAnaglyph):
You can measure the disparity roughly with the Image Viewer app in the Image Processing Toolbox in MATLAB:
You can see the disparity range is about 500-550. In this case you cannot use disparityBM to generate the disparity map, because its maximum disparity range lies within [0 256]. Use disparitySGM instead, since there is no upper limit on its disparity range; the only requirement is that the difference between the maximum and minimum should be below 128.
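As a rough sketch of that step (the range values are illustrative; J1 and J2 are assumed to be the rectified images):

```matlab
% J1, J2: rectified images from rectifyStereoImages
% disparitySGM accepts a shifted range; the width (max - min) must be
% divisible by 8 and no larger than 128
disparityMap = disparitySGM(im2gray(J1), im2gray(J2), ...
    'DisparityRange', [448 576]);
```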
This is what disparitySGM function generate:
This is the corresponding 3d reconstruction for the image:
You can see the result is fine.
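A sketch of the reconstruction step, assuming the disparity map from disparitySGM and the reprojection matrix returned by rectifyStereoImages:

```matlab
% Third output gives the reprojection matrix needed for reconstruction
[J1, J2, reprojectionMatrix] = rectifyStereoImages(I1, I2, stereoParams);
% disparityMap computed from J1/J2 with disparitySGM as above
xyzPoints = reconstructScene(disparityMap, reprojectionMatrix);
pcshow(pointCloud(xyzPoints));   % visualize the 3D reconstruction
```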
Instead, if you use the disparityBM function regardless of the large disparity, you will get this result:
This is unusable for 3D reconstruction.
So there is no need to worry about the large disparity; just choose the right function to continue processing.
However, I have run the same calibration and SGM matching using OpenCV functions in Visual Studio, and the result is fairly normal: there are no such long black bars and the image is not stretched so much. This is strange.
  1 Comment
Suryaansh Rathinam on 22 Mar 2024 at 5:12
Hi @hongliang, I am working on a stereo calibration and depth mapping problem and I am facing the following error: "Unable to estimate camera parameters. The 3-D orientations of the calibration pattern might be too similar across images. Remove any similar images and recalibrate the cameras again."
Could we please connect, and could you help me if possible: suryaansh2002@gmail.com
Thank you



Benjamin Thompson on 11 Apr 2022
If you are using this function, note its description in the documentation:
[J1,J2] = rectifyStereoImages(I1,I2,stereoParams) returns undistorted and rectified versions of I1 and I2 input images using the stereo parameters stored in the stereoParams object.
Stereo image rectification projects images onto a common image plane in such a way that the corresponding points have the same row coordinates. This image projection makes the image appear as though the two cameras are parallel. Use the disparityBM or disparitySGM functions to compute a disparity map from the rectified images for 3-D scene reconstruction.
So if the cameras are far apart and not pointing in the same direction, you should expect a greater amount of adjustment in the images. If you then use one of the disparity functions, the results may make more sense.
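A quick way to sanity-check the rectification, as a sketch (I1 and I2 are the original image pair):

```matlab
% Rectify, then overlay: corresponding features should sit on the same rows
[J1, J2] = rectifyStereoImages(I1, I2, stereoParams);
imshow(stereoAnaglyph(J1, J2));
```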
Also, doing calibrations with a smaller checkerboard that is moved to different regions of the shared camera viewing space gives better calibration results. The app requires multiple images to even start calibration, and those images should have the board in different spots, captured at the same time by both cameras.
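The same multi-image calibration can be sketched programmatically (the file names and square size here are hypothetical):

```matlab
% Hypothetical lists of simultaneously captured image pairs
leftImages  = {'left01.png','left02.png','left03.png'};
rightImages = {'right01.png','right02.png','right03.png'};

% Detect checkerboard corners in both cameras at once
[imagePoints, boardSize, pairsUsed] = ...
    detectCheckerboardPoints(leftImages, rightImages);

% World coordinates of the corners; square size assumed to be 25 mm
squareSize = 25;
worldPoints = generateCheckerboardPoints(boardSize, squareSize);

% Full stereo calibration
stereoParams = estimateCameraParameters(imagePoints, worldPoints, ...
    'WorldUnits', 'mm');
```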

Giridharan Kumaravelu on 15 Apr 2022
Edited: Giridharan Kumaravelu on 15 Apr 2022
I agree with Benjamin here on the number of calibration images used in the calibrator app. Two image pairs looking similar in orientation are not enough for calibrating this stereo system.
Try capturing at least 10 image pairs where the orientations of the checkerboards are different, as described here: Prepare Camera and Capture Images.

Jan Bienstman on 20 Apr 2022
Hi,
I made three new series of pictures of the checkerboard pattern, each with ~7-8 pictures, now with the checkerboard in different orientations as suggested (Prepare Camera and Capture Images). The size of the checkerboard is different per series. See Figure 1 with (assembled) snapshots of the left camera pictures.
Figure 1: three series of calibration images
However, the result is the same. (See Figure 2): No overlap in "Show rectified" and in "stereoAnaglyph".
The y-coordinates are matching, but the x-coordinates are completely "exclusive".
Figure 2: Result after calibration & Show rectified: Corresponding points have same y-coordinate, but wrong x-coordinate (no overlap between images, which results in a non-overlapping stereoAnaglyph).
What I notice is that the computed camera positions look relatively OK (they match the different patterns in one plane), but the distance between the cameras and the checkerboard patterns is too low: about 350 mm computed versus about 700 mm in reality. (See Figure 3.)
Figure 3: computed camera positions (camera-centric). Compare with the first image of the real setup at the beginning of this thread.
The only thing which is different from Prepare Camera and Capture Images is that all my pictures are in the same plane.
Could that be a reason?
Alternatively: is there another method to directly compute the stereoParameters using just known distances, angles and mathematics? Is there a more in-depth description of the model(s) used inside?
Thanks in advance,
Jan
  1 Comment
Giridharan Kumaravelu on 20 Apr 2022
Hello Jan,
Here are a few suggestions based on your description:
  1. For the series of calibration images that you used in figure 2, you could try removing the image pair 1 as both the images produce high reprojection errors (> 1 pixel error). You could right click that image thumbnail and select "Remove and Recalibrate" to see if that improves the result.
  2. You are correct, capturing all images of the calibration pattern in the same plane could be one of the reasons for this poor result.
  3. For wide baseline stereo systems like your setup, it is sometimes best to calibrate the two cameras individually to produce cameraParameters objects and then use these as fixed intrinsics while estimating the baseline. That is,
  • Use the camera calibrator app, to calibrate the two cameras in two sessions and export the parameters to workspace as cameraParamsLeft and cameraParamsRight.
  • In the stereo camera calibrator app, after loading your images you can choose "Use Fixed Intrinsics" on the toolbar and load the intrinsics from the workspace that were exported in the previous step. Calibrate the stereo camera with these loaded intrinsics.
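Programmatically, that fixed-intrinsics workflow looks roughly like this (assuming imagePoints/worldPoints from the checkerboard detection and the two exported parameter sets):

```matlab
% Intrinsics from the two single-camera calibration sessions
intrinsics1 = cameraParamsLeft.Intrinsics;
intrinsics2 = cameraParamsRight.Intrinsics;

% Keep the intrinsics fixed and estimate only the stereo extrinsics
stereoParams = estimateStereoBaseline(imagePoints, worldPoints, ...
    intrinsics1, intrinsics2);
```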
To answer your question about the alternative workflow:
  1. You can create the stereoParameters object with known distances and angles from the MATLAB commandline using second syntax shown here: https://www.mathworks.com/help/vision/ref/stereoparameters.html?s_tid=doc_ta#d123e126627
  2. The following page explains the pinhole model and the distortion model used inside: https://www.mathworks.com/help/vision/ug/camera-calibration.html
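For example, a rough sketch of option 1 (the rotation and translation values are placeholders for your measured geometry):

```matlab
% Known/measured geometry of camera 2 relative to camera 1 (placeholders)
rotationOfCamera2    = eye(3);       % e.g., derived from your measured angles
translationOfCamera2 = [250 0 0];    % baseline in mm

% cameraParams1/cameraParams2 from individual calibrations or known intrinsics
stereoParams = stereoParameters(cameraParams1, cameraParams2, ...
    rotationOfCamera2, translationOfCamera2);
```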
Hope this helps,
Giridharan


Release

R2022a