A laser beam is projected onto an object in the field of view of a camera. The beam is ideally parallel to the camera's optical axis. The dot from the laser is captured along with the rest of the scene, and a simple algorithm scans the image for the brightest pixels. Assuming the laser dot is the brightest area of the scene, its position in the image frame is then known. The range to the object is calculated from where along the y axis of the image the dot falls: the closer the dot is to the center of the image, the farther away the object is. The range is D = h / tan(theta), where theta = pfc * rpc + ro.
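As a sketch of the pipeline described above (brightest-pixel search, then the range equation), here is a minimal Python version. The calibration constants `h`, `rpc`, and `ro` are made-up placeholders, not real values; you would measure them for your own camera/laser rig.

```python
import math

def brightest_pixel(gray):
    """Return (row, col) of the brightest pixel in a 2D grayscale image."""
    best_rc, best_val = (0, 0), gray[0][0]
    for r, row in enumerate(gray):
        for c, val in enumerate(row):
            if val > best_val:
                best_val, best_rc = val, (r, c)
    return best_rc

def laser_range(row, center_row, h=10.0, rpc=0.01, ro=0.0):
    """Range D = h / tan(theta), with theta = pfc * rpc + ro.

    pfc: pixels the dot sits from the image center (y axis)
    rpc: radians per pixel pitch; ro: radian offset (alignment error)
    h, rpc, ro here are placeholder calibration values.
    """
    pfc = row - center_row
    theta = pfc * rpc + ro
    return h / math.tan(theta)

# Tiny synthetic frame: the "laser dot" is the 255 in the last row.
frame = [[10, 12, 11, 10],
         [11, 10, 12, 11],
         [12, 11, 10, 12],
         [11, 255, 12, 10]]
r, c = brightest_pixel(frame)
d = laser_range(r, center_row=0)  # dot sits 3 rows below "center"
```

In a real setup you would calibrate `rpc` and `ro` by measuring the dot's pixel position at several known distances and fitting the equation to those samples.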
Can anybody help me with this project?
regionprops() can give you the equivalent radius of a blob. But the actual radius of the laser dot is constant. So what you need to consider is this: if you have two line segments of the same length, and one is N times farther away than the other, how will the angular sizes of the two segments differ?
Hint: the spreading angle from the center of the beam (center of the optical axis) does not change with distance.
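To make the hint concrete, a quick numerical check (plain Python, with arbitrary example lengths and distances) shows that a segment N times farther away subtends roughly 1/N the angle:

```python
import math

def angular_size(length, distance):
    """Angle (radians) subtended by a segment of given length seen
    face-on at the given distance from the camera."""
    return 2.0 * math.atan(length / (2.0 * distance))

near = angular_size(1.0, 10.0)  # segment at distance 10
far = angular_size(1.0, 30.0)   # same segment, 3x farther away
# near / far is approximately 3 in the small-angle regime
```

This is why the apparent (pixel) size of the constant-radius laser dot shrinks with range, even though the beam's spreading angle stays fixed.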