Asked by Alex on 19 Mar 2013

I know that if one subtracts one image from another one then noise is additive and signal of one image is subtracted from that of another one. But how do I go about tracking SNR change if I divide one image (light frame) by another one (flat frame) for normalization? I would be particularly grateful for the formulas: what happens to signal and noise?

Thanks so much in advance!

Answer by Image Analyst on 20 Mar 2013

Edited by Image Analyst on 20 Mar 2013

What you say about noise is sort of true, but not exactly. The pdf of the noise for two added signals with independent noise is the convolution of the two pdfs. For Gaussian distributions this means the variance of the sum is the sum of the two variances. For other distributions you'd need to convolve their pdfs to see the resulting distribution, but it will be wider. (By the way, this is the basis of the Central Limit Theorem, which states that if you add enough noise sources, the sum approaches a Gaussian no matter what the individual pdfs look like.) If the signals are not independent (for example, if you got your background signal by blurring your main image and then subtract it from the main image), then the noise is actually less. You can prove this mathematically, and it is the basis of unsharp masking.
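A quick numeric check of the variance-addition claim (a sketch in Python/NumPy, standing in for the MATLAB equivalent; the specific variances are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent Gaussian noise sources with different variances.
a = rng.normal(0.0, 2.0, n)   # variance 4
b = rng.normal(0.0, 3.0, n)   # variance 9

s = a + b                     # noise of the summed signal

# Variance of the sum equals the sum of the variances (~13).
print(np.var(a), np.var(b), np.var(s))
```

The same quadrature rule holds for a difference, a - b, since negating an independent noise source does not change its variance.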

For image division, the noise amplification depends on the value you divide by. In the middle of the image you won't introduce much noise, but at the edges, where there is more shading and vignetting, you get more noise amplification because the image has to be multiplied by a larger factor there to flatten it. So the overall noise amplification depends very much on the shape of the "flat" background image. I can use a crummy $160 lens and get a lot of shading, a so-so $700 Canon lens and get a fair amount of shading, or an excellent $1700 Schneider lens and get almost no shading (true examples from my experience). So your best bet is simply to measure and see, since the math needed to predict it depends on the amount of shading, and that varies pixel by pixel.
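The position-dependent amplification can be demonstrated with a toy flat-field correction (a Python sketch; the vignetting profile, signal level, and noise level are all made-up numbers for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 1-D "flat" profile: 1.0 in the centre, falling to
# 0.4 at the edges to mimic heavy shading/vignetting.
x = np.linspace(-1.0, 1.0, 1001)
flat = 1.0 - 0.6 * x**2

signal = 100.0                 # true, already-uniform scene
noise_sigma = 1.0              # additive read noise per pixel
frames = signal * flat + rng.normal(0.0, noise_sigma, (2000, x.size))

corrected = frames / flat      # flat-field normalisation
noise_out = corrected.std(axis=0)   # per-pixel noise after division

# The noise is amplified by 1/flat: ~1.0 at the centre, but
# ~1/0.4 = 2.5x at the edges where the flat is darkest.
print(noise_out[x.size // 2], noise_out[0])
```

So with a well-corrected lens (flat close to 1 everywhere) the division costs almost nothing, while strong vignetting multiplies the edge noise by the reciprocal of the flat value there.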

Alex on 20 Mar 2013

Thank you so much for your reply. I am sorry I was not more specific. The goal is to study S/N propagation during image processing in an X-ray microscope, where the absorption spectrum is given by -log((I(light frame) - I(dark frame)) / (I(flat frame) - I(dark frame))) as a function of energy across the absorption edges of the elements.

I have S/N values for I(light frame), I(dark frame), and I(flat frame), and I use the formulas from http://www.samfahmie.com/book/export/html/5 to calculate the new S/N values after the subtractions: S1/N1 after I(light frame) - I(dark frame), and S2/N2 after I(flat frame) - I(dark frame). But I cannot figure out what happens to the final S/N after the division, (I(light frame) - I(dark frame)) / (I(flat frame) - I(dark frame)).

Someone told me that to get the SNR after the normalization I have to multiply the noises, N1*N2, and the signals, S1*S2, so that the final S/N = (S1*S2)/(N1*N2). Do you think that is correct?
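For reference, standard first-order (Gaussian) error propagation for a ratio R = A/B says the *relative* noises add in quadrature, not that signals and noises multiply. A minimal sketch, assuming the numerator and denominator are independent (not strictly true here, since both subtractions share the same dark frame, so treat this as an approximation), with illustrative S/N values 50 and 80:

```python
import math

def snr_of_ratio(snr_a, snr_b):
    """First-order error propagation for R = A/B with independent
    A and B: (sigma_R/R)^2 = (sigma_A/A)^2 + (sigma_B/B)^2,
    i.e. 1/SNR_R^2 = 1/SNR_A^2 + 1/SNR_B^2."""
    return 1.0 / math.hypot(1.0 / snr_a, 1.0 / snr_b)

# Example: S1/N1 = 50 after light - dark, S2/N2 = 80 after flat - dark.
# The result is always below the smaller of the two input SNRs.
print(snr_of_ratio(50.0, 80.0))
```

The subsequent -log() step is a further transformation with its own propagation rule, so this only covers the division itself.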
