standard deviation and mean of image

I have an image A and its corresponding gray-level co-occurrence matrix, named GLCM. Which of the two matrices will give me the correct standard deviation and mean of the image?
I know what a GLCM is and how it works, but computing the standard deviation from the GLCM gives a very high value, while computing it from the image matrix itself gives a relatively smaller value. Which of the two is the actual standard deviation of the image? The same question applies to the mean of the image.

Accepted Answer

Jeff E on 1 May 2015
I'm assuming you are interested in the mean and S.D. of the intensity in the image, in which case you would just take the mean (mean2) and S.D. (std2) of the image itself.
The GLCM is a measure of texture. I don't see how the GLCM would give you any meaningful information about the distribution of the pixel intensities.
If you're interested in some other measure of the image that is not intensity, you'll need to elaborate.
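As a minimal sketch (assuming a grayscale image; 'cameraman.tif' ships with the Image Processing Toolbox and is used here only as a stand-in for your image A):
A = imread('cameraman.tif');   % stand-in grayscale image; substitute your own
meanIntensity = mean2(A);      % equivalent to mean(A(:))
stdIntensity  = std2(A);       % equivalent to std(double(A(:)))
fprintf('Mean = %.2f, SD = %.2f\n', meanIntensity, stdIntensity);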
  2 Comments
farheen asdf on 15 Jun 2015
Yes, I am talking about the image intensity. So I should use the image matrix to find the mean and SD? I also wanted to know about the variance of the image; I think I should use the image matrix for that too. Correct?
Image Analyst on 15 Jun 2015
Yes, and yes.
theVariance = var(double(grayImage(:)));  % cast to double; var() does not accept integer image types
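As a quick sanity check (assuming grayImage is the same grayscale image; any variable name works): with the default N-1 normalization used by both functions, the variance is simply the square of the standard deviation.
img = double(grayImage);       % work in double for the statistics
theSD       = std2(img);       % standard deviation of the intensities
theVariance = var(img(:));     % sample variance (N-1 normalization)
% theVariance equals theSD^2 up to floating-point rounding.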
