Hi, I want to normalize the orientation of hand images.
I thought I could find the centroid of the mass and use image rotation, but for that I need to threshold the image. Due to the lighting, the image looks like this:
How do I separate the foreground from the background? I tried active contouring, but since the hand is not in the same position in all the images, I was unsuccessful. Can you suggest any methods?
Are all the hands going to have all 5 fingers extended like that? What do you want to do if some fingers in one photo have different spacing than in another photo? Why does the orientation even need to be "normalized" in the first place?
Probably the easiest way to get the hand is to subtract the "blank field" image (a shot of the scene with no hand) from the hand image and look for differences. You might still pick up shadows, so convert to HSV color space with rgb2hsv() and mask out pixels with high V, low S, and hues outside the expected skin range. Then fill the blob to get rid of any holes in the hand, and finally extract only the biggest blob so you keep the hand and discard noise and leftover shadow fragments.
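The steps above can be sketched in plain Python (pure stdlib, using `colorsys` in place of MATLAB's `rgb2hsv`). Everything here is illustrative: the thresholds, the tiny 8×8 synthetic image, and the function names are assumptions, and the shadow test assumes shadow pixels are dark and desaturated, so adjust the V/S/hue conditions to match your actual lighting:

```python
import colorsys

# Rough sketch of the suggested pipeline. Images are nested lists of
# (r, g, b) tuples with channels in [0, 1]. All thresholds below are
# illustrative assumptions, not tuned values.

DIFF_THRESH = 0.15          # min summed RGB difference vs. the blank-field shot
MIN_V, MIN_S = 0.30, 0.15   # assumed limits rejecting dark/gray shadow pixels

def neighbors(y, x, h, w):
    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ny, nx = y + dy, x + dx
        if 0 <= ny < h and 0 <= nx < w:
            yield ny, nx

def fill_holes(mask):
    """Keep every pixel that cannot be reached from the image border."""
    h, w = len(mask), len(mask[0])
    outside = [[False] * w for _ in range(h)]
    stack = [(y, x) for y in range(h) for x in range(w)
             if (y in (0, h - 1) or x in (0, w - 1)) and not mask[y][x]]
    for y, x in stack:
        outside[y][x] = True
    while stack:
        y, x = stack.pop()
        for ny, nx in neighbors(y, x, h, w):
            if not mask[ny][nx] and not outside[ny][nx]:
                outside[ny][nx] = True
                stack.append((ny, nx))
    return [[mask[y][x] or not outside[y][x] for x in range(w)]
            for y in range(h)]

def largest_blob(mask):
    """Return a mask containing only the biggest 4-connected component."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                blob, stack = [], [(y, x)]
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    blob.append((cy, cx))
                    for ny, nx in neighbors(cy, cx, h, w):
                        if mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(blob) > len(best):
                    best = blob
    out = [[False] * w for _ in range(h)]
    for y, x in best:
        out[y][x] = True
    return out

def segment_hand(img, blank):
    h, w = len(img), len(img[0])
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            r, g, b = img[y][x]
            br, bg, bb = blank[y][x]
            if abs(r - br) + abs(g - bg) + abs(b - bb) < DIFF_THRESH:
                continue      # pixel matches the blank field: background
            _, s, v = colorsys.rgb_to_hsv(r, g, b)
            if v < MIN_V or s < MIN_S:
                continue      # dark or desaturated: likely a shadow
            mask[y][x] = True
    return largest_blob(fill_holes(mask))

# Tiny synthetic example: gray blank field, a 3x3 "hand" with a one-pixel
# hole, one shadow pixel, and one isolated noise pixel.
skin, gray, shadow = (0.9, 0.6, 0.5), (0.9, 0.9, 0.9), (0.5, 0.5, 0.5)
blank = [[gray] * 8 for _ in range(8)]
img = [row[:] for row in blank]
for y in range(2, 5):
    for x in range(2, 5):
        img[y][x] = skin
img[3][3] = gray      # hole inside the hand, closed by fill_holes
img[6][1] = shadow    # large diff vs. blank, but rejected by the HSV test
img[6][6] = skin      # isolated noise pixel, dropped by largest_blob
hand = segment_hand(img, blank)
```

On the toy image this leaves exactly the filled 3×3 hand block: the hole is closed, the shadow pixel fails the HSV test, and the noise pixel is discarded as a smaller blob.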
Or check out the papers here for lots of methods: http://iris.usc.edu/Vision-Notes/bibliography/contentspeople.html#Face%20Recognition,%20Detection,%20Tracking,%20Gesture%20Recognition,%20Fingerprints,%20Biometrics