Learning a linear model for image classification

Greetings!
I'm working on a homework project. It asks us to generate a set of binary images, each showing a single digit on a 10x10 pixel grid (e.g. "5"), like the digits on a digital watch. The images also have to be noisy, with some random black/white dots added to them. This part is done.
Next it asks us to generate a vector "C" with random values in it at first, and then use C as a "linear model" to classify the noisy digit images. C needs to be updated each time it classifies an image, until its elements stop changing, at which point the "learning" is complete. I'm stuck here and have no idea how to update the vector as a "linear model". I suppose (and I may be wrong) that the final C would have ones wherever a perfect "8" has lit pixels, and zeros everywhere else.
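For what it's worth, the closest thing I could piece together is the perceptron-style update sketched below, but I am not at all sure this is what the assignment means, and every variable name here is my own invention (the X and t at the top are just stand-in data so the snippet runs):

rng(0);                            % reproducible placeholder data
X = double(rand(100, 50) > 0.5);   % stand-in: 50 flattened 10x10 binary images as columns
t = double(rand(1, 50) > 0.5);     % stand-in labels: 1 = "shows an 8", 0 = anything else
C = rand(100, 1);                  % the random starting vector from the assignment
for pass = 1 : 1000                % safety cap, since the noise may prevent convergence
    changed = false;
    for k = 1 : size(X, 2)
        y = double(C' * X(:, k) > 0);      % "linear model": threshold a dot product
        if y ~= t(k)
            C = C + (t(k) - y) * X(:, k);  % nudge C toward the correct label
            changed = true;
        end
    end
    if ~changed                    % C stopped changing, so "learning" is done
        break;
    end
end

Does something along these lines make sense, or am I completely off track?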
I sincerely hope someone can help me with this. Any ideas or discussion would be highly appreciated, and please feel free to reply if anything above needs clarifying.
Thanks, Xiao

Answers (1)

Image Analyst on 16 Sep 2013
I'm not sure what the "linear" means either. How about you just compare your noisy test image to each of the perfect digits? Use logical operations like AND, OR, etc. to figure out how many pixels lie inside the perfect number (true positives) and how many lie outside (false positives). Basically you're comparing your image against 10 perfect template images, one for each possible digit, and whichever digit matches up best is the answer. I'll leave the details up to you since it's a homework question.
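To make the idea concrete, the scoring might look something like the sketch below. This is untested and all the names are placeholders; the two lines of stand-in data at the top are only there so the snippet runs, and you would use your own templates and noisy image instead:

% Stand-in data so the snippet runs; replace with your real images.
templates = arrayfun(@(k) rand(10) > 0.5, 1 : 10, 'UniformOutput', false);
noisyImage = rand(10) > 0.5;

scores = zeros(1, 10);
for d = 1 : 10
    tp = nnz(noisyImage & templates{d});    % pixels set in both (true positives)
    fp = nnz(noisyImage & ~templates{d});   % pixels set in the image only (false positives)
    scores(d) = tp - fp;                    % one simple way to combine the counts
end
[~, bestIndex] = max(scores);
classifiedDigit = bestIndex - 1;            % assumes cell 1 holds the "0" template

How you weight the true and false positive counts against each other is one of the details left for you.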
