Updated 30 Aug 2018
Run the Example_FindFace.m script for a walk-through of how to use the code. The repository contains a shape model and a gray-level model trained on images from the data set listed below, as well as a single example face. The repository includes code for manually labeling new images and training new shape and gray-level models, meaning it can be used for more than face detection if trained properly. I'd be happy for any feedback you may have. Enjoy!
Version 2.0 supports the MUCT landmark arrangement:
The simple landmark arrangement that I labeled is still supported. The faces I used for manual labeling are available here:
If you have a question or suggestion for the project, please open an Issue on GitHub. I will be much more likely to see your question there.
John W. Miller (2020). Face detection with Active Shape Models (ASMs) (https://www.github.com/johnwmillr/ActiveShapeModels), GitHub. Retrieved .
I have to use your code on BioVid Heat Pain Database images for landmark detection. Please suggest how we can access the face landmark values?
Hi Iqra Rashia,
The best advice I can give you is to follow the instructions laid out in the "Example_buildMuctModel.m" script. Here is a direct link: https://github.com/johnwmillr/ActiveShapeModels/blob/master/Example_buildMuctModel.m
Essentially, if you want to train a model on a new face database, you need to train both a new shape model (with the new landmark arrangement) and a new gray-level model. There is an example for both of those models in the script I linked to above.
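The training workflow described above can be sketched roughly as follows. This is only an outline: the function names `buildShapeModel` and `buildGrayLevelModel` are placeholders for illustration and may not match the repository exactly, so consult Example_buildMuctModel.m for the actual calls.

```matlab
% Rough outline of training an ASM on a new face database.
% Function names below are placeholders; see Example_buildMuctModel.m
% in the repository for the real workflow.
pathToImages = './my_face_database/';  % folder of training face images

% 1. Manually label landmarks on each training image
%    (placeLandmarks.m is included in the repository)
landmarks = placeLandmarks(pathToImages);

% 2. Train a shape model on the new landmark arrangement
%    (placeholder name: typically PCA on the aligned shapes)
shapeModel = buildShapeModel(landmarks);

% 3. Train a gray-level model from image profiles at each landmark
%    (placeholder name)
grayModel = buildGrayLevelModel(pathToImages, shapeModel);
```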
If you have additional questions or issues, use GitHub: https://github.com/johnwmillr/ActiveShapeModels/issues
Thank you for the code.
I have a question. May I know how to train with another set of face samples if I want to use a different face database? Your help is much appreciated. Thank you. You can email me at firstname.lastname@example.org
Hi syahdan edy murad,
Sorry for the late response, I only now saw your question. It's possible you could train a classifier to use the shape of the hair's outline to predict gender, but I wouldn't expect great performance. You could start by manually labeling the faces in the MUCT dataset.
Hi, is there any way I could extract hair features for gender classification?
I want to extract the facial point results to Excel.
What result do you want to extract? Your best bet is to use MATLAB's xlswrite() function.
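For example, assuming the fitted landmarks come back as a vector of alternating x and y coordinates (the variable name `x_final` here is just a placeholder, not necessarily what the repository uses), you could write them to a spreadsheet like this:

```matlab
% Placeholder: a 2N-by-1 vector of alternating x,y landmark coordinates,
% standing in for the output of the ASM fitting step.
x_final = rand(40, 1);

% Reshape into an N-by-2 [x y] matrix, one row per landmark
landmarks = [x_final(1:2:end), x_final(2:2:end)];

% Write to an Excel file
xlswrite('landmarks.xlsx', landmarks);
```

In newer MATLAB releases, `writematrix(landmarks, 'landmarks.xlsx')` is the recommended replacement for `xlswrite`.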
You use the placeLandmarks.m function to manually assign landmarks to a new set of training images.
I'll send you an email. I hope that helps!
Thank you for the code.
How do I extract the results to an Excel file?
Thank you for the code.
I have a question. May I know how to train with another set of face samples if I want to use a different face database? Your help is much appreciated. Thank you. You can email me at C.How_93@hotmail.my
Please send me the sim1.mat file.
Not sure if I understand your comment. Were there missing dependencies that prevented you from running my ASM code from GitHub? If so, let me know what's missing, and I'll add them to the repo!
FEX submission requires an "enthusiasm for" dependencies. Maybe you want to update yours(?)
Added a note about opening Issues on GitHub.
Updated description. MUCT landmarks now supported.
Version 2.0 supports the MUCT landmark arrangement.
Changed title to "Face detection with Active Shape Models (ASMs)."