LDA classifier problem in age classification

I'm doing a project on age classification using Gabor features. First, I calculated Gabor features at 4 scales and 5 orientations for each image (child, adolescent, adult, and senior adult), which gives a 64x80 matrix per image. The paper then says to reduce its dimension using PCA. I don't know PCA, so instead I deleted every 2nd and 3rd row and column of the matrix, reshaped the result column-wise, and got a 594x1 vector. I did this for 114 images (computing the Gabor features 10 times per image on different orientations of the image), so I end up with a 594x1140 input matrix. I made a 4x1140 target matrix whose columns are [1;0;0;0] (child), [0;1;0;0] (adolescent), and so on.
So now I have a training matrix of size 594x1140 and a target of size 4x1140. The next step says to do LDA classification. A sample would be a 594x1 vector holding the Gabor features of a test image, and the classifier should assign it to one of the 4 groups: child, adolescent, adult, or senior adult. Please help me with the LDA classifier! I'm attaching the code where I store the Gabor features in a matrix:
function IMVECTOR = im2vec(W16x16)
load gabor;                                   % loads the cell array G of Gabor filters
W16x16 = adapthisteq(W16x16, 'Numtiles', [8 8]);
Features80x128 = cell(4, 5);
for s = 1:4
    for j = 1:5
        Features80x128{s,j} = mminmax(abs(ifft2(G{s,j}.*fft2(double(W16x16),32,32),16,16)));
    end
end
Features27x43 = cell2mat(Features80x128);
% Crude downsampling in place of PCA: drop every 3rd, then every 2nd, row and column
Features27x43(3:3:end,:) = [];
Features27x43(2:2:end,:) = [];
Features27x43(:,3:3:end) = [];
Features27x43(:,2:2:end) = [];
IMVECTOR = reshape(Features27x43, [594 1]);

Accepted Answer

Greg Heath on 16 May 2013
Learn PCA. QUICKLY. Use the help and doc commands
>> lookfor pca
processpca - Processes rows of matrix with principal component analysis.
prepca - Principal component analysis.
trapca - Principal component transformation.
pcacov - Principal Components Analysis (PCA) using a covariance matrix.
pcares - Residuals from a Principal Components Analysis (PCA).
princomp - Principal Components Analysis (PCA) from raw data.
Concentrate on the above functions. I don't think you need the ones below.
rotatefactors - Rotation of FA or PCA loadings.
wmspca - Multiscale Principal Component Analysis.
wmspcatool - Multisignal Principal Component Analysis GUI.
wmspcatoolmoab - MATLAB file for wmspcatoolmoab.fig
wmspcatoolmopc - MATLAB file for wmspcatoolmopc.fig
wpca - Principal Component Analysis.
dguiwmspca - Demonstrates Multivariate Wavelet PCA tool in the Wavelet Toolbox.
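To make this concrete, here is a minimal sketch of the PCA route in place of the row/column-deletion hack. The matrix names X (features-by-samples, e.g. the full 1280x1140 built from the unreduced 64x80 feature matrices) and xt (one test vector) are assumptions, not from the question; princomp expects observations in rows, hence the transposes.

X = rand(1280, 1140);                      % stand-in for your feature matrix
xt = rand(1280, 1);                        % stand-in for one test vector
[pcs, score, latent] = princomp(X');       % rows of X' are observations
cumVar = cumsum(latent) / sum(latent);     % fraction of variance explained
k = find(cumVar >= 0.95, 1);               % keep enough PCs for ~95% variance
Xred = score(:, 1:k)';                     % k x 1140 reduced training matrix
% Project a new test vector with the SAME basis and mean as the training data:
mu = mean(X, 2);
xtRed = pcs(:, 1:k)' * (xt - mu);          % k x 1 reduced test vector

The key point is that the test sample must be centered with the training mean and projected with the training eigenvectors, never fit with its own PCA.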
  3 Comments
Alan Weiss on 17 May 2013
If you have a relatively recent Statistics Toolbox license (R2011b or later), you can try using ClassificationDiscriminant.fit. There is documentation for the discriminant analysis classifier in the Statistics Toolbox; it does both linear and quadratic discriminant analysis.
For earlier toolbox versions, use the classify function.
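As a minimal sketch of both routes, assuming your 594x1140 training matrix is called trainX, your test vectors are the columns of testX, and labels is a 1140x1 cell array of class names (all three names are assumptions; both functions want observations in rows, hence the transposes):

trainX = rand(594, 1140);                             % stand-in for your data
testX = rand(594, 3);                                 % stand-in test vectors
labels = [repmat({'child'},285,1); repmat({'adolescent'},285,1); ...
          repmat({'adult'},285,1); repmat({'senior'},285,1)];
% R2011b or later:
L = ClassificationDiscriminant.fit(trainX', labels);  % observations in rows
predictedNew = predict(L, testX');
% Earlier toolbox versions:
predictedOld = classify(testX', trainX', labels, 'linear');

One caveat: with 594 features and only 1140 samples the pooled covariance estimate may be singular; if you hit that error, try 'DiscrimType','pseudoLinear' in ClassificationDiscriminant.fit, or 'diaglinear' as the type argument to classify.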
Alan Weiss
MATLAB mathematical toolbox documentation
Ilya on 17 May 2013
Alan gave you good advice. Keep in mind though that ClassificationDiscriminant finds K*(K-1)/2 hyperplanes for K classes (each hyperplane separates a pair of classes). The excerpt from the paper implies that you may be looking for a different version of LDA, in which you find K-1 hyperplanes with most informative projections. This can be easily done using ClassificationDiscriminant as well:
load fisheriris
L = ClassificationDiscriminant.fit(meas,species);
[LTrans,Lambda] = eig(L.BetweenSigma,L.Sigma,'chol');
[Lambda,sorted] = sort(diag(Lambda),'descend') % sort by eigenvalues
LTrans = LTrans(:,sorted);
LTrans(:,[3 4]) = [] % get rid of zero eigenvalues
Xtransformed = L.XCentered*LTrans;
Xtransformed represents data mapped onto the low-dimensional space.
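Continuing Ilya's Fisher iris example, here is a hedged sketch of one way to classify a new observation in that projected space, by nearest class mean; xnew is an assumed 1x4 row vector (same layout as a row of meas), and the nearest-mean rule is one simple choice, not the only one.

load fisheriris
L = ClassificationDiscriminant.fit(meas, species);
[LTrans, Lambda] = eig(L.BetweenSigma, L.Sigma, 'chol');
[Lambda, sorted] = sort(diag(Lambda), 'descend');
LTrans = LTrans(:, sorted);
LTrans(:, [3 4]) = [];                        % drop zero eigenvalues
Xtransformed = L.XCentered * LTrans;
xnew = meas(1,:);                             % stand-in for a new observation
xnewT = (xnew - mean(meas,1)) * LTrans;       % center and project it
classes = unique(species);
d = zeros(numel(classes), 1);
for c = 1:numel(classes)
    muC = mean(Xtransformed(strcmp(species, classes{c}), :), 1);
    d(c) = norm(xnewT - muC);                 % distance to each class mean
end
[~, idx] = min(d);
predictedClass = classes{idx};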


