knn classifier
I have a segmented image of a brain. I have extracted the features for that image and stored them in stats. Now I want to classify that image using a knn classifier, whether it is at the starting stage, at the middle stage, or normal.
In knn,
c = knnclassify(sample, training, group);
what should I give in place of sample, training, and group?
Please help.
0 Comments
Answers (3)
Tom Lane
on 7 Apr 2012
You would give "sample" as the data you want to classify, "training" as the training data having known groups, and "group" as the known groups for the training data.
If you don't have training data with known groups, you could try cluster analysis instead of knn classification.
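For the cluster-analysis route, a minimal sketch (assuming the features have been collected into an n-by-p numeric matrix, one row per image, and that three groups are wanted; both are assumptions, not part of the original question):
% Sketch: group unlabeled feature vectors into 3 clusters with k-means
rng('default')                       % make the cluster assignments reproducible
idx = kmeans(featureMatrix, 3);      % idx(i) is the cluster (1, 2 or 3) of row i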
4 Comments
Tom Lane
on 10 Apr 2012
"help knnclassify" gives an example. If that's not helpful, you'll need to ask a more specific question.
Tom Lane
on 11 Apr 2012
Sorry, I just don't understand what you have and what you want. If you had an n-by-20 matrix X containing features from pre-classified tumors, a vector G of size n-by-1 giving the classification of those tumors, and another m-by-20 matrix Y containing the same features measured on new tumors, you would use knnclassify(Y,X,G). The result would be a vector of size m-by-1 containing the classifications of the tumors in Y.
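For example, with made-up numbers just to illustrate the sizes (the data here is random and purely illustrative):
% Sketch: 10 labeled tumors, 4 new tumors, 20 features each
n = 10;  m = 4;  p = 20;
X = rand(n, p);                % n-by-p training features
G = randi([0 2], n, 1);        % n-by-1 known stages (0, 1 or 2)
Y = rand(m, p);                % m-by-p features of the new tumors
class = knnclassify(Y, X, G)   % m-by-1 predicted stages, one per row of Y
Note that knnclassify came with the Bioinformatics Toolbox and was later removed; in newer releases the equivalent is fitcknn followed by predict.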
5 Comments
Walter Roberson
on 17 Nov 2022
mu = [1.5,1.5]; sigma = [1,1.5;1.5,3];
That would be for a multivariate normal random distribution with two variables, each of which had mean 1.5, and sigma describes the covariances.
mu = [4,1];sigma = [1,1.5;1.5,3];
Similar to the above, but the mean of the first variable is 4 and the mean of the second variable is 1.
% mean computation
x = [C1X,C2X];
That just concatenates the results of the random generation together and does not calculate any means.
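Put together, a sketch of how those pieces typically fit (assuming C1X and C2X are drawn with mvnrnd, which is what the snippet suggests):
% Sketch: draw 100 points from each of the two bivariate normal classes
sigma = [1, 1.5; 1.5, 3];                % covariance shared by both classes
C1X = mvnrnd([1.5, 1.5], sigma, 100);    % 100-by-2 samples for class 1
C2X = mvnrnd([4, 1],     sigma, 100);    % 100-by-2 samples for class 2
x = [C1X, C2X];                          % only concatenates: x is 100-by-4
mean(C1X)                                % the per-class sample means, if that
mean(C2X)                                % was what "mean computation" intended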
merlin toche
on 22 Nov 2022
Thank you sir! I would like to know how to build a knn dataset for fault detection.
For example, I have x = [4 6 7 5 8] and y = [3 7 8 5 8], respectively the current and voltage characteristics used as input data, the outcome (pass or fail) in one array, and z = (x,y) = (2,4) as the test point.
c = {'open','short','short','open','open'};
How do I build this dataset and train it? Thanks for your help.
Tom Lane
on 12 Apr 2012
In your X,G values you specified that certain rows of X were in the 0, 1, or 2 stage. The "class" output is that same classification for the rows of Y. Try this:
gscatter(X(:,1),X(:,2),G,'rgb','o')
hold on; gscatter(Y(:,1),Y(:,2),class,'rgb','x'); hold off
axis equal
You can see that the Y values ('x') are the same color as the closest X values, indicating that they were classified to the same groups.
I'm not sure we are communicating properly, since either your "class" values are the ones you asked for, or I'm just not understanding. If this is not enough information for you, you'll need to describe specifically what data you have, what you want to do, what you tried, how you interpret it, what is missing, etc.
18 Comments
Image Analyst
on 24 Nov 2022
I'm willing to help but we need to explain and understand things first. Let's say, as you did, that your data is
xTraining = [4 6 7 5 8];
yTraining = [3 7 8 5 8];
% Now you say your classes are c = {'open','short','short','open','open'}
% so let's make those class numbers.
trainingClasses = [1,2,2,1,1];
% Now let's plot them. First plot class 1 in blue.
plot(xTraining(trainingClasses == 1), yTraining(trainingClasses == 1), 'b.', 'MarkerSize', 30);
hold on
% Next plot class 2 in cyan.
plot(xTraining(trainingClasses == 2), yTraining(trainingClasses == 2), 'c.', 'MarkerSize', 30);
grid on;
xlabel('x');
ylabel('y');
% Now let's plot your unknown data point that we want to classify
zx = 2;
zy = 4;
plot(zx, zy, 'rs', 'LineWidth', 3, 'MarkerSize', 15);
legend('Training Class 1', 'Training Class 2', 'Unknown', 'Location', 'Northwest')
xy = [xTraining(:), yTraining(:)];
% Make model.
MdlKDT = KDTreeSearcher(xy)
% Use model to predict unknown data.
closestIndex = knnsearch(MdlKDT, [zx, zy])
estimatedClassNumber = trainingClasses(closestIndex)
OK, look at the plot, at the blue and cyan dots. Those are your training points, and if you're going to use KNN you're going to have to have at least two classes, in other words two clusters of points. You don't really have any: there is nothing that naturally groups those points into two classes/clusters. I mean you did group them, but your classification doesn't seem to make sense, since your dark blue dots are not really near each other. That's OK though, and it is one of the reasons to use KNN: it doesn't require the points in a class to be grouped near each other, and it will work even for seemingly crazy, nonsensical classifications.
Now look at your unknown point (the red square). It doesn't really look like it belongs to either cluster, since it's not near the training points. That's why I say you should have hundreds of training points, and you should have some a priori knowledge of what cluster/class they truly are. But knn will say that it's closer to class 1 (dark blue dots) than class 2 (cyan dots) and assign it to class 1. I hope that explains it better.
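For comparison, the same nearest-neighbour prediction can be written with fitcknn from the Statistics and Machine Learning Toolbox (a sketch reusing the variables above; NumNeighbors is set to 1 only because there are so few training points):
% Sketch: fit a 1-nearest-neighbour model on the same toy data and predict
xy = [xTraining(:), yTraining(:)];              % 5-by-2 training features
Mdl = fitcknn(xy, trainingClasses(:), 'NumNeighbors', 1);
estimatedClassNumber = predict(Mdl, [zx, zy])   % class of the unknown point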
merlin toche
on 28 Nov 2022
Hi sir!
I am coming back to you about one detail. Please show me how to calculate the Euclidean distance for several test data points. How should I proceed?
For example, for a set of data split into training and test data:
x1_train=[4 7 5 6 2;3 1 2.5 8.1 9;7.2 4.5 4 7 3]
y1_train=[1 4 5 3.2 1.5;6 1 5.3 4.7 8;0.9 11 15 3.6 9.8]
x1_test=[10 11.5 10.2 12 13.1; 1.3 2.4 5.9 3.4 6.4; 7.2 16 19 17 3.8 ]
y1_test=[1.6 8.2 4 6.2 10;6.1 2 6 5.3 9.4;1.9 0.5 13.5 8.4 14]
I want to calculate the Euclidean distance.
Another question: I wrote code to partition my data using cvpartition, but I don't know how to train it in the Classification Learner app in MATLAB. Sorry for my questions, I'm still learning.
Thank you sir for your help.
Cordially!