MIDA learns a domain-invariant subspace. It can be applied to all kinds of domain adaptation problems: discrete or continuous distributional change; supervised, semi-supervised, or unsupervised settings; multiple domains; classification or regression. Each domain may be labeled, partially labeled, or unlabeled. MIDA is suitable for transfer learning, domain adaptation, and concept drift adaptation (e.g. sensor drift correction). Two test cases are provided in testMida.m.
Reference: Ke Yan, Lu Kou, and David Zhang, "Domain Adaptation via Maximum Independence of Domain Features," http://arxiv.org/abs/1603.04535
Copyright 2016 YAN Ke, Tsinghua Univ. http://yanke23.com , email@example.com
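The "maximum independence" in the paper's title refers to minimizing the statistical dependence between the projected features and the domain features, measured with the Hilbert-Schmidt Independence Criterion (HSIC). As an illustrative sketch only (in Python rather than the MATLAB of this submission, and with linear kernels for simplicity), a biased empirical HSIC estimate looks like this:

```python
import numpy as np

def hsic(X, D):
    """Biased empirical HSIC estimate with linear kernels.

    X : (n, p) array of (projected) feature vectors
    D : (n, q) array of domain features (e.g. one-hot domain labels)
    A small value means the features carry little information
    about which domain each sample came from.
    """
    n = X.shape[0]
    K = X @ X.T                            # kernel matrix on features
    L = D @ D.T                            # kernel matrix on domain features
    H = np.eye(n) - np.ones((n, n)) / n    # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Sanity check: features that mirror the domain labels show strong
# dependence; random features show almost none.
rng = np.random.default_rng(0)
D = np.repeat(np.eye(2), 50, axis=0)              # two domains, one-hot
X_dep = D + 0.01 * rng.standard_normal(D.shape)   # domain-dependent features
X_ind = rng.standard_normal(D.shape)              # domain-independent features
print(hsic(X_dep, D) > hsic(X_ind, D))  # → True
```

MIDA searches for a projection that makes this quantity small while preserving variance (and, in the semi-supervised variant, label dependence); the sketch above only shows the criterion being minimized, not the full algorithm.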
Hi Mesho, since your target is ordered, you may directly use one scalar as your target (e.g., low = -1, medium = 0, high = 1).
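For the ordered three-class case in the question, the suggested encoding is just a label-to-scalar mapping; a minimal Python sketch (the class names are taken from the question above):

```python
# Encode the ordered classes as a single scalar target,
# as suggested in the reply: low = -1, medium = 0, high = 1.
level = {"low": -1, "medium": 0, "high": 1}

labels = ["low", "high", "medium", "low"]   # hypothetical user labels
target = [level[s] for s in labels]
print(target)  # → [-1, 1, 0, -1]
```

The scalar encoding preserves the ordering of the classes, so the semi-supervised objective can treat the problem like a regression target.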
Hi Ke Yan.
Thanks a lot for your excellent work! I am planning to use your method.
I have a multi-class classification problem (low, medium, and high) to classify the performance of users.
How can I set the target to enhance the discriminative power of the learned features?
(You said: classification problems should use class indices as labels, i.e., target(i) = j.)
It was not clear to me; any advice and suggestions will be greatly appreciated.
Hello Ke Yan.
Excellent work on this submission.
Could you check whether you forgot to include the sigmoid function's definition in the files class_lr_te and class_lr_tr? I had to add the function manually because testMida would not run.
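For anyone hitting the same error: given that class_lr_tr/class_lr_te implement logistic regression, the missing function is presumably the standard logistic sigmoid, 1 / (1 + exp(-x)). A numerically stable version is sketched below in Python (the MATLAB one-liner `sigmoid = @(x) 1./(1+exp(-x));` is the direct analogue, minus the overflow guard):

```python
import numpy as np

def sigmoid(x):
    """Standard logistic function, 1 / (1 + exp(-x)).

    Computed piecewise so exp() is only ever called on
    non-positive arguments, avoiding overflow warnings
    for large-magnitude inputs.
    """
    x = np.asarray(x, dtype=float)
    out = np.empty_like(x)
    pos = x >= 0
    out[pos] = 1.0 / (1.0 + np.exp(-x[pos]))
    ex = np.exp(x[~pos])
    out[~pos] = ex / (1.0 + ex)
    return out

print(float(sigmoid(0.0)))  # → 0.5
```

This is only a guess at the missing helper based on the file names; the author's own definition may differ.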
Suggestion: add a figure.