A framework for semi-supervised metric transfer learning on manifolds

Authors: Rakesh Kumar Sanodiya and Jimson Mathew

Abstract: A common assumption of statistical learning theory is that the training and testing data are drawn from the same distribution. However, in many real-world applications, this assumption does not hold. Hence, a realistic strategy, Cross-Domain Adaptation (DA) or Transfer Learning (TL), can be used to employ previously labelled source-domain data to boost the learning task in a new target domain. Previous Cross-Domain Adaptation methods have focused on re-weighting the instances or aligning the cross-domain distributions. However, these methods face two significant challenges: (1) they do not properly exploit the unlabelled data of the target task, even though an abundant amount of unlabelled data is available in the real world; and (2) the standard Euclidean distance function fails to capture the appropriate similarity or dissimilarity between samples. In this work, we propose a new Semi-Supervised Metric Transfer Learning framework, called SSMT, that reduces the distribution difference between domains both statistically and geometrically by learning instance weights, while a regularized distance metric is learned to simultaneously minimize the within-class covariance and maximize the between-class covariance for the target domain.
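The metric-learning idea named in the abstract — shrink within-class covariance while stretching between-class covariance — can be illustrated with a minimal Fisher-style sketch. This is not the paper's actual SSMT algorithm (which also learns instance weights and operates across domains); it only shows, under simple assumptions, how a Mahalanobis metric derived from regularized scatter matrices separates classes better than plain Euclidean distance. All function names here are illustrative.

```python
import numpy as np

def scatter_matrices(X, y):
    """Within-class (Sw) and between-class (Sb) scatter matrices."""
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)                  # spread inside class c
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)                # spread between class means
    return Sw, Sb

def learn_metric(X, y, reg=1e-3):
    """Fisher-style metric matrix M: directions that separate classes get
    large weight, directions dominated by within-class noise get small weight.
    The regularizer `reg` keeps the within-class matrix invertible."""
    Sw, Sb = scatter_matrices(X, y)
    d = X.shape[1]
    M = np.linalg.solve(Sw + reg * np.eye(d), Sb)
    # symmetrize and clip negative eigenvalues so M defines a valid
    # (positive semi-definite) Mahalanobis metric
    M = (M + M.T) / 2
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.clip(w, 0.0, None)) @ V.T

def mahalanobis(x, z, M):
    """Distance under the learned metric: sqrt((x - z)^T M (x - z))."""
    diff = x - z
    return float(np.sqrt(max(diff @ M @ diff, 0.0)))
```

Usage: on two well-separated classes, the learned metric makes same-class pairs much closer than cross-class pairs, which is exactly the property a nearest-neighbour classifier in the target domain benefits from.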

Publication Date: July 2019

Published in: Knowledge-Based Systems