18th AIAI 2022, 17-20 June 2022, Greece

When Domain Adaptation Meets Semi-Supervised Learning Through Optimal Transport

Mourad El Hamri, Younès Bennani, Issam Falih

Abstract:

  This paper deals with the problem of unsupervised domain adaptation, which aims to learn a classifier with a low target risk when labeled samples are available only in the source domain. The proposed approach, called DA-SSL (Domain Adaptation meets Semi-Supervised Learning), seeks a joint subspace of the source and target domains using Linear Discriminant Analysis, such that the projections of the data into this latent subspace are both domain-invariant and discriminative. This aim, however, is difficult to accomplish because of the absence of labeled data in the target domain. To overcome this challenge, we use an incremental semi-supervised approach based on optimal transport theory that performs selective pseudo-labeling of unlabeled target instances. After the subspace alignment, the selected pseudo-labeled target data are combined with the source data to incrementally learn a robust classifier in a self-training fashion. Experiments on two benchmark domain adaptation datasets show that the proposed approach is competitive with contemporary state-of-the-art methods. We make our code publicly available.
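
  The abstract outlines the full pipeline: an LDA joint subspace, optimal-transport-based selective pseudo-labeling, and self-training. The snippet below is a minimal illustrative sketch of that pipeline, not the authors' released implementation: the function name da_ssl_sketch, the use of scikit-learn's LinearDiscriminantAnalysis and the POT library's ot.sinkhorn, and all hyper-parameters (number of rounds, selection fraction, entropic regularization) are assumptions made for illustration.

```python
# Illustrative sketch only (assumed design): LDA subspace + entropic OT
# pseudo-labeling + incremental self-training. Requires scikit-learn and
# POT (pip install pot). Hyper-parameters are arbitrary placeholders.
import numpy as np
import ot  # Python Optimal Transport
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def da_ssl_sketch(Xs, ys, Xt, n_rounds=5, select_frac=0.2, reg=0.1):
    """Incrementally pseudo-label target samples via optimal transport."""
    classes = np.unique(ys)
    Xl, yl = Xs.copy(), ys.copy()          # labeled pool: source + selected target
    pseudo = -np.ones(len(Xt), dtype=int)  # -1 = still unlabeled

    for _ in range(n_rounds):
        # 1) Discriminative joint subspace learned from the current labeled pool.
        lda = LinearDiscriminantAnalysis(
            n_components=min(len(classes) - 1, Xs.shape[1]))
        lda.fit(Xl, yl)
        Zl, Zt = lda.transform(Xl), lda.transform(Xt)

        # 2) Entropic OT between labeled and target samples in the subspace.
        a = np.full(len(Zl), 1.0 / len(Zl))
        b = np.full(len(Zt), 1.0 / len(Zt))
        M = ot.dist(Zl, Zt)                 # squared Euclidean cost matrix
        G = ot.sinkhorn(a, b, M / M.max(), reg)

        # 3) Label proposal: the class that sends the most mass to each target point.
        mass = np.stack([G[yl == c].sum(axis=0) for c in classes])
        proba = mass / mass.sum(axis=0, keepdims=True)
        labels = classes[proba.argmax(axis=0)]
        conf = proba.max(axis=0)

        # 4) Selective pseudo-labeling: keep only the most confident unlabeled targets.
        candidates = np.where(pseudo == -1)[0]
        if len(candidates) == 0:
            break
        k = max(1, int(select_frac * len(Xt)))
        chosen = candidates[np.argsort(-conf[candidates])[:k]]
        pseudo[chosen] = labels[chosen]

        # 5) Self-training: grow the labeled pool with the selected target samples.
        Xl = np.vstack([Xs, Xt[pseudo != -1]])
        yl = np.concatenate([ys, pseudo[pseudo != -1]])

    return pseudo, lda
```

  In each round, the transport plan concentrates target mass on the source classes that are closest in the shared subspace, and only the most confident assignments are retained, which limits the propagation of labeling errors through the self-training loop.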
