Semi-supervised Domain Adaptation via Prototype-based Multi-level Learning (2305.02693v3)
Abstract: In semi-supervised domain adaptation (SSDA), a few labeled target samples per class help the model transfer knowledge representations from the fully labeled source domain to the target domain. Many existing methods ignore the benefits of exploiting the labeled target samples at multiple levels. To make better use of this additional data, we propose a novel Prototype-based Multi-level Learning (ProML) framework that better taps the potential of labeled target samples. To achieve intra-domain adaptation, we first introduce a pseudo-label aggregation based on intra-domain optimal transport that helps the model align the feature distribution of unlabeled target samples with the target prototypes. At the inter-domain level, we propose a cross-domain alignment loss that lets the model use the target prototypes for cross-domain knowledge transfer. We further propose a dual consistency based on prototype similarity and a linear classifier to promote discriminative learning of compact target feature representations at the batch level. Extensive experiments on three datasets, DomainNet, VisDA2017, and Office-Home, demonstrate that our proposed method achieves state-of-the-art performance in SSDA.
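The intra-domain pseudo-label aggregation described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the feature dimension, the entropic regularization strength `eps`, the uniform marginals, and the synthetic features are all illustrative assumptions. Class prototypes are the means of labeled target features; soft pseudo-labels for unlabeled samples come from an entropic optimal-transport plan between samples and prototypes, solved with Sinkhorn iterations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 2 classes, 6 labeled and 8 unlabeled target samples, 4-dim features.
num_classes, dim = 2, 4
labeled_feats = rng.normal(size=(6, dim))
labeled_ys = np.array([0, 0, 0, 1, 1, 1])
unlabeled_feats = rng.normal(size=(8, dim))

# Step 1: class prototypes = per-class mean of labeled target features.
prototypes = np.stack(
    [labeled_feats[labeled_ys == c].mean(axis=0) for c in range(num_classes)]
)

# Step 2: cost matrix = squared Euclidean distance from each unlabeled
# sample to each prototype.
diff = unlabeled_feats[:, None, :] - prototypes[None, :, :]
cost = (diff ** 2).sum(axis=-1)

# Step 3: entropic optimal transport via Sinkhorn iterations, with uniform
# marginals over samples (rows) and classes (columns).
eps = 0.5
K = np.exp(-cost / eps)
a = np.full(len(unlabeled_feats), 1.0 / len(unlabeled_feats))
b = np.full(num_classes, 1.0 / num_classes)
u = np.ones_like(a)
for _ in range(200):
    v = b / (K.T @ u)
    u = a / (K @ v)
plan = u[:, None] * K * v[None, :]

# Step 4: soft pseudo-labels = row-normalized transport plan; each row is a
# distribution over classes for one unlabeled sample.
pseudo = plan / plan.sum(axis=1, keepdims=True)
```

The class marginal constraint is what distinguishes this from plain nearest-prototype assignment: it discourages the degenerate solution where every unlabeled sample collapses onto one prototype.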