
Probabilistic Contrastive Learning for Domain Adaptation

(arXiv:2111.06021)
Published Nov 11, 2021 in cs.CV

Abstract

Contrastive learning can greatly enhance feature discriminability in a self-supervised manner and has achieved remarkable success on various visual tasks. However, the standard contrastive paradigm (features + $\ell_2$ normalization) brings surprisingly little benefit for domain adaptation. In this work, we delve into this phenomenon and find that the main reason is that the class weights (the weights of the final fully connected layer), which are vital for recognition, are ignored in the optimization. To tackle this issue, we propose a simple yet powerful Probabilistic Contrastive Learning (PCL), which not only helps extract discriminative features but also enforces them to cluster around the class weights. Specifically, we break the standard contrastive paradigm by removing the $\ell_2$ normalization and replacing the features with probabilities. In this way, PCL pushes the probabilities toward the one-hot form, thereby reducing the deviation between the features and the class weights. Benefiting from its conciseness, PCL generalizes well to different settings. We conduct extensive experiments on five tasks, namely Unsupervised Domain Adaptation (UDA), Semi-Supervised Domain Adaptation (SSDA), Semi-Supervised Learning (SSL), UDA Detection, and UDA Semantic Segmentation, and observe consistent performance gains. Notably, for UDA Semantic Segmentation on SYNTHIA, PCL surpasses the sophisticated CPSL-D by $>\!2\%$ in mean IoU at a much smaller training cost (PCL: 1×3090 GPU, 5 days vs. CPSL-D: 4×V100 GPUs, 11 days). Code is available at https://github.com/ljjcoder/Probabilistic-Contrastive-Learning.
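To make the mechanism concrete, below is a minimal PyTorch sketch of the loss as the abstract describes it. It assumes an InfoNCE-style formulation over two augmented views of the same batch; the function name `pcl_loss`, its arguments, and the temperature value are illustrative and not taken from the released code. The only change from standard contrastive learning is that $\ell_2$-normalized features are replaced by softmax probabilities.

```python
import torch
import torch.nn.functional as F

def pcl_loss(logits_q: torch.Tensor, logits_k: torch.Tensor,
             temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style probabilistic contrastive loss (illustrative sketch).

    logits_q, logits_k: classifier outputs of shape (batch, num_classes)
    for two augmented views of the same batch. A standard contrastive loss
    would compare l2-normalized features here; PCL instead compares softmax
    probabilities without any l2 normalization.
    """
    p_q = F.softmax(logits_q, dim=1)  # probabilities for view 1 (no l2 norm)
    p_k = F.softmax(logits_k, dim=1)  # probabilities for view 2
    # Pairwise similarities in probability space; positives lie on the diagonal.
    sim = p_q @ p_k.t() / temperature
    targets = torch.arange(sim.size(0), device=sim.device)
    return F.cross_entropy(sim, targets)

# Toy usage: a batch of 8 samples with 10 classes.
logits_q = torch.randn(8, 10)
logits_k = torch.randn(8, 10)
loss = pcl_loss(logits_q, logits_k)
```

Since each row of `p_q` and `p_k` sums to 1, the inner product of a positive pair is at most 1, with equality only when both are the same one-hot vector. Maximizing the positive-pair similarity therefore drives the probabilities toward one-hot form, which matches the mechanism the abstract describes for pulling features toward the class weights.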
