As with all synthetic data, there is a domain shift between our domain and the one captured by real cameras. Although no single domain adaptation approach fits every use case, we stand on the shoulders of giants to get great results.
Adaptive Batch Normalization is a simple technique: it can be applied to any network with batch normalization layers and combined with all the other techniques here for surprisingly good results.
Revisiting Batch Normalization For Practical Domain Adaptation.
Domain-Specific Batch Normalization for Unsupervised Domain Adaptation.
Official implementation of “Domain-Specific Batch Normalization for Unsupervised Domain Adaptation” (CVPR 2019).
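To make the AdaBN idea concrete, here is a minimal sketch (function names and shapes are our own illustration, not the papers' code): at inference time, the batch normalization layers keep their learned affine parameters but swap their source-domain running statistics for statistics recomputed over target-domain activations.

```python
import numpy as np

def adabn_statistics(target_activations):
    """Recompute per-channel BN statistics from target-domain activations.

    target_activations: array of shape (N, C) -- N target samples, C channels.
    The returned mean and variance replace the source-domain running
    statistics inside each batch normalization layer.
    """
    mu = target_activations.mean(axis=0)
    var = target_activations.var(axis=0)
    return mu, var

def batchnorm_inference(x, mu, var, gamma, beta, eps=1e-5):
    """Standard BN inference, now using the target-domain statistics."""
    return gamma * (x - mu) / np.sqrt(var + eps) + beta
```

With the target statistics swapped in, target-domain activations are re-centered and re-scaled just as source activations were during training, which is the entire trick: no retraining, no architectural change.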
Adversarial domain adaptation and its task-specific modifications usually yield strong improvements. The downside is that they typically require heavy modifications to the training pipeline.
Domain-Adversarial Training of Neural Networks.
Alleviating Semantic-level Shift: A Semi-supervised Domain Adaptation Method for Semantic Segmentation.
A collection of implementations of adversarial domain adaptation algorithms.
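The core building block of DANN-style adversarial adaptation is the gradient reversal layer: an identity in the forward pass that negates (and optionally scales) the gradient in the backward pass, so the feature extractor learns to fool the domain classifier. A minimal NumPy sketch of just that layer, with the surrounding networks omitted and names of our own choosing:

```python
import numpy as np

class GradientReversal:
    """Identity on the forward pass; flips and scales the gradient on backward.

    Inserted between the feature extractor and the domain classifier, it lets
    the classifier descend the domain-classification loss while the extractor
    ascends it, pushing the extractor toward domain-invariant features.
    """
    def __init__(self, lambd=1.0):
        self.lambd = lambd  # trade-off between task loss and adversarial signal

    def forward(self, features):
        return features  # no-op in the forward direction

    def backward(self, grad_from_domain_classifier):
        # Negate and scale the incoming gradient before it reaches
        # the feature extractor.
        return -self.lambd * grad_from_domain_classifier
```

This is also why these methods demand pipeline changes: the domain classifier, its loss, and the reversal layer all have to live inside the main training loop.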
Image-to-image translation methods coupled with a self-regularization loss allow dataset-level refinement. While these methods require an additional pipeline to train, that pipeline is completely independent and requires no modifications to the main training pipeline.
Contrastive Learning for Unpaired Image-to-Image Translation.
Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks.
Learning from Simulated and Unsupervised Images through Adversarial Training.
Contrastive unpaired image-to-image translation, faster and lighter training than CycleGAN (ECCV 2020, in PyTorch).
CyCADA: Cycle-Consistent Adversarial Domain Adaptation.
Code to accompany ICML 2018 paper.
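The self-regularization term used for refining simulated images can be sketched as a per-pixel L1 penalty tying the refined image to its synthetic source, so the simulator's annotations survive refinement. A minimal illustration, with the variable names and default weight being our own assumptions:

```python
import numpy as np

def self_regularization_loss(refined, synthetic, weight=1.0):
    """L1 penalty keeping a refined image close to its synthetic input.

    Added to the adversarial (realism) loss of the refiner, it preserves
    annotations (e.g. segmentation masks, keypoints) through refinement.
    refined, synthetic: arrays of identical shape, e.g. (H, W, C).
    """
    return weight * np.abs(refined - synthetic).mean()
```

Because this loss is computed purely between the refiner's input and output, the refiner trains in its own pipeline and the downstream model simply consumes the refined dataset.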
Ready for a demo or want to learn more? Get in touch.