Journal of Computer Science and Technology ›› 2021, Vol. 36 ›› Issue (3): 606-616.doi: 10.1007/s11390-021-1106-5

Special Issue: Artificial Intelligence and Pattern Recognition

• Special Section on Learning from Small Samples •

Source-Free Unsupervised Domain Adaptation with Sample Transport Learning

Qing Tian1,2,*, Member, CCF, Chuang Ma1, Feng-Yuan Zhang1, Shun Peng1, and Hui Xue3, Member, CCF        

  1 School of Computer and Software, Nanjing University of Information Science and Technology, Nanjing 210044, China;
    2 Engineering Research Center of Digital Forensics, Ministry of Education, Nanjing University of Information Science and Technology, Nanjing 210044, China;
    3 School of Computer Science and Engineering, Southeast University, Nanjing 211189, China
  • Received: 2020-10-22; Revised: 2021-04-22; Online: 2021-05-05; Published: 2021-05-31
  • Contact: Qing Tian E-mail:tianqing@nuist.edu.cn
  • About author:Qing Tian received his Ph.D. degree in computer science from Nanjing University of Aeronautics and Astronautics, Nanjing, in 2016. He is currently an associate professor in the School of Computer and Software, Nanjing University of Information Science and Technology, Nanjing. He was an academic visitor at the University of Manchester, UK, from 2018 to 2019. He is the recipient of the National Ph.D. Scholarship Award of China in 2015, the Best Scientific Paper Award of ICPR in 2016, the Excellent Doctoral Dissertation Award of Jiangsu Province of China in 2017, etc. He has served as a program committee member for several renowned international conferences, such as IJCAI, PRICAI, and IDEAL, and a reviewer for many prestigious international journals and conferences, such as IEEE TPAMI, IEEE TNNLS, IEEE TCYB, IEEE TIFS, ACM TIST, IJCAI, ICDM, and CVPR. His research interests include machine learning, pattern recognition and computer vision. He is a member of CCF.
  • Supported by:
    This work was partially supported by the National Natural Science Foundation of China under Grant Nos. 61702273 and 62076062, the Natural Science Foundation of Jiangsu Province of China under Grant No. BK20170956, the Open Projects Program of National Laboratory of Pattern Recognition under Grant No. 20200007, and was also sponsored by the Qing Lan Project.

Unsupervised domain adaptation (UDA) has achieved great success in cross-domain machine learning applications. It typically benefits model training in an unlabeled target domain by leveraging knowledge from a labeled source domain. To this end, existing work widely minimizes the marginal and conditional distribution divergences between the source and the target domain. Nevertheless, for the sake of privacy preservation, the source domain often supplies not its training data but only a trained predictor (e.g., a classifier). This renders the above approaches infeasible, because the marginal and conditional distributions of the source domain can no longer be computed. To address this problem, this article proposes a source-free UDA method that jointly models domain adaptation and sample transport learning, namely Sample Transport Domain Adaptation (STDA). Specifically, STDA constructs a pseudo source domain according to the aggregated decision boundaries that multiple source classifiers produce on the target domain. It then refines the pseudo source domain by transporting into it those target samples predicted with high confidence, and consequently generates labels for the target domain. We train the STDA model by alternating between domain adaptation and sample transport, eventually adapting the source knowledge to the target domain and obtaining confident labels for it. Finally, evaluation results validate the effectiveness and superiority of the proposed method.
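The core loop described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the classifiers are stand-ins for the provided black-box source predictors, the confidence threshold `tau` and the helper names (`ensemble_probs`, `stda_sketch`) are illustrative assumptions, and a full implementation would additionally re-adapt a target model on the growing pseudo source domain between rounds.

```python
import numpy as np

def ensemble_probs(classifiers, X):
    # Aggregate the (black-box) source classifiers by averaging
    # their class-probability outputs on the target samples.
    return np.mean([clf(X) for clf in classifiers], axis=0)

def stda_sketch(classifiers, X_target, n_rounds=3, tau=0.8):
    """Alternate pseudo-source construction and sample transport:
    each round, target samples whose aggregated confidence reaches
    tau are 'transported' into the pseudo source domain together
    with their predicted (pseudo) labels."""
    X_pseudo = np.empty((0, X_target.shape[1]))
    y_pseudo = np.empty(0, dtype=int)
    remaining = np.arange(len(X_target))
    for _ in range(n_rounds):
        if remaining.size == 0:
            break
        probs = ensemble_probs(classifiers, X_target[remaining])
        conf, labels = probs.max(axis=1), probs.argmax(axis=1)
        move = conf >= tau  # transport only high-confidence samples
        X_pseudo = np.vstack([X_pseudo, X_target[remaining[move]]])
        y_pseudo = np.concatenate([y_pseudo, labels[move]])
        remaining = remaining[~move]
        # (A full method would perform domain adaptation on the
        #  pseudo source domain here before the next round.)
    return X_pseudo, y_pseudo, remaining

# Toy demo: two binary "source classifiers" given only as
# probability functions, i.e., without their training data.
def clf_a(X):
    s = 1.0 / (1.0 + np.exp(-X[:, 0]))
    return np.stack([1 - s, s], axis=1)

def clf_b(X):
    s = 1.0 / (1.0 + np.exp(-2.0 * X[:, 0]))
    return np.stack([1 - s, s], axis=1)

X_t = np.array([[-3.0, 0.0], [3.0, 0.0], [0.1, 0.0]])
X_p, y_p, rest = stda_sketch([clf_a, clf_b], X_t, tau=0.8)
# The two confidently classified samples join the pseudo source
# domain; the ambiguous one near the decision boundary remains.
```

In the toy run, the samples at x = -3 and x = 3 exceed the confidence threshold and are transported with pseudo labels 0 and 1, while the sample at x = 0.1 stays in the unlabeled remainder.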

Key words: unsupervised domain adaptation; domain shift; sample transport; pseudo source domain

[1] Pan S J, Yang Q. A survey on transfer learning. IEEE Transactions on Knowledge and Data Engineering, 2009, 22(10):1345-1359. DOI:10.1109/TKDE.2009.191.
[2] Yan H, Ding Y, Li P, Wang Q, Xu Y, Zuo W. Mind the class weight bias:Weighted maximum mean discrepancy for unsupervised domain adaptation. In Proc. the 2017 IEEE Conference on Computer Vision and Pattern Recognition, July 2017, pp.2272-2281. DOI:10.1109/CVPR.2017.107.
[3] Tahmoresnezhad J, Hashemi S. Visual domain adaptation via transfer feature learning. Knowledge and Information Systems, 2017, 50(2):585-605. DOI:10.1007/s10115-016-0944-x.
[4] Ganin Y, Ustinova E, Ajakan H, Germain P, Larochelle H, Laviolette F, Marchand M, Lempitsky V. Domain-adversarial training of neural networks. The Journal of Machine Learning Research, 2016, 17(1):2096-2030. DOI:10.1007/978-3-319-58347-1_10.
[5] Ganin Y, Lempitsky V. Unsupervised domain adaptation by backpropagation. In Proc. the 32nd International Conference on Machine Learning, July 2015, pp.1180-1189.
[6] Saito K, Watanabe K, Ushiku Y, Harada T. Maximum classifier discrepancy for unsupervised domain adaptation. In Proc. the 2018 IEEE Conference on Computer Vision and Pattern Recognition, June 2018, pp.3723-3732. DOI:10.1109/CVPR.2018.00392.
[7] Baktashmotlagh M, Harandi M T, Lovell B C, Salzmann M. Unsupervised domain adaptation by domain invariant projection. In Proc. the 2013 IEEE International Conference on Computer Vision, December 2013, pp.769-776. DOI:10.1109/ICCV.2013.100.
[8] Pan Y, Yao T, Li Y, Wang Y, Ngo C W, Mei T. Transferrable prototypical networks for unsupervised domain adaptation. In Proc. the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition, June 2019, pp.2239-2247. DOI:10.1109/CVPR.2019.00234.
[9] Lee C Y, Batra T, Baig M H, Ulbricht D. Sliced Wasserstein discrepancy for unsupervised domain adaptation. In Proc. the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition, June 2019, pp.10285-10295. DOI:10.1109/CVPR.2019.01053.
[10] Lee S, Kim D, Kim N, Jeong S G. Drop to adapt:Learning discriminative features for unsupervised domain adaptation. In Proc. the 2019 IEEE/CVF International Conference on Computer Vision, Oct. 27-Nov. 2, 2019, pp.91-100. DOI:10.1109/ICCV.2019.00018.
[11] Pan S J, Tsang I W, Kwok J T, Yang Q. Domain adaptation via transfer component analysis. IEEE Transactions on Neural Networks, 2010, 22(2):199-210. DOI:10.1109/TNN.2010.2091281.
[12] Wang J, Chen Y, Hao S, Feng W, Shen Z. Balanced distribution adaptation for transfer learning. In Proc. the 2017 IEEE International Conference on Data Mining, November 2017, pp.1129-1134. DOI:10.1109/ICDM.2017.150.
[13] Kononenko I. Machine learning for medical diagnosis:History, state of the art and perspective. Artificial Intelligence in Medicine, 2001, 23(1):89-109. DOI:10.1016/S0933-3657(01)00077-X.
[14] Chen F, Bruhadeshwar B, Liu A X. Cross-domain privacy-preserving cooperative firewall optimization. IEEE/ACM Transactions on Networking, 2012, 21(3):857-868. DOI:10.1109/TNET.2012.2217985.
[15] Lee T, Pappas C, Barrera D, Szalachowski P, Perrig A. Source accountability with domain-brokered privacy. In Proc. the 12th International Conference on Emerging Networking Experiments and Technologies, December 2016, pp.345-358. DOI:10.1145/2999572.2999581.
[16] Zhang L, Zhang D. Domain adaptation extreme learning machines for drift compensation in E-nose systems. IEEE Transactions on Instrumentation and Measurement, 2014, 64(7):1790-1801. DOI:10.1109/TIM.2014.2367775.
[17] Kim Y, Hong S, Cho D, Park H, Panda P. Domain adaptation without source data. arXiv:2007.01524, 2020. https://arxiv.org/abs/2007.01524v2, January 2021.
[18] Long M, Cao Z, Wang J, Jordan M I. Conditional adversarial domain adaptation. In Proc. the 32nd International Conference on Neural Information Processing Systems, December 2018, pp.1640-1650.
[19] Hu J, Mo Q, Liu Z et al. Multi-source classification:A DOA-based deep learning approach. In Proc. the 2020 International Conference on Computer Engineering and Application, March 2020, pp.463-467. DOI:10.1109/ICCEA50009.2020.00106.
[20] Zhao C, Wang S, Li D. Multi-source domain adaptation with joint learning for cross-domain sentiment classification. Knowledge-Based Systems, 2020, 191:Article No. 105254. DOI:10.1016/j.knosys.2019.105254.
[21] Wang J, Feng W, Chen Y, Yu H, Huang M, Yu P S. Visual domain adaptation with manifold embedded distribution alignment. In Proc. the 26th ACM International Conference on Multimedia, October 2018, pp.402-410. DOI:10.1145/3240508.3240512.
[22] Wang J, Chen Y, Feng W, Yu H, Huang M, Yang Q. Transfer learning with dynamic distribution adaptation. ACM Transactions on Intelligent Systems and Technology, 2020, 11(1):Article No. 6. DOI:10.1145/3360309.
[23] Behrend R E, Pearce P A, Petkova V B, Zuber J B. On the classification of bulk and boundary conformal field theories. Physics Letters B, 1998, 444(1/2):163-166. DOI:10.1016/S0370-2693(98)01374-4.
[24] Wang Q, Breckon T. Unsupervised domain adaptation via structured prediction based selective pseudo-labeling. In Proc. the 34th AAAI Conference on Artificial Intelligence, February 2020, pp.6243-6250. DOI:10.1609/aaai.v34i04.6091.
[25] Kundu J N, Venkat N, Babu R V et al. Universal source-free domain adaptation. In Proc. the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition, June 2020, pp.4544-4553. DOI:10.1109/CVPR42600.2020.00460.
[26] Nelakurthi A R, Maciejewski R, He J. Source free domain adaptation using an off-the-shelf classifier. In Proc. the 2018 IEEE International Conference on Big Data, December 2018, pp.140-145. DOI:10.1109/BigData.2018.8622112.
[27] Duan L, Tsang I W, Xu D, Chua T S. Domain adaptation from multiple sources via auxiliary classifiers. In Proc. the 26th International Conference on Machine Learning, June 2009, pp.289-296. DOI:10.1145/1553374.1553411.
[28] Zhang Y, Tang H, Jia K, Tan M. Domain-symmetric networks for adversarial domain adaptation. In Proc. the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition, June 2019, pp.5031-5040. DOI:10.1109/CVPR.2019.00517.
[29] Dziugaite G K, Roy D M, Ghahramani Z. Training generative neural networks via maximum mean discrepancy optimization. arXiv:1505.03906, 2015. https://arxiv.org/abs/1505.03906, January 2021.
[30] Iyer A, Nath S, Sarawagi S. Maximum mean discrepancy for class ratio estimation:Convergence bounds and kernel selection. In Proc. the 31st International Conference on Machine Learning, June 2014, pp.530-538.
[31] Li J, Zhao J, Lu K. Joint feature selection and structure preservation for domain adaptation. In Proc. the 25th International Joint Conference on Artificial Intelligence, July 2016, pp.1697-1703.
[32] Chen Y, Song S, Li S, Wu C. A graph embedding framework for maximum mean discrepancy-based domain adaptation algorithms. IEEE Transactions on Image Processing, 2019, 29:199-213. DOI:10.1109/TIP.2019.2928630.
[33] Donmez P, Carbonell J G, Schneider J. Efficiently learning the accuracy of labeling sources for selective sampling. In Proc. the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, June 2009, pp.259-268. DOI:10.1145/1557019.1557053.
[34] Liu A, Ziebart B. Robust classification under sample selection bias. In Proc. the 27th International Conference on Neural Information Processing Systems, December 2014, pp.37-45.
[35] Huang S J, Li G X, Huang W Y, Li S Y. Incremental multi-label learning with active queries. Journal of Computer Science and Technology, 2020, 35(2):234-246. DOI:10.1007/s11390-020-9994-3.
[36] Gao N, Huang S J, Yan Y, Chen S. Cross modal similarity learning with active queries. Pattern Recognition, 2018, 75:214-222. DOI:10.1016/j.patcog.2017.05.011.
[37] Gong B, Shi Y, Sha F, Grauman K. Geodesic flow kernel for unsupervised domain adaptation. In Proc. the 2012 IEEE Conference on Computer Vision and Pattern Recognition, June 2012, pp.2066-2073. DOI:10.1109/CVPR.2012.6247911.
[38] Long M, Wang J, Ding G, Sun J, Yu P S. Transfer feature learning with joint distribution adaptation. In Proc. the 2013 IEEE International Conference on Computer Vision, December 2013, pp.2200-2207. DOI:10.1109/ICCV.2013.274.
[39] Wang J, Chen Y, Yu H, Huang M, Yang Q. Easy transfer learning by exploiting intra-domain structures. In Proc. the 2019 IEEE International Conference on Multimedia and Expo, July 2019, pp.1210-1215. DOI:10.1109/ICME.2019.00211.

ISSN 1000-9000 (Print)
ISSN 1860-4749 (Online)
CN 11-2296/TP

Journal of Computer Science and Technology
Institute of Computing Technology, Chinese Academy of Sciences
P.O. Box 2704, Beijing 100190 P.R. China
Tel.:86-10-62610746
E-mail: jcst@ict.ac.cn