Citation: Bai GR, Liu QB, He SZ et al. Unsupervised domain adaptation on sentence matching through self-supervision. JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY 38(6): 1237−1249 Nov. 2023. DOI: 10.1007/s11390-022-1479-0.

Unsupervised Domain Adaptation on Sentence Matching Through Self-Supervision

Although neural approaches have yielded state-of-the-art results in the sentence matching task, their performance inevitably drops dramatically when applied to unseen domains. To tackle this cross-domain challenge, we address unsupervised domain adaptation on sentence matching, in which the goal is to achieve good performance on a target domain given only unlabeled target-domain data together with labeled source-domain data. Specifically, we propose to perform self-supervised tasks to achieve this goal. Unlike previous unsupervised domain adaptation methods, self-supervision can not only be specially designed to suit the characteristics of sentence matching, but is also much easier to optimize. During training, each self-supervised task is performed on both domains simultaneously in an easy-to-hard curriculum, which gradually brings the two domains closer together along the direction relevant to the task. As a result, the classifier trained on the source domain is able to generalize to the unlabeled target domain. In total, we present three types of self-supervised tasks, and the results demonstrate their superiority. In addition, we further study the performance of different usages of the self-supervised tasks, which may offer guidance on how to effectively utilize self-supervision in cross-domain scenarios.
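The abstract describes the training scheme only at a high level. Below is a minimal sketch, not the authors' released implementation, of how such a scheme can be set up: a shared encoder feeds a matching head trained on labeled source-domain pairs and a self-supervised head trained on both domains, while an easy-to-hard curriculum controls which unlabeled examples participate at each step. The toy encoder, the generic self-supervised head, the pretext labels, and all hyperparameters (`keep_ratio`, `ssl_weight`, layer sizes) are illustrative assumptions; the paper's three specific self-supervised tasks are not reproduced here.

```python
# Hedged sketch of source-supervised matching + two-domain self-supervision
# with an easy-to-hard curriculum. All names and hyperparameters are assumptions.
import torch
import torch.nn as nn


class MatchingModel(nn.Module):
    """Shared encoder with two heads: matching (supervised) and self-supervised."""

    def __init__(self, vocab_size=30000, hidden=256, num_labels=2, num_ssl_labels=2):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, hidden)   # toy sentence-pair encoder
        self.match_head = nn.Linear(hidden, num_labels)    # trained on source labels only
        self.ssl_head = nn.Linear(hidden, num_ssl_labels)  # trained on both domains

    def forward(self, token_ids):
        h = self.embed(token_ids)
        return self.match_head(h), self.ssl_head(h)


def curriculum_subset(model, batch, ssl_labels, keep_ratio):
    """Easy-to-hard curriculum: keep the examples whose current self-supervised
    loss is lowest ("easy" ones); keep_ratio grows toward 1.0 during training."""
    with torch.no_grad():
        _, ssl_logits = model(batch)
        losses = nn.functional.cross_entropy(ssl_logits, ssl_labels, reduction="none")
    k = max(1, int(keep_ratio * len(losses)))
    idx = torch.argsort(losses)[:k]
    return batch[idx], ssl_labels[idx]


def train_step(model, opt, src_x, src_y, src_ssl_x, src_ssl_y,
               tgt_ssl_x, tgt_ssl_y, keep_ratio, ssl_weight=0.5):
    ce = nn.CrossEntropyLoss()
    opt.zero_grad()

    # 1) Supervised matching loss on labeled source-domain pairs.
    match_logits, _ = model(src_x)
    loss = ce(match_logits, src_y)

    # 2) Self-supervised loss on both domains, restricted to curriculum-selected
    #    (currently "easy") examples, which gradually aligns the two domains.
    for x, y in ((src_ssl_x, src_ssl_y), (tgt_ssl_x, tgt_ssl_y)):
        x_sel, y_sel = curriculum_subset(model, x, y, keep_ratio)
        _, ssl_logits = model(x_sel)
        loss = loss + ssl_weight * ce(ssl_logits, y_sel)

    loss.backward()
    opt.step()
    return loss.item()


if __name__ == "__main__":
    torch.manual_seed(0)
    model = MatchingModel()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Random toy batches: token ids and binary pretext labels, for illustration only.
    src_x = torch.randint(0, 30000, (8, 20))
    src_y = torch.randint(0, 2, (8,))
    src_u, src_u_y = torch.randint(0, 30000, (8, 20)), torch.randint(0, 2, (8,))
    tgt_u, tgt_u_y = torch.randint(0, 30000, (8, 20)), torch.randint(0, 2, (8,))
    for step in range(3):
        keep = 0.3 + 0.2 * step  # curriculum admits more (harder) examples over time
        print(train_step(model, opt, src_x, src_y, src_u, src_u_y, tgt_u, tgt_u_y, keep))
```

In this sketch the curriculum is driven by the current self-supervised loss itself (lower loss means an easier example), with `keep_ratio` annealed toward 1.0 as training proceeds; the paper's actual curriculum criterion and pretext tasks may differ.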
