Cheng-Qing Zong, Hans Uszkoreit. Preface. Journal of Computer Science and Technology, 2011, 26(1): 1-2. DOI: 10.1007/s11390-011-1105-z
    Natural Language Processing (NLP) is a field of computer science and linguistics concerned with the interactions between computers and human (natural) languages. The significant successes of the past decades suggest that NLP is, and will continue to be, a major area of computer science and information technology.
      The goal of this special section is to present high-quality contributions that explicate the reasoning involved in different areas of NLP at both theoretical and practical levels. The special section received an enthusiastic response: we received 55 submissions in total. After careful review, we accepted 8 papers of high technical quality that cover a wide range of topics reflecting new trends in NLP.
      The paper "A New Multiword Expression Metric and Its Applications" by Fan Bu et al. proposes a knowledge- free, unsupervised, and language-independent Multiword Expression Distance (MED) to measure the distance from an n-gram to its semantics and applies it to two NLP applications.
      The paper "Chinese New Word Identification: A Latent Discriminative Model with Global Features" by Xiao Sun et al. presents a piece of work that makes use of the Latent Dynamic CRF and semi-CRF model for Chinese new word detection and POS tagging as a combined task.
      The paper "Multi-Domain Sentiment Classification with Classifier Combination" by Shou-Shan Li et al. proposes a multiple classifier combination approach for the issue of multi-domain sentiment classification. They first train single domain classifiers separately with domain specific data and then combine the classifiers for the final decision.
      The paper "Learning Noun Phrase Anaphoricity in Coreference Resolution via Label Propagation" by Guo-Dong Zhou and Fang Kong introduces a method that incorporates a label-propagation algorithm into the task of noun phrase anaphoricity determination.
      The paper "Kernel-Based Semantic Relation Detection and Classification via Enriched Parse Tree Structure" by Guo-Dong Zhou and Qiao-Ming Zhu proposes a new kernel-based method for semantic relation detection and classification by making the convolution tree kernel sensitive to context and adding latent semantic information to the parse tree.
      The paper "Improvement of Machine Translation Evaluation by Simpler Linguistically Motivated Features" by Mu-Yun Yang et al. presents a machine translation evaluation metric using features involving POS tags and parser analyses in the framework of regression SVM.
      The paper "Using Syntactic-Based Kernels for Classifying Temporal Relations" by Seyed Abolghasem Mir- roshandel et al. proposes a number of novel kernels which extended the tree kernel to handle the information of events and times in a sentence for the task of temporal relation classification.
      The paper "Transfer Learning via Multi-View Principal Component Analysis" by Yang-Sheng Ji et al. aims at addressing the defeat of existing transfer learning approach, and by treating the common features in source and target as two separated views, presents a novel multi-view PCA algorithm to learn the latent representations of these two views.
      We believe this special section will help encourage the NLP community to address the challenges in NLP and to think about the problems from a broader point of view. We hope you find it well worth reading.
    We thank all the authors who submitted papers for their contributions and our dedicated reviewers for their professional reviewing services. We are grateful to Mr. Rui Xia for his great help with the review process. The guest editors sincerely hope that readers will enjoy reading this special section and greatly benefit from these works.