
Joint Self-Attention and Multi-Embeddings for Chinese Named Entity Recognition

EasyChair Preprint no. 3340

5 pages
Date: May 6, 2020

Abstract

Named Entity Recognition (NER) is a fundamental task in Natural Language Processing (NLP), but it remains especially challenging in Chinese due to the particular characteristics and complexity of the language. Traditional Chinese Named Entity Recognition (Chinese NER) methods require cumbersome feature engineering and domain-specific knowledge to achieve high performance. In this paper, we propose a simple yet effective neural network framework for Chinese NER, named A-NER. A-NER is the first Bidirectional Gated Recurrent Unit - Conditional Random Field (BiGRU-CRF) model that combines a self-attention mechanism with multi-embeddings. It extracts richer linguistic information about characters at different granularities (e.g., radical, character, word) and captures correlations between characters in the sequence. Moreover, A-NER does not rely on any external resources or hand-crafted features. The experimental results show that our model outperforms (or approaches) existing state-of-the-art methods on datasets from different domains.
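Since only the abstract is reproduced here, the following is a minimal, hypothetical PyTorch sketch of the pipeline it describes (multi-embeddings → BiGRU → self-attention → CRF emissions). All class and parameter names (ANER, emb_dim, hidden_dim, num_tags) are illustrative assumptions, a standard multi-head attention layer stands in for the paper's exact self-attention formulation, and the CRF decoder is only indicated in a comment.

import torch
import torch.nn as nn

class ANER(nn.Module):
    """Hypothetical sketch of the architecture described in the abstract:
    radical/character/word embeddings -> BiGRU -> self-attention ->
    per-character emission scores for a CRF decoder. Sizes are illustrative,
    not taken from the paper."""

    def __init__(self, radical_vocab, char_vocab, word_vocab,
                 emb_dim=100, hidden_dim=128, num_tags=9):
        super().__init__()
        # One embedding table per granularity; their concatenation is the
        # "multi-embeddings" input representation.
        self.radical_emb = nn.Embedding(radical_vocab, emb_dim)
        self.char_emb = nn.Embedding(char_vocab, emb_dim)
        self.word_emb = nn.Embedding(word_vocab, emb_dim)
        self.bigru = nn.GRU(3 * emb_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Self-attention over BiGRU states to capture long-range
        # correlations between characters in the sequence.
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads=4,
                                          batch_first=True)
        self.emission = nn.Linear(2 * hidden_dim, num_tags)
        # A CRF layer (e.g., the third-party pytorch-crf package) would
        # decode these emission scores into a valid tag sequence.

    def forward(self, radicals, chars, words):
        # (batch, seq_len) index tensors, one per granularity.
        x = torch.cat([self.radical_emb(radicals),
                       self.char_emb(chars),
                       self.word_emb(words)], dim=-1)
        h, _ = self.bigru(x)
        h, _ = self.attn(h, h, h)
        return self.emission(h)  # (batch, seq_len, num_tags)

model = ANER(radical_vocab=300, char_vocab=5000, word_vocab=20000)
scores = model(torch.randint(300, (2, 10)),
               torch.randint(5000, (2, 10)),
               torch.randint(20000, (2, 10)))
print(scores.shape)  # torch.Size([2, 10, 9])

Concatenating the radical, character, and word embeddings is what lets the model draw on several granularities at once; a CRF on top of the emission scores then enforces valid tag transitions at decoding time.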

Keyphrases: BiGRU-CRF, Chinese NER, Multi-Embeddings, self-attention

BibTeX entry
BibTeX does not have a suitable entry type for preprints. The following is a workaround that produces a correct reference:
@Booklet{EasyChair:3340,
  author = {Cijian Song and Yan Xiong and Wenchao Huang and Lu Ma},
  title = {Joint Self-Attention and Multi-Embeddings for Chinese Named Entity Recognition},
  howpublished = {EasyChair Preprint no. 3340},
  year = {EasyChair, 2020}}