IDEAS home Printed from https://ideas.repec.org/a/igg/jswis0/v18y2022i1p1-20.html

Chinese Named Entity Recognition Method Combining ALBERT and a Local Adversarial Training and Adding Attention Mechanism

Author

Listed:
  • Zhang Runmei

    (Anhui Jianzhu University, China)

  • Li Lulu

    (Anhui Jianzhu University, China)

  • Yin Lei

    (Anhui Jianzhu University, China)

  • Liu Jingjing

    (Anhui Jianzhu University, China)

  • Xu Weiyi

    (Anhui Jianzhu University, China)

  • Cao Weiwei

    (Key Laboratory of Flight Techniques and Flight Safety, China)

  • Chen Zhong

    (Anhui Jianzhu University, China)

Abstract

Annotated data for Chinese NER tasks is scarce. To enlarge the training data, raise the accuracy of Chinese NER, and improve model stability, the authors propose adding local adversarial training to a transfer-learning model and integrating an attention mechanism. The model uses ALBERT for transfer pre-training and adds perturbation factors to the output matrix of the embedding layer, which constitutes the local adversarial training. A BiLSTM encodes the shared and private features of the task, and the attention mechanism is introduced to capture the characters most relevant to the entities. Finally, the best entity annotation is obtained by a CRF. Experiments are conducted on the People's Daily 2004 corpus and Tsinghua University's open-source text classification dataset. The results show that, compared with the SOTA model, the F1 scores of the proposed method improve by 7.32 and 7.98 points on the two datasets, respectively, demonstrating that the method improves accuracy in the Chinese domain.
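The "local adversarial training" the abstract describes, a bounded perturbation added only to the embedding layer's output, is commonly implemented in the style of the Fast Gradient Method (FGM): the perturbation is the embedding gradient rescaled to a fixed L2 norm. The paper's exact formula is not given here, so the following pure-Python sketch is an assumption; `fgm_perturbation` and `perturb_embeddings` are hypothetical helper names, not the authors' code.

```python
import math

def fgm_perturbation(grad, epsilon=1.0):
    """FGM-style perturbation: r_adv = epsilon * g / ||g||_2.

    grad: gradient of the loss w.r.t. one token's embedding vector
    (a list of floats). The perturbation keeps the gradient's
    direction but is rescaled to L2 norm epsilon, so the attack on
    the embedding stays bounded ("local").
    """
    norm = math.sqrt(sum(g * g for g in grad))
    if norm == 0.0:
        return [0.0 for _ in grad]  # no gradient signal, no perturbation
    return [epsilon * g / norm for g in grad]

def perturb_embeddings(embeddings, grads, epsilon=1.0):
    """Add the FGM perturbation to each token's embedding vector.

    In a full pipeline this perturbed matrix would be fed forward
    through the BiLSTM-attention-CRF layers a second time, and the
    adversarial loss added to the clean loss before the update.
    """
    return [
        [e + r for e, r in zip(vec, fgm_perturbation(g, epsilon))]
        for vec, g in zip(embeddings, grads)
    ]
```

Because the perturbation is applied to the embedding output rather than the raw inputs, discrete characters never need to be modified, which is why this form of adversarial training suits NER over Chinese text.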

Suggested Citation

  • Zhang Runmei & Li Lulu & Yin Lei & Liu Jingjing & Xu Weiyi & Cao Weiwei & Chen Zhong, 2022. "Chinese Named Entity Recognition Method Combining ALBERT and a Local Adversarial Training and Adding Attention Mechanism," International Journal on Semantic Web and Information Systems (IJSWIS), IGI Global, vol. 18(1), pages 1-20, January.
  • Handle: RePEc:igg:jswis0:v:18:y:2022:i:1:p:1-20

    Download full text from publisher

    File URL: http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/IJSWIS.313946
    Download Restriction: no



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.