
Uniting Multi-Scale Local Feature Awareness and the Self-Attention Mechanism for Named Entity Recognition

Author

Listed:
  • Lin Shi

    (Hebei Key Laboratory of Industrial Intelligent Perception, College of Artificial Intelligence, North China University of Science and Technology, Tangshan 063210, China)

  • Xianming Zou

    (Hebei Key Laboratory of Industrial Intelligent Perception, College of Artificial Intelligence, North China University of Science and Technology, Tangshan 063210, China)

  • Chenxu Dai

    (Hebei Key Laboratory of Industrial Intelligent Perception, College of Artificial Intelligence, North China University of Science and Technology, Tangshan 063210, China)

  • Zhanlin Ji

    (Hebei Key Laboratory of Industrial Intelligent Perception, College of Artificial Intelligence, North China University of Science and Technology, Tangshan 063210, China
    Telecommunications Research Centre (TRC), University of Limerick, V94 T9PX Limerick, Ireland)

Abstract

In recent years, a huge amount of text information has required processing to support the diagnosis and treatment of diabetes in the medical field; named entity recognition for diabetes (DNER) has therefore become a popular research topic in this area. Although mainstream methods for Chinese medical named entity recognition can effectively capture global context information, they ignore the potential local information in sentences and hence cannot extract local context features through an efficient framework. To overcome these challenges, this paper constructs a diabetes corpus and proposes the RMBC (RoBERTa Multi-scale CNN BiGRU Self-attention CRF) model, a named entity recognition model that unites multi-scale local feature awareness with the self-attention mechanism. The paper first utilizes RoBERTa-wwm to encode the characters; it then designs a local context-wise module that captures context information containing locally important features by fusing multi-window attention with residual convolution at multiple scales, and adds a self-attention mechanism to address the limitation of the bidirectional gated recurrent unit (BiGRU) in capturing long-distance dependencies and to obtain global semantic information. Finally, a conditional random field (CRF) is used to learn the dependencies between adjacent tags and to obtain the optimal tag sequence. The experimental results on our constructed private dataset, termed DNER, along with two benchmark datasets, demonstrate the effectiveness of the proposed model.
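The architecture described in the abstract is a stacked pipeline: RoBERTa-wwm encoding, a multi-scale local feature module, a BiGRU, self-attention, and a CRF. The following minimal PyTorch sketch only illustrates that flow under stated assumptions; layer sizes, the tag count, the use of a Hugging Face-style pretrained encoder, and the simplification of the local module to multi-kernel residual convolutions (omitting the multi-window attention fusion) are illustrative choices, not the authors' released implementation, and CRF decoding is only indicated in a comment.

```python
# Hypothetical sketch of an RMBC-style tagger (not the paper's official code).
import torch
import torch.nn as nn


class MultiScaleLocalContext(nn.Module):
    """Parallel 1-D convolutions with different window sizes plus a residual path."""

    def __init__(self, dim, kernel_sizes=(2, 3, 4)):
        super().__init__()
        self.convs = nn.ModuleList(
            [nn.Conv1d(dim, dim, k, padding=k // 2) for k in kernel_sizes]
        )
        self.proj = nn.Linear(dim * len(kernel_sizes), dim)

    def forward(self, x):                          # x: (batch, seq_len, dim)
        h = x.transpose(1, 2)                      # -> (batch, dim, seq_len)
        feats = [torch.relu(c(h))[..., : x.size(1)] for c in self.convs]
        local = self.proj(torch.cat(feats, dim=1).transpose(1, 2))
        return local + x                           # residual connection


class RMBCSketch(nn.Module):
    """Encoder -> multi-scale CNN -> BiGRU -> self-attention -> emission scores."""

    def __init__(self, encoder, hidden=256, num_tags=31):
        super().__init__()
        self.encoder = encoder                     # assumed: a pretrained RoBERTa-wwm model
        dim = encoder.config.hidden_size
        self.local = MultiScaleLocalContext(dim)
        self.bigru = nn.GRU(dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden, num_heads=4, batch_first=True)
        self.emit = nn.Linear(2 * hidden, num_tags)
        # A CRF layer (e.g. torchcrf.CRF(num_tags)) would learn tag-to-tag
        # dependencies and decode the optimal sequence from these emissions.

    def forward(self, input_ids, attention_mask):
        x = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        x = self.local(x)                          # multi-scale local features
        x, _ = self.bigru(x)                       # bidirectional recurrent context
        x, _ = self.attn(x, x, x)                  # global self-attention
        return self.emit(x)                        # per-token emission scores for the CRF
```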

Suggested Citation

  • Lin Shi & Xianming Zou & Chenxu Dai & Zhanlin Ji, 2023. "Uniting Multi-Scale Local Feature Awareness and the Self-Attention Mechanism for Named Entity Recognition," Mathematics, MDPI, vol. 11(11), pages 1-17, May.
  • Handle: RePEc:gam:jmathe:v:11:y:2023:i:11:p:2412-:d:1153331

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/11/11/2412/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/11/11/2412/
    Download Restriction: no

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:11:y:2023:i:11:p:2412-:d:1153331. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.