IDEAS home Printed from https://ideas.repec.org/a/hin/jnlmpe/6798505.html

Deep Learning Classification Model for English Translation Styles Introducing Attention Mechanism

Author

Listed:
  • Tian Zhang
  • Gengxin Sun

Abstract

Both short-distance association knowledge and long-distance interaction knowledge in a knowledge base contain rich semantics. When learning entity and relation representations in a knowledge base, if short-distance association knowledge and long-distance interaction knowledge can be learned at the same time, the resulting representations carry rich semantic information while preserving the original structure of the knowledge base. Among the knowledge contained in a large number of records, some reflects individual characteristics and can be called local knowledge; the rest reflects group characteristics and can be called global knowledge. Learning local and global knowledge in different ways within a deep learning model better reflects the difference between the two kinds of knowledge at the model level and gives the model the ability to understand both individual and overall characteristics. Through layer-by-layer forward propagation and error back-propagation, the entire network is gradually optimized in an "end-to-end" manner. This "end-to-end" approach lacks flexible means of introducing prior knowledge into the model; although it reduces the burden on researchers, such a "data-driven" approach suffers from poor interpretability of learning results and weak generalization ability. Combining the specific prior knowledge implicit in the data with deep learning algorithms makes it possible to optimize an algorithm in a targeted manner and avoid blind search in the solution space, yielding a model with better performance and wider applicability. To this end, this paper investigates combining prior knowledge with deep learning to design efficient algorithms for the classification of English translation styles. This paper combines local knowledge and global knowledge with deep learning methods and proposes a memory neural network that combines local and global knowledge.
By recording local knowledge in a local memory module while simultaneously recording global knowledge in a global memory module, the method effectively learns the latent information in a large number of records. This paper also combines short-distance association knowledge and long-distance interaction knowledge with a deep-learning-based distributed representation learning method, proposing a deep learning method that jointly exploits both kinds of knowledge. On the IWSLT English translation task, experiments show that the method significantly improves translation quality, confirming that grammatical dependencies enhance attention by supplementing dependent grammatical information, yielding more effective and richer context vectors that more accurately represent contextual situations. Additional experiments analyze the sensitivity of the model to parameter selection. By mining valuable long-distance interaction knowledge in the knowledge base and using it in distributed representation learning, while jointly constraining short-distance association knowledge and long-distance interaction knowledge, the learned representations effectively support knowledge base completion and the discovery of new relations.
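The dual-memory design described in the abstract, a local memory module recording per-record (individual) knowledge alongside a global memory module recording corpus-level (group) knowledge, could be sketched roughly as follows. This is a minimal illustration, not the paper's architecture: the class name, the attention-based reads, and the averaging fusion are all assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

class DualMemoryReader:
    """Hypothetical sketch of a memory network that reads from a
    local memory (per-record entries, individual characteristics)
    and a global memory (corpus-level entries, group characteristics)."""

    def __init__(self, local_keys, local_vals, global_keys, global_vals):
        self.lk, self.lv = local_keys, local_vals
        self.gk, self.gv = global_keys, global_vals

    def read(self, query):
        # Attention read over local memory: individual characteristics
        local_read = softmax(self.lk @ query) @ self.lv
        # Attention read over global memory: group characteristics
        global_read = softmax(self.gk @ query) @ self.gv
        # Simple fusion of the two reads (an assumed choice;
        # a learned gate would be a common alternative)
        return 0.5 * (local_read + global_read)
```

In a trained model the memory contents and the fusion would be learned end to end; here they are fixed arrays only to show the two-read structure.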
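The claim that grammatical dependencies enhance attention by supplementing dependency information can be read as adding a dependency-derived bias to the attention scores before normalization. A minimal numpy sketch under that assumption (the function name and the form of the bias are hypothetical, not taken from the paper):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def dependency_biased_attention(query, keys, values, dep_bias):
    """Scaled dot-product attention whose scores are shifted by a
    bias derived from grammatical dependency arcs: dep_bias[i] > 0
    when source token i is syntactically linked to the current
    target position, steering attention toward dependent tokens."""
    scores = keys @ query / np.sqrt(query.shape[-1])
    weights = softmax(scores + dep_bias)  # dependencies sharpen attention
    return weights @ values               # richer context vector
```

With a large positive bias on one source position, the context vector collapses toward that token's value vector, which is the intended effect of supplementing attention with dependency links.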

Suggested Citation

  • Tian Zhang & Gengxin Sun, 2022. "Deep Learning Classification Model for English Translation Styles Introducing Attention Mechanism," Mathematical Problems in Engineering, Hindawi, vol. 2022, pages 1-10, March.
  • Handle: RePEc:hin:jnlmpe:6798505
    DOI: 10.1155/2022/6798505

    Download full text from publisher

    File URL: http://downloads.hindawi.com/journals/mpe/2022/6798505.pdf
    Download Restriction: no

    File URL: http://downloads.hindawi.com/journals/mpe/2022/6798505.xml
    Download Restriction: no

    File URL: https://libkey.io/10.1155/2022/6798505?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.