NMIXX: Domain-Adapted Neural Embeddings for Cross-Lingual eXploration of Finance
Author
Abstract
Suggested Citation
References listed on IDEAS
- Guijin Son & Hanwool Lee & Nahyeon Kang & Moonjeong Hahm, 2023. "Removing Non-Stationary Knowledge From Pre-Trained Language Models for Entity-Level Sentiment Classification in Finance," Papers 2301.03136, arXiv.org, revised Jan 2023.
- Nhu Khoa Nguyen & Thierry Delahaut & Emanuela Boros & Antoine Doucet & Gael Lejeune, 2023. "Contextualizing Emerging Trends in Financial News Articles," Papers 2301.11318, arXiv.org.
- Yewon Hwang & Sungbum Jung & Hanwool Lee & Sara Yu, 2025. "TWICE: What Advantages Can Low-Resource Domain-Specific Embedding Model Bring? -- A Case Study on Korea Financial Texts," Papers 2502.07131, arXiv.org, revised Apr 2025.
Most related items
These are the items that most often cite the same works as this one and are cited by the same works as this one.
- Yewon Hwang & Sungbum Jung & Hanwool Lee & Sara Yu, 2025. "TWICE: What Advantages Can Low-Resource Domain-Specific Embedding Model Bring? -- A Case Study on Korea Financial Texts," Papers 2502.07131, arXiv.org, revised Apr 2025.
More about this item
NEP fields
This paper has been announced in the following NEP Reports:
- NEP-CMP-2025-08-18 (Computational Economics)
- NEP-INV-2025-08-18 (Investment)
Statistics
Access and download statistics

Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2507.09601. See general information about how to correct material in RePEc.