Author
Listed:
- C. Pabitha
(SRM Valliammai Engineering College)
- K. Revathi
(SRM Valliammai Engineering College)
- W. Gracy Theresa
(Panimalar Engineering College)
- Pornpimol Chawengsaksopark
(Shinawatra University)
- Mithileysh Sathiyanarayanan
(Shinawatra University)
Abstract
Email is used by around 87% of business communities for internal and external communication. Detecting emotion in email communication, together with visual analytics, is a frontier laden with potential but one that still faces challenges: measuring the effect of emotional cues, limited visualization capability, seamless integration into existing platforms, and implementing multimodal analysis. To overcome these challenges, this paper introduces EmoMAC, a multimodal architecture for analysing emotions in emails. By incorporating multimodal data from the MELD dataset, it enables a more comprehensive interpretation of emotional dynamics. For textual analysis, the proposed model incorporates a DeLighT transformer with attention scaling and multi-head attention. For extracting dynamic visual features from videos, a Dynamic Spatio-Temporal Feature Pyramid Network with Pyramid Pooling (DSTFP) is used. Contextual features, such as the sender-receiver relationship and timestamps obtained from the contextual schemes, are incorporated by the Bias-Induced Sparse Hierarchical Attention Module (BiSHAM), which uses a bias-aware attention module for feature fusion. For adaptability to new tasks or data, EmoMAC employs the MAML (Model-Agnostic Meta-Learning) algorithm. Using a Meaningful Neural Network (MNN), EmoMAC integrates text, image, and contextual data for emotion detection within emails. Rigorous evaluation yields an accuracy of 90.10%, precision of 95.23%, recall of 91.65%, F1 score of 92.45%, and weighted-average F1 score of 88.47%, validating EmoMAC's efficacy in capturing emotional nuances and providing insights for visual analytics of emotions within email.
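The abstract's text branch relies on multi-head attention with attention scaling. The paper's trained parameters and exact DeLighT configuration are not available here, so the following is a minimal NumPy sketch of standard scaled dot-product multi-head attention, with random matrices standing in for the learned projection weights:

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, rng):
    """Scaled dot-product attention split across heads.
    x: (seq_len, d_model). The Q/K/V/output weights are random
    placeholders, not the paper's trained parameters."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    wq, wk, wv, wo = (rng.normal(scale=d_model ** -0.5,
                                 size=(d_model, d_model)) for _ in range(4))
    q, k, v = x @ wq, x @ wk, x @ wv
    # Reshape each projection to (num_heads, seq_len, d_head)
    split = lambda t: t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    q, k, v = map(split, (q, k, v))
    # "Attention scaling": divide scores by sqrt(d_head)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    out = softmax(scores) @ v                     # (heads, seq_len, d_head)
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ wo

rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 16))   # 5 token embeddings of width 16
attended = multi_head_attention(tokens, num_heads=4, rng=rng)
# attended keeps the input shape: (5, 16)
```

The scaling by the square root of the per-head dimension keeps the dot-product scores in a range where the softmax does not saturate, which is the role "attention scaling" plays in transformer text encoders generally.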
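EmoMAC's adaptability to new tasks or data is attributed to MAML. As the paper's meta-learning setup is not reproduced here, the sketch below shows first-order MAML on a hypothetical one-parameter regression family: each task fits y = a·x for a different slope a, the inner loop adapts per task with one gradient step, and the outer loop updates the meta-parameter from the post-adaptation gradients:

```python
import numpy as np

def loss(w, x, y):
    # Mean squared error of the one-parameter model y_hat = w * x
    return np.mean((w * x - y) ** 2)

def grad(w, x, y):
    # Analytic gradient of the MSE with respect to w
    return np.mean(2 * (w * x - y) * x)

def maml_step(w, tasks, inner_lr=0.1, outer_lr=0.1):
    """One first-order MAML meta-update: adapt per task, then
    average the gradients taken at the adapted parameters."""
    meta_grad = 0.0
    for (x_tr, y_tr), (x_val, y_val) in tasks:
        w_adapted = w - inner_lr * grad(w, x_tr, y_tr)   # inner loop
        meta_grad += grad(w_adapted, x_val, y_val)        # outer gradient
    return w - outer_lr * meta_grad / len(tasks)

# Toy task family: slopes 1.5, 2.0, 2.5, with train/validation splits
rng = np.random.default_rng(0)
tasks = []
for a in (1.5, 2.0, 2.5):
    x = rng.normal(size=20)
    tasks.append(((x[:10], a * x[:10]), (x[10:], a * x[10:])))

w = 0.0
for _ in range(200):
    w = maml_step(w, tasks)
# w settles near the centre of the task family's slopes (around 2.0),
# a starting point from which one inner step adapts well to any task
```

The meta-parameter is trained not to fit any one task but to be a good initialization for fast adaptation, which is what "versatility for new tasks or data" means in the MAML setting.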
Suggested Citation
C. Pabitha & K. Revathi & W. Gracy Theresa & Pornpimol Chawengsaksopark & Mithileysh Sathiyanarayanan, 2025.
"EmoMAC: a bias-induced multimodal fusion model for emotional analysis with visualization analytics enabled through super affective computing in emails,"
Journal of Combinatorial Optimization, Springer, vol. 50(3), pages 1-48, October.
Handle:
RePEc:spr:jcomop:v:50:y:2025:i:3:d:10.1007_s10878-025-01356-6
DOI: 10.1007/s10878-025-01356-6
Download full text from publisher
As access to this document is restricted, you may want to search for a different version of it.
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:jcomop:v:50:y:2025:i:3:d:10.1007_s10878-025-01356-6. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.