Authors
- Samar Fathy (Faculty of Computers and Information, Helwan University, Helwan, Egypt)
- Nahla El-Haggar (Faculty of Computers and Information, Helwan University, Helwan, Egypt)
- Mohamed H. Haggag (Faculty of Computers and Information, Helwan University, Helwan, Egypt)
Abstract
Emotions can be judged from a combination of cues such as speech, facial expressions, and actions. Emotions are also expressed in text. This paper presents a new hybrid model for detecting emotion from text, which combines ontology matching with keyword semantic similarity. Each text is labelled with one of the six basic Ekman emotion categories. The main idea is to extract an ontology from the input sentence and match it against an ontology base built from simple ontologies, each paired with an emotion. The ontology is extracted from the input sentence using a triplet (subject, predicate, object) extraction algorithm, and the extracted ontology is then matched against the ontology base. The emotion of the input sentence is the emotion of the ontology that matches it with the highest score. If the extracted ontology does not match any ontology in the ontology base, the keyword semantic similarity approach is used instead. The suggested approach depends on the meaning of each sentence and on the syntactic and semantic analysis of the context.
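The sketch below illustrates, in outline only, the two-stage pipeline the abstract describes: triplet extraction, ontology matching with a best-score decision, and a keyword-similarity fallback. The triplet extractor, scoring function, ontology base, and keyword lexicon here are naive stand-ins invented for illustration, not the authors' actual components.

```python
# Minimal, runnable sketch of the hybrid emotion-detection pipeline.
# All helpers below are simplified placeholders, not the paper's method.

EKMAN_EMOTIONS = ("anger", "disgust", "fear", "joy", "sadness", "surprise")

def extract_triplet(sentence):
    """Very rough (subject, predicate, object) split; a real system
    would use a syntactic parser for triplet extraction."""
    words = sentence.lower().strip(".!?").split()
    return (words[0] if words else "",
            words[1] if len(words) > 1 else "",
            " ".join(words[2:]))

def match_score(triplet, onto_triplet):
    """Fraction of triplet slots that coincide with an ontology entry."""
    hits = sum(1 for a, b in zip(triplet, onto_triplet) if a and a == b)
    return hits / 3.0

def keyword_similarity_emotion(sentence, keyword_lexicon):
    """Fallback: pick the emotion whose keywords overlap the sentence most."""
    words = set(sentence.lower().strip(".!?").split())
    best = max(keyword_lexicon.items(), key=lambda kv: len(words & kv[1]))
    return best[0] if words & best[1] else None

def detect_emotion(sentence, ontology_base, keyword_lexicon, threshold=0.5):
    """ontology_base: list of ((subject, predicate, object), emotion) pairs."""
    triplet = extract_triplet(sentence)
    best_emotion, best_score = None, threshold
    for onto_triplet, emotion in ontology_base:
        score = match_score(triplet, onto_triplet)
        if score > best_score:
            best_emotion, best_score = emotion, score
    # No sufficiently strong ontology match: use keyword semantic similarity.
    return best_emotion or keyword_similarity_emotion(sentence, keyword_lexicon)

if __name__ == "__main__":
    base = [(("she", "lost", "her dog"), "sadness")]
    lexicon = {"joy": {"happy", "delighted"}, "fear": {"scared", "afraid"}}
    print(detect_emotion("She lost her dog.", base, lexicon))    # sadness
    print(detect_emotion("I am so scared now.", base, lexicon))  # fear (fallback)
```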
Suggested Citation
Samar Fathy & Nahla El-Haggar & Mohamed H. Haggag, 2017.
"A Hybrid Model for Emotion Detection from Text,"
International Journal of Information Retrieval Research (IJIRR), IGI Global, vol. 7(1), pages 32-48, January.
Handle:
RePEc:igg:jirr00:v:7:y:2017:i:1:p:32-48