Author
Listed:
- Trang Thanh Quynh Le (University of St. Thomas, USA)
- Thuong-Khanh Tran (University of Oulu, Finland)
- Manjeet Rege (University of St. Thomas, USA)
Abstract
A facial micro-expression is a subtle, involuntary facial expression of short duration and low intensity that can disclose hidden feelings. Micro-expression analysis has received substantial attention because of its potential value in a wide range of practical applications, and a number of studies have proposed sophisticated hand-crafted feature representations to support automatic micro-expression recognition. This paper employs a dynamic image computation method to extract features from localized facial regions, and deep convolutional networks to identify the micro-expressions presented in the extracted dynamic images. The proposed framework is simpler than existing frameworks that rely on complex hand-crafted feature descriptors. For performance evaluation, the framework is tested on three publicly available databases, as well as on an integrated database in which the individual databases are merged into a single data pool. Results from the series of experiments show that the technique is promising for recognizing micro-expressions.
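The dynamic images mentioned in the abstract are typically produced by rank pooling a short frame sequence into a single image. The sketch below is a minimal illustration of the widely used approximate rank-pooling formulation (Bilen et al.); it is an assumption for illustration only, not the authors' implementation, and the function name is hypothetical.

```python
import numpy as np

def dynamic_image(frames):
    """Collapse a (T, H, W, C) frame stack into one dynamic image
    using approximate rank pooling (Bilen et al.); illustrative sketch only."""
    T = frames.shape[0]
    # Harmonic numbers H_0 .. H_T, with H_0 = 0.
    harmonics = np.concatenate(([0.0], np.cumsum(1.0 / np.arange(1, T + 1))))
    # Weights alpha_t = 2(T - t + 1) - (T + 1)(H_T - H_{t-1}), for t = 1..T.
    t = np.arange(1, T + 1)
    alpha = 2.0 * (T - t + 1) - (T + 1) * (harmonics[T] - harmonics[t - 1])
    # Weighted temporal sum, then rescale to 0-255 so it can be fed to a CNN.
    di = np.tensordot(alpha, frames.astype(np.float64), axes=(0, 0))
    di = 255.0 * (di - di.min()) / (di.max() - di.min() + 1e-8)
    return di.astype(np.uint8)
```

As described in the abstract, such an image can be computed for each localized facial region and then classified with a deep convolutional network.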
Suggested Citation
Trang Thanh Quynh Le & Thuong-Khanh Tran & Manjeet Rege, 2020.
"Rank-Pooling-Based Features on Localized Regions for Automatic Micro-Expression Recognition,"
International Journal of Multimedia Data Engineering and Management (IJMDEM), IGI Global Scientific Publishing, vol. 11(4), pages 25-37, October.
Handle: RePEc:igg:jmdem0:v:11:y:2020:i:4:p:25-37
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:igg:jmdem0:v:11:y:2020:i:4:p:25-37. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Journal Editor (email available below). General contact details of provider: https://www.igi-global.com .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.