Author
Listed:
- Theresia Ratih Dewi Saputri
- Adil Mehmood Khan
- Seok-Won Lee
Abstract
Advancements in wireless sensor networks have given rise to applications that provide friendly and intelligent services based on the recognition of human activities. Although the technology supports monitoring activity patterns, enabling applications to recognize activities in a user-independent manner remains a major challenge. Achieving this goal is difficult for two reasons: first, different people exhibit different physical patterns for the same activity because of differences in behavior; second, different activities performed by the same person may have different underlying models. It is therefore unwise to recognize different activities using the same features. This work presents a solution to this problem. The proposed system uses simple time-domain features with a single neural network and a three-stage genetic algorithm (GA)-based feature selection method for accurate user-independent activity recognition. The system was evaluated on six activities in a user-independent setting with 27 subjects, and its recognition performance was compared with well-known existing methods. An average accuracy of 93% in these experiments demonstrates the feasibility of the method for subject-independent human activity recognition.
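To illustrate the general idea behind GA-based feature selection (this is a minimal single-stage sketch, not the authors' three-stage method), a genetic algorithm can evolve binary masks over a feature set, keeping features that improve a fitness score. In a real system the fitness would be classifier accuracy on held-out subjects; here the hypothetical `relevance` scores stand in for that signal, and all names and parameters are illustrative assumptions.

```python
import random

random.seed(0)  # deterministic for reproducibility


def fitness(mask, relevance):
    """Toy fitness: reward selected relevant features, penalize subset size.
    A real system would instead train/evaluate a classifier on the subset."""
    score = sum(r for m, r in zip(mask, relevance) if m)
    return score - 0.1 * sum(mask)  # small per-feature cost favors compact subsets


def ga_select(relevance, pop_size=20, generations=30, p_mut=0.1):
    """Evolve binary feature masks with elitist selection, one-point
    crossover, and bit-flip mutation; return the best mask found."""
    n = len(relevance)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        # elitism: keep the top half unchanged each generation
        pop.sort(key=lambda m: fitness(m, relevance), reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n)          # one-point crossover
            child = a[:cut] + b[cut:]
            # bit-flip mutation with probability p_mut per gene
            children.append([1 - g if random.random() < p_mut else g
                             for g in child])
        pop = survivors + children
    return max(pop, key=lambda m: fitness(m, relevance))


# hypothetical relevance scores for 8 time-domain features
relevance = [0.9, 0.05, 0.8, 0.02, 0.7, 0.01, 0.6, 0.03]
best = ga_select(relevance)
print(best)
```

The elitist step guarantees the best fitness never decreases across generations; with the toy scores above, the GA tends toward a mask that keeps the high-relevance features and drops the rest. The paper's approach extends this idea over three stages and couples the fitness to a neural network's recognition accuracy.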
Suggested Citation
Theresia Ratih Dewi Saputri & Adil Mehmood Khan & Seok-Won Lee, 2014.
"User-Independent Activity Recognition via Three-Stage GA-Based Feature Selection,"
International Journal of Distributed Sensor Networks, vol. 10(3), pages 706287-706287, March.
Handle:
RePEc:sae:intdis:v:10:y:2014:i:3:p:706287
DOI: 10.1155/2014/706287
Corrections
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic, or download information, contact SAGE Publications.
Please note that corrections may take a couple of weeks to filter through the various RePEc services.