Authors
Listed:
- Jianning Wu
- Jiajing Wang
- Yun Ling
Abstract
The joint reconstruction of nonsparse multi-sensor data with high quality is a challenging issue in human activity telemonitoring. In this study, we propose a novel joint reconstruction algorithm that combines distributed compressed sensing (DCS) with multiple block sparse Bayesian learning (MBSBL). The basic idea is as follows: based on the joint sparsity model, the DCS technique is first applied to simultaneously compress the multi-sensor data, capturing the high-correlation information about the activity while improving the energy efficiency of the sensors; the MBSBL technique is then employed to jointly recover the nonsparse multi-sensor data with high fidelity by exploiting their joint block sparsity. Multi-sensor acceleration data from an open wearable action recognition database are selected to assess the practicality of the proposed technique. To further examine its effectiveness, a sparse representation classification model is used to classify activity patterns from the jointly reconstructed data. The results show that, with properly selected compression rates, the proposed technique achieves the best joint reconstruction performance and sensor energy efficiency, which in turn yields the best sparse representation classification–based activity classification performance. This has great potential for energy-efficient telemonitoring of human activity.
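The DCS compression stage described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes, per the joint sparsity model, that every sensor node compresses its signal window with the same random Gaussian measurement matrix, so the node-side cost is a single matrix–vector product (the MBSBL recovery step at the receiver is omitted here). All dimensions and names (`Phi`, `X`, `Y`, the 512-sample window, the 0.25 compression rate) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 512          # samples per sensor window (assumed)
m = 128          # measurements per window -> compression rate m/n = 0.25
num_sensors = 3  # e.g. tri-axial accelerometer streams

# Shared random Gaussian measurement matrix; cheap to apply on the node,
# which is where the energy saving in the abstract comes from.
Phi = rng.standard_normal((m, n)) / np.sqrt(m)

# Simulated nonsparse acceleration signals, one row per sensor.
X = rng.standard_normal((num_sensors, n))

# Distributed compression: every sensor applies the same Phi
# (joint sparsity model), y_i = Phi @ x_i.
Y = X @ Phi.T    # shape (num_sensors, m)

assert Y.shape == (num_sensors, m)
```

At the receiver, the compressed rows of `Y` would be handed jointly to a block sparse Bayesian learning solver, which exploits the correlation across sensors to recover the nonsparse signals with high fidelity.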
Suggested Citation
Jianning Wu & Jiajing Wang & Yun Ling, 2018.
"DCS-based MBSBL joint reconstruction of multi-sensors data for energy-efficient telemonitoring of human activity,"
International Journal of Distributed Sensor Networks, vol. 14(3), article 1550147718767612, March.
Handle:
RePEc:sae:intdis:v:14:y:2018:i:3:p:1550147718767612
DOI: 10.1177/1550147718767612