Article

Fusing wearable and remote sensing data streams by fast incremental learning with swarm decision table for human activity recognition

Journal

INFORMATION FUSION
Volume 60, Pages 41-64

Publisher

ELSEVIER
DOI: 10.1016/j.inffus.2020.02.001

Keywords

Kinect depth sensor; Wearable sensor; Data mining; Classification model; Feature selection

Funding

  1. RDAO/FST [MYRG2016-00069]
  2. University of Macau [MYRG2016-00069]
  3. Macau SAR government [MYRG2016-00069]
  4. FDCT Macau [FDCT/126/2014/A3]

Abstract

Human activity recognition (HAR) by machine learning finds wide application, ranging from posture monitoring for healthcare and rehabilitation to the detection of suspicious or dangerous actions for security surveillance. Infrared cameras such as the Microsoft Kinect and wearable sensors have been the two most widely adopted devices for collecting data on bodily movements. These two sensor types are generally categorized as contactless sensing and contact sensing, respectively. Owing to hardware constraints, each sensor type has inherent limitations. The most common problem associated with contactless sensing such as Kinect is the distance and indirect viewing angle between the camera and the subject; wearable sensors, in turn, are limited in recognizing complex human activities. In this paper, a novel data fusion framework is proposed for combining data collected from both sensor types with the aim of enhancing HAR accuracy. Kinect can capture detailed bodily movements in complex activities, but its accuracy depends heavily on the viewing angle; wearable sensors are relatively primitive in gathering spatial data but reliable for detecting basic movements. Fusing the data from the two sensor types allows them to complement each other with their unique strengths. In particular, a new scheme using incremental learning with a decision table, coupled with swarm-based feature selection, is proposed in our framework to achieve fast and accurate HAR from the fused sensor data. Our experimental results show that HAR accuracy improves from 23.51% to 68.35% in a case where the Kinect view is slanted by almost 90 degrees and a wearable sensor is used at the same time. Swarm-based feature selection is shown to enhance HAR performance compared with standard feature selection methods. The results reported here demonstrate the potential of hybridized sensors from a machine learning perspective.
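
To make the scheme described in the abstract more concrete, the sketch below illustrates feature-level fusion of two sensor streams combined with a simple binary particle-swarm feature selection wrapper around an incrementally trained classifier. It is a minimal illustration under stated assumptions only: the feature dimensions, the synthetic data, and the use of scikit-learn's SGDClassifier (trained via partial_fit to mimic incremental learning) are choices made here for demonstration; the paper's actual swarm decision table classifier and its datasets are not reproduced.

# Illustrative sketch: fuse Kinect-style and wearable-style features, then run
# a minimal binary PSO over feature masks, scoring each mask with an
# incrementally trained classifier. All names and dimensions are assumptions.
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulated feature-level fusion: Kinect skeleton features (spatial)
# concatenated with wearable accelerometer features (inertial).
n_samples, n_kinect, n_wearable = 600, 60, 12
X_kinect = rng.normal(size=(n_samples, n_kinect))
X_wear = rng.normal(size=(n_samples, n_wearable))
X = np.hstack([X_kinect, X_wear])          # fused feature vector per sample
y = rng.integers(0, 4, size=n_samples)     # 4 hypothetical activity classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
classes = np.unique(y_tr)

def fitness(mask):
    """Held-out accuracy of an incremental classifier on the selected features."""
    if not mask.any():
        return 0.0
    clf = SGDClassifier(random_state=0)
    # partial_fit over mini-batches stands in for incremental (stream) learning
    for start in range(0, len(X_tr), 100):
        sl = slice(start, start + 100)
        clf.partial_fit(X_tr[sl][:, mask], y_tr[sl], classes=classes)
    return clf.score(X_te[:, mask], y_te)

# Minimal binary PSO over feature masks (swarm-based feature selection).
n_features, n_particles, n_iter = X.shape[1], 10, 15
pos = rng.random((n_particles, n_features))   # continuous particle positions
vel = np.zeros_like(pos)
pbest_pos, pbest_fit = pos.copy(), np.full(n_particles, -1.0)
gbest_pos, gbest_fit = None, -1.0

for _ in range(n_iter):
    # Sample binary feature masks through a sigmoid of the positions.
    masks = rng.random(pos.shape) < 1.0 / (1.0 + np.exp(-pos))
    fits = np.array([fitness(m) for m in masks])
    improved = fits > pbest_fit
    pbest_pos[improved], pbest_fit[improved] = pos[improved], fits[improved]
    if fits.max() > gbest_fit:
        gbest_fit, gbest_pos = fits.max(), pos[fits.argmax()].copy()
    # Standard PSO velocity and position update.
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest_pos - pos) + 1.5 * r2 * (gbest_pos - pos)
    pos = pos + vel

print(f"best held-out accuracy with selected feature subset: {gbest_fit:.3f}")

Because the data above are random, the reported accuracy hovers around chance level; with real fused Kinect and wearable features, the swarm search would instead converge toward the feature subset that best supports the incremental model.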
