Journal
EXPERT SYSTEMS WITH APPLICATIONS
Volume 59, Pages 235-244
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2016.04.032
Keywords
Human activity recognition; Deep learning; Convolutional neural network; Smartphone; Sensors
Funding
- MSIP (Ministry of Science, ICT and Future Planning), Korea, under the ITRC (Information Technology Research Center) support program [IITP-2016-R0992-15-1011]
- Institute for Information & Communication Technology Planning & Evaluation (IITP), Republic of Korea [R0992-16-1011] Funding Source: Korea Institute of Science & Technology Information (KISTI), National Science & Technology Information Service (NTIS)
Human activities are inherently translation invariant and hierarchical. Human activity recognition (HAR), a field that has attracted considerable attention in recent years due to high demand across various application domains, uses time-series sensor data to infer activities. In this paper, a deep convolutional neural network (convnet) is proposed to perform efficient and effective HAR using smartphone sensors by exploiting the inherent characteristics of activities and 1D time-series signals, while providing a way to automatically and data-adaptively extract robust features from raw data. Experiments show that convnets indeed derive relevant and increasingly complex features with every additional layer, although the difference in feature complexity decreases with each additional layer. A wider convolutional kernel (1 x 9 to 1 x 14), which exploits a longer span of temporal local correlation, and a small pooling size (1 x 2 to 1 x 3) are shown to be beneficial. Convnets also achieved almost perfect classification of moving activities, especially very similar ones that were previously perceived to be very difficult to classify. Lastly, convnets outperform other state-of-the-art data mining techniques in HAR on the benchmark dataset collected from 30 volunteer subjects, achieving an overall performance of 94.79% on the test set with raw sensor data, and 95.75% with additional temporal fast Fourier transform information of the HAR data set. (C) 2016 Published by Elsevier Ltd.
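The building block the abstract describes is a 1D convolution over a raw sensor window, followed by a nonlinearity and max pooling with a small pooling size. A minimal NumPy sketch of that block is below; the helper names, the ReLU activation, and the window length of 128 samples are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def conv1d(signal, kernel):
    """Valid 1D convolution (cross-correlation) of a single-channel signal."""
    k = len(kernel)
    return np.array([signal[i:i + k] @ kernel
                     for i in range(len(signal) - k + 1)])

def max_pool1d(x, size):
    """Non-overlapping max pooling; any trailing remainder is dropped."""
    n = len(x) // size
    return x[:n * size].reshape(n, size).max(axis=1)

rng = np.random.default_rng(0)
signal = rng.standard_normal(128)   # e.g. one accelerometer axis window (assumed length)
kernel = rng.standard_normal(9)     # 1 x 9 receptive field, the low end of the reported range
feat = np.maximum(conv1d(signal, kernel), 0.0)  # ReLU (assumed nonlinearity)
pooled = max_pool1d(feat, 3)        # 1 x 3 pooling, the high end of the reported range
print(len(feat), len(pooled))       # 120 40
```

Stacking several such conv-pool stages, each re-convolving the pooled output of the previous one, is what lets the network derive increasingly complex features layer by layer, as the experiments report.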