The Open Bioinformatics Journal




ISSN: 1875-0362 ― Volume 12, 2019
RESEARCH ARTICLE

Android Library for Recognition of Activities of Daily Living: Implementation Considerations, Challenges, and Solutions



Ivan Miguel Pires1, 2, 3, *, Maria Canavarro Teixeira4, 5, Nuno Pombo1, Nuno M. Garcia1, Francisco Flórez-Revuelta6, Susanna Spinsante7, Rossitza Goleva8, Eftim Zdravevski9
1 Instituto de Telecomunicações, Universidade da Beira Interior, Covilhã, Portugal
2 Altran Portugal, Lisbon, Portugal
3 ALLab - Assisted Living Computing and Telecommunications Laboratory, Computing Science Department, Universidade da Beira Interior, Covilhã, Portugal
4 UTC de Recursos Naturais e Desenvolvimento Sustentável, Polytechnic Institute of Castelo Branco, Castelo Branco, Portugal
5 CERNAS - Research Centre for Natural Resources, Environment and Society, Polytechnic Institute of Castelo Branco, Castelo Branco, Portugal
6 Department of Computer Technology, Universidad de Alicante, Alicante, Spain
7 Department of Information Engineering, Marche Polytechnic University, Ancona, Italy
8 Department of Informatics, New Bulgarian University, Sofia, Bulgaria
9 Faculty of Computer Science and Engineering, University Ss Cyril and Methodius, Skopje, Macedonia

Abstract

Background:

Off-the-shelf mobile devices have several sensors available onboard that may be used for the recognition of Activities of Daily Living (ADL) and the environments where they are performed. This research is focused on the development of Ambient Assisted Living (AAL) systems, using mobile devices for the acquisition of different types of data related to the physical and physiological conditions of the subjects and the environments. Mobile devices running the Android operating system are the least expensive and have the largest market share, while providing a variety of models and onboard sensors.

Objective:

This paper describes the implementation considerations, challenges, and solutions of a framework for the recognition of ADL and environments, provided as an Android library. The framework adapts to the number of sensors available in different mobile devices and utilizes a variety of activity recognition algorithms to provide rapid feedback to the user.

Methods:

The Android library includes data fusion, data processing, feature engineering and classification methods. The sensors that may be used are the accelerometer, the gyroscope, the magnetometer, the Global Positioning System (GPS) receiver and the microphone. The data processing includes the application of data cleaning methods and the extraction of features, which are used with Deep Neural Networks (DNN) for the classification of ADL and environments. Throughout this work, the limitations of the mobile devices were explored and their effects have been minimized.

Results:

The implementation of the Android library achieved an overall accuracy between 58.02% and 89.15%, depending on the number of sensors used and the number of ADL and environments recognized. Compared with the results available in the literature, the library achieved a mean improvement of 2.93% and, based on Student's t-test, its results do not differ significantly from the best results reported in prior work.

Conclusion:

This study shows that ADL such as walking, going upstairs and downstairs, running, watching TV, driving, sleeping and standing, and the bedroom, cooking/kitchen, gym, classroom, hall, living room, bar, library and street environments may be recognized with the sensors available in off-the-shelf mobile devices. Finally, these results may serve as preliminary research for the development of a personal digital life coach based on a multi-sensor mobile device commonly used daily.

Keywords: Activities of daily living, Sensors, Mobile devices, Pattern recognition, Data fusion, Android library, Artificial neural networks, Recognition.


Article Information


Identifiers and Pagination:

Year: 2018
Volume: 11
First Page: 61
Last Page: 88
Publisher Id: TOBIOIJ-11-61
DOI: 10.2174/1875036201811010061

Article History:

Received Date: 6/2/2018
Revision Received Date: 12/4/2018
Acceptance Date: 30/4/2018
Electronic publication date: 23/05/2018
Collection year: 2018


© 2018 Pires et al.

open-access license: This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International Public License (CC-BY 4.0), a copy of which is available at: (https://creativecommons.org/licenses/by/4.0/legalcode). This license permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.


* Address correspondence to this author at the Computer Science Department, Universidade da Beira Interior, Rua Marquês d’Ávila e Bolama, 6201-001 Covilhã, Portugal; Tel: +351 966379785; E-mail: impires@it.ubi.pt





1. INTRODUCTION

1.1. Background

Mobile devices like smartphones commonly used in daily life can be classified as multi-sensor devices [1], because they host several sensors, e.g., accelerometer, gyroscope, magnetometer, microphone, Global Positioning System (GPS) receiver and camera, that allow the acquisition of different types of data, including some of the physical and physiological data pertaining to the user, thus allowing for the recognition of Activities of Daily Living (ADL) [2, 3].

The development of Ambient Assisted Living (AAL) systems has many challenges [4-8], such as timely and accurate recognition of ADL [9, 10], designing architectures that are applicable for Enhanced Living Environments [11, 12], and providing reliable systems that integrate with other systems [13]. ADL recognition is also important for the design of a personal digital life coach [14]. The development of these systems is especially important to support the autonomy of older users, patients with chronic diseases and users with some type of disability [15, 16]. However, these systems may be useful for everyone, including athletes and young users, as the proposed framework can be integrated into a tool for the monitoring and training of lifestyles [14].

1.2. Motivation

The recognition of ADL and the environment where the subject is at a given moment is of great importance for the creation of systems aimed at several purposes, such as personal coaching, health monitoring and lifestyle monitoring. Nowadays, such recognition is commonly performed by complex and expensive systems with high processing power and memory capabilities [4-6, 11-13], but it could be performed with commonly used equipment and local processing techniques, in order to provide feedback on the ADL performed and the environments frequented in near real-time.

Our motivation is to solve this problem with the development of an Android library that performs the recognition of ADL and the associated environments, using only local processing of the data acquired from the sensors available onboard off-the-shelf mobile devices (i.e., smartphones). We selected the Android platform for the development of the library because it is the most used open platform available in the market [17], and the architecture of an Android library for the recognition of ADL and environments has already been proposed in [18-20].

The use of off-the-shelf mobile devices for this purpose requires the development of lightweight processing techniques. In fact, despite recent developments and improvements, these devices usually have several limitations, including reduced processing capabilities, low-capacity batteries and small internal storage space.

However, based on previous studies [21-25], we confirm that the use of the sensors available in off-the-shelf mobile devices may fuel the development of ADL recognition systems, with the potential to improve the quality of life of their users. With this study, we present a real-time mobile application that reliably identifies ADL, environments and location, providing feedback to the user. The recognition of ADL featuring the subject's movement is based on the data acquired from the motion and/or magnetic sensors (i.e., accelerometer, gyroscope and magnetometer), whereas the recognition of the environment is based on acoustic data acquired from the microphone. To better recognize an ADL when the user is standing still, the features extracted from the data acquired from the motion and magnetic sensors are fused with the environment previously recognized and with the features extracted from the data acquired from the GPS receiver.

1.3. Prior Work

To date, and based on the literature, there are several studies using different subsets of the sensors available on smartphones. Yet, no studies were found reporting methods that rely on a larger set of sensors available in off-the-shelf mobile devices for the recognition of ADL and their environments. For comparison with this research, we analyzed 36 studies available in the literature that use Artificial Neural Networks (ANN) for the recognition of ADL and environments. The implementation of ANN on mobile devices is widely studied and several frameworks have been developed. Other types of methods have also been used in the literature, including neural networks and ensemble learning methods [26]; however, ensemble learning methods often combine several classifiers (thus requiring higher computational resources), although they can be used with larger datasets than neural network methods, which is relevant for the datasets available in the storage of mobile devices [27].

The use of ANN, with variants, to recognize different types of ADL involving the subject's movements is quite well documented in the literature. Several works, as the following discussion shows, addressed this method, reporting different accuracies on different datasets. There are several studies using only the accelerometer sensor: the authors of [28] recognized the walking and standing activities using a Multilayer Perceptron (MLP), with a reported accuracy around 95%. Bayat et al. [29] implemented the MLP method using features extracted from the accelerometer data for the recognition of running, walking, dancing, going upstairs and going downstairs, reporting an accuracy of 89.72%. In [30], the authors used the accelerometer data and implemented the MLP method for the recognition of going upstairs, going up on an escalator and walking, reporting an average accuracy of 94%. The study in [31] implemented the MLP method for the recognition of several activities, such as walking, jogging, going upstairs, going downstairs, sitting, standing and lying, reporting a maximum accuracy of 94.44%. The standing, walking and running activities were recognized with a minimum accuracy of 89% using the MLP method [32]. The authors of [33] implemented the MLP method for comparison with the results obtained with a Sliding-Window-based Hidden Markov Model (SW-HMM), reporting an accuracy always higher than 80% in the recognition of walking, standing, running, going downstairs and going upstairs. The MLP method was implemented by the authors of [34] for the recognition of standing, sitting, walking, lying and falling activities, reporting an accuracy higher than 90%. Using the same method, the authors of [35] reported that the recognition of the standing, walking, jogging, jumping, going upstairs, going downstairs and sitting activities has an accuracy higher than 90%. In [36], the walking and standing activities were recognized with an accuracy between 75% and 98% using the MLP method. The MLP method was used by the authors of [37] for the recognition of standing, walking, going upstairs, going downstairs and running activities, reporting an accuracy of 92%. Cycling, standing, walking, jogging, driving and running are other activities recognized by the authors of [38] with the MLP method, reporting an accuracy between 57.53% and 97.58%. In [39], the MLP method was used for the recognition of walking, going upstairs, going downstairs, standing, sitting and jogging activities, reporting an accuracy around 97%. In order to identify the activities performed during a football game, the authors of [40] used the MLP method with a reported accuracy higher than 95%. In [41], the walking, going upstairs, jogging, going downstairs, standing and sitting activities were recognized with the MLP method, which reported an accuracy of 97.7%. The study in [42] presented the implementation of the MLP method for the recognition of walking, jogging, going upstairs, going downstairs, sitting and standing activities, reporting an average accuracy of 82.3%. In [43], the MLP method was used for the recognition of cycling, running, sitting, standing and walking with a minimum reported accuracy of 61%. In addition to the use of the MLP method with accelerometer data as input, other methods are gaining traction, such as the DNN and Probabilistic Neural Network (PNN) methods. The DNN method was used by the authors of [44] for the recognition of walking, running, standing, sitting, lying, going upstairs and going downstairs activities, with a reported accuracy between 77% and 99%. The PNN method was used by the authors of [45] for the recognition of walking, jogging, cycling, going upstairs and going downstairs activities, reporting an average accuracy of 98%.
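The MLP classifiers recurring in the studies above can be illustrated with a minimal forward pass. This is a hand-wired sketch: the weights are fixed for demonstration only (the cited works learn them by backpropagation on labelled accelerometer windows), and the feature encoding and class labels are assumptions for illustration.

```python
import math

# Illustrative, hand-wired MLP forward pass: one hidden ReLU layer followed
# by a softmax output layer. The weights are chosen by hand so that a
# low-variance accelerometer window maps to "standing" and a high-variance
# window to "walking"; real studies learn such weights from labelled data.

def relu(x):
    return max(0.0, x)

def mlp_forward(features, w_hidden, w_out):
    """Compute class probabilities for one feature vector."""
    hidden = [relu(sum(w * f for w, f in zip(ws, features))) for ws in w_hidden]
    scores = [sum(w * h for w, h in zip(ws, hidden)) for ws in w_out]
    # Softmax turns raw scores into probabilities summing to 1.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Input: [bias, standard deviation of an accelerometer window];
# output classes: ["standing", "walking"].
w_hidden = [[0.0, 4.0], [1.0, -4.0]]
w_out = [[-2.0, 2.0], [2.0, -2.0]]

still = mlp_forward([1.0, 0.05], w_hidden, w_out)   # low-variance window
moving = mlp_forward([1.0, 2.0], w_hidden, w_out)   # high-variance window
```

With learned weights and richer feature vectors, the same forward pass generalizes to the multi-activity settings reported in the surveyed studies.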

Among the studies that fuse the data acquired from the accelerometer and gyroscope sensors, the authors of [46] used the MLP method for the recognition of standing, walking, running, sitting, lying, cycling, driving, cleaning, cooking, taking medication, sweeping, washing hands and watering plants activities, reporting an average accuracy of 65.44%. The MLP method was used in [47] for the recognition of walking, jumping, running, going downstairs and going upstairs, reporting an average accuracy of 84.53%. For the recognition of falling activities, the authors of [48] implemented the MLP method and reported an accuracy of 85.87%. The walking, jogging, sitting, going upstairs and going downstairs activities were recognized with the MLP method, reporting an average accuracy of 71.25% [49]. In [50], the authors implemented the MLP method for the recognition of movements related to getting on and off buses, reporting an average accuracy of 86%.

A system that uses Support Vector Machines (SVM) with tuned parameters and histogram-based features extracted from multiple accelerometers and gyroscopes is proposed in [51]. This is a valid approach because such features are easy to compute. A balanced accuracy over 85% is reported.
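Histogram features of the kind used in [51] are indeed cheap to compute, as the following sketch shows: each sensor axis is summarised by the fraction of samples falling into fixed value bins. The bin edges and sample values here are illustrative; the cited work engineers the bins per dataset.

```python
# Minimal sketch of histogram-based features for time-series data: each
# sensor axis is summarised by the normalised occupancy of fixed value bins.
# Bin edges below are illustrative assumptions, not those of the cited work.

def histogram_features(samples, edges):
    """Return the fraction of samples in each bin defined by `edges`."""
    counts = [0] * (len(edges) - 1)
    for x in samples:
        for i in range(len(edges) - 1):
            if edges[i] <= x < edges[i + 1]:
                counts[i] += 1
                break
    n = len(samples)
    return [c / n for c in counts]

# One axis of an (illustrative) accelerometer window, in g.
accel_axis = [-0.8, -0.1, 0.0, 0.2, 0.9, 1.4, -1.2, 0.1]
feats = histogram_features(accel_axis, edges=[-2.0, -1.0, 0.0, 1.0, 2.0])
```

The resulting fixed-length vector can be fed directly to an SVM, which is what makes this representation attractive on resource-constrained devices.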

An approach based on automatic feature engineering is used in [9] for the recognition of a variety of ADL, using sensors placed at multiple locations. Depending on the dataset, an accuracy from 90% to 99% is reported. Interestingly, the best accuracy achieved with SVM, Random Forest and Extra Trees is comparable to the accuracy of much more lightweight approaches like logistic regression and Naïve Bayes. A similar approach was proven to work for jogging detection in [52], even under field conditions when the subject wears accelerometers all the time.

In [53], the accelerometer and gravity sensors' data were fused and the MLP method was used for the recognition of walking, going upstairs, going downstairs, standing and lying activities, reporting an accuracy higher than 90%.

Combining the data acquired from the accelerometer and the GPS receiver, the MLP method handled the recognition of standing, walking, and travelling by car and train, reporting a minimum accuracy of 70% [54].

In [55], the data acquired from the accelerometer, the magnetometer, the gyroscope and the GPS receiver were fused and the MLP method was applied for the recognition of running, walking, sitting, going upstairs and going downstairs activities, reporting a recognition accuracy higher than 70%.

The acoustic data may be exploited for the recognition of different environments using ANN. The authors of [56] implemented the MLP method for the recognition of sounds of emergency vehicles, reporting an accuracy around 96.70%. In [57], several sounds were recognized with the MLP method, such as ball impact, metal impact, wood impact, plastic impact, opening/closing doors, typing, knocking, telephone ringing, grains falling, spray and whistle, reporting an average accuracy in environment recognition of 92.41%. The sounds of sneezing, dog barking, clock ticking, baby crying, rooster crowing, rain, sea waves, fire crackling, helicopter and chainsaw were recognized in [58] using the MLP method, reporting an accuracy around 94.5%. The authors of [59] recognized several sounds, including alarms, birds, clapping, dogs, motorcycles, rain, sea waves and wind, based on the application of the MLP method, and reported an accuracy around 54%. Other ANN methods have been used, such as Feedforward Neural Network (FNN), Recurrent Neural Network (RNN) and DNN methods. FNN were implemented by the authors of [60] for the recognition of emergency vehicles, car horns and normal street sounds, reporting an accuracy between 80% and 100%. The authors of [61] implemented DNN and RNN methods for the recognition of sounds related to a baby crying and a smoke alarm, reporting an accuracy around 90%. The DNN method was also used for the recognition of different sounds, including laughing, singing, crying, arguing, sighing, voice and music, reporting accuracies around 80% [62, 63].
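To illustrate how frame-level acoustic descriptors can separate environments, the sketch below computes two classic features, short-time energy and zero-crossing rate, on synthetic frames. The surveyed studies typically use richer representations (e.g., spectral or cepstral features), so this is only an assumption-laden illustration, and the "street" and "library" signals are synthetic stand-ins.

```python
import math

# Minimal acoustic feature sketch: short-time energy and zero-crossing rate
# computed per frame. These simple descriptors already separate a loud,
# high-frequency ("street-like") frame from a quiet ("library-like") one;
# real environment classifiers feed richer features to an ANN.

def frame_features(frame):
    """Return (energy, zero-crossing rate) for one audio frame."""
    n = len(frame)
    energy = sum(s * s for s in frame) / n
    zero_crossings = sum(
        1 for a, b in zip(frame, frame[1:]) if (a >= 0) != (b >= 0)
    )
    return energy, zero_crossings / (n - 1)

# Synthetic frames: a loud high-frequency tone vs. a faint slow drift.
street = [math.sin(2 * math.pi * 0.3 * i) for i in range(64)]
library = [0.01 * math.sin(2 * math.pi * 0.01 * i) for i in range(64)]

street_energy, street_zcr = frame_features(street)
library_energy, library_zcr = frame_features(library)
```

A sequence of such per-frame feature vectors, aggregated over a few seconds of microphone data, would form the input to the environment classifier.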

The acoustic data may also help in the recognition of some ADL: the authors of [64] recognized falling events with the MLP method, reporting an accuracy around 93%.

A summary of the literature overview discussed above is provided in Tables 1 to 5, which detail the number of ADL analyzed by each study and the accuracy obtained, both on average and for each single type of ADL considered. However, the accuracy values reported by the analyzed studies are not directly comparable, as they were obtained from different datasets.

Table 1
Summary of the literature studies related to ADL recognition based on the ANN method. The number of ADL analyzed, the average accuracy and the accuracy-related to each ADL are reported.


Table 2
Summary of the literature studies related to ADL recognition based on the ANN method. The number of ADL analyzed, the average accuracy and the accuracy-related to each ADL are reported. (cont.).


Table 3
Summary of the literature studies related to environment recognition based on the ANN method. The number of environments analyzed, the average accuracy and the accuracy-related to each environment are reported.


Table 4
Summary of the literature studies related to environment recognition based on the ANN method. The number of environments analyzed, the average accuracy and the accuracy related to each environment are reported. (cont.).


Table 5
Summary of the literature studies related to environment recognition based on the ANN method. The number of environments analyzed, the average accuracy and the accuracy related to each environment are reported. (cont.).


Taking into account the set of studies we selected and analyzed from the literature, as listed in Tables 1 to 5 and in the bibliography section of this work, we can conclude that the majority of the studies (51.43%) use only the accelerometer for the recognition of ADL, and only 9 studies (25.71%) used acoustic data for the recognition of the environments. Nevertheless, 5 studies (14.29%) present methods that fuse the data acquired from the accelerometer and gyroscope sensors. Other combinations are investigated in the literature, such as the fusion of the data acquired from the accelerometer and the GPS receiver (2.86%), from the accelerometer and the gravity sensor (2.86%), and from the accelerometer, the magnetometer, the gyroscope and the GPS receiver (2.86%).

The number of ADL recognized by each study varies between 1 and 13, but in 96% of the studies it lies between 1 and 7, and in half of those studies at most 5 ADL are recognized. The number of environments recognized by each study varies between 1 and 11, with a large dispersion: in 62.5% of those studies at most 5 environments are recognized, while the remaining 37.5% recognize 10 or 11 environments.

Tables 1 to 5 present the mapping between the analyzed studies and the ADL and environments they recognize, including the reported accuracies for further comparison with our study. The accuracies presented are related to the recognition of each individual ADL and environment.

Finally, Tables 6 and 7 summarize the average, the standard deviation and the coefficient of variation of the accuracy for each ADL and environment recognized in the previous works analyzed. The low values of the coefficient of variation (<30%) allow us to conclude that the average accuracies of both the recognized ADL and environments are representative.
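The representativeness criterion above can be sketched in code: the coefficient of variation is the ratio of the standard deviation to the mean, expressed as a percentage, and an average is taken as representative when it is below 30%. The sample values used below are illustrative, not data from the study.

```java
import java.util.Arrays;

// Sketch of the coefficient-of-variation criterion used above: an average
// accuracy is considered representative when CV = (std / mean) * 100 < 30%.
// The sample values in the test are illustrative, not data from the study.
public class CoefficientOfVariation {
    public static double mean(double[] x) {
        return Arrays.stream(x).average().orElse(0.0);
    }

    public static double std(double[] x) {
        double m = mean(x);
        double ss = 0.0;
        for (double v : x) ss += (v - m) * (v - m);
        return Math.sqrt(ss / x.length); // population standard deviation
    }

    public static double cv(double[] x) {
        return std(x) / mean(x) * 100.0; // percentage
    }

    public static boolean isRepresentative(double[] accuracies) {
        return cv(accuracies) < 30.0;
    }
}
```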

Table 6
Average accuracy of the ADL recognized.


Table 7
Average accuracy of the environments recognized.


1.4. Purpose of this Study

Based on our previous work [65Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review-68Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Spinsante S. Data Fusion on Motion and Magnetic Sensors embedded on Mobile Devices for the Identification of Activities of Daily Living engrxiv.org/x4r5z2017. In Review], the purpose of this study is the development of an Android library that, depending on the number and type of sensors available onboard the mobile device, implements the previously designed framework for the recognition of ADL and their environments, exploiting the largest number of sensors available on the mobile device used. The implementation should use lightweight methods so that the mobile application can give rapid feedback to the user, with the data processing methods implemented locally on the mobile device.

The previous work explored different implementations of ANN, such as the MLP with Backpropagation, the FNN with Backpropagation, and the DNN with normalized and non-normalized data. The framework implemented in this research is composed of several stages: data acquisition, data processing, data fusion and classification. Data acquisition and processing depend on the type of sensor: for the accelerometer, gyroscope and magnetometer data, a low-pass filter is applied; for the acoustic data, the Fast Fourier Transform (FFT) is applied for the extraction of the relevant frequencies of the audio signal; and for the location data no filters are applied, as the raw data is used for the measurement of the distance travelled [69Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Rodríguez ND. Validation Techniques for Sensor Data in Mobile Health Applications. J Sens 2016; 2016: 1687-725.
[http://dx.doi.org/10.1155/2016/2839372]
]. Following the application of the low-pass filter and the FFT to clean the data acquired, some features related to the sensors’ signals were extracted and fused. Finally, the classification methods were applied for the recognition of ADL and environments.
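The low-pass filtering step described above can be sketched as a simple exponential smoothing filter over three-axis sensor samples. The smoothing factor ALPHA below is an assumption for illustration; the paper does not report the filter coefficients used.

```java
// Sketch of a low-pass filter for accelerometer, gyroscope or magnetometer
// samples, as in the data-cleaning step described above. The exponential
// smoothing factor ALPHA is an assumed value, not one reported by the study.
public class LowPass {
    private static final float ALPHA = 0.25f; // assumed smoothing factor

    // Blends each new three-axis sample with the previous filtered output,
    // attenuating high-frequency noise. Returns the updated filtered sample.
    public static float[] filter(float[] input, float[] previous) {
        if (previous == null) {
            return input.clone(); // first sample passes through unchanged
        }
        float[] out = new float[input.length];
        for (int i = 0; i < input.length; i++) {
            out[i] = previous[i] + ALPHA * (input[i] - previous[i]);
        }
        return out;
    }
}
```

A smaller ALPHA smooths more aggressively at the cost of responsiveness; the appropriate value depends on the (unsteady) sampling rate discussed later.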

The recognition of ADL and their environments are separated into three stages. These are the recognition of common ADL (i.e., walking, going upstairs, going downstairs, running, standing and watching TV), the recognition of the environments (i.e., bar, classroom, bedroom, gym, hall, kitchen, library, street and living room), and the recognition of ADL without body motion (i.e., sleeping, watching TV and driving). Based on the outcomes of previous works, firstly, for the recognition of the common ADL the method that reported better results was DNN method with normalized data. Secondly, for the recognition of environments, the method that reported better results was FNN method with non-normalized data, but, when compared with the previous results, this method reports a low performance and accuracy during the implementation of the Android library, where the results obtained are better with DNN method with normalized data. Finally, for the recognition of ADL without body motion, the method that reported better accuracy was also DNN method with normalized data. The data used in this study is available in the ALLab MediaWiki [70August 2017- Multi-sensor data fusion in mobile devices for the identification of activities of daily living. Available from: https://allab.di.ubi.pt/ mediawiki/index.php/August_2017-_ Multi-sensor_data_fusion_in_mobile_devices_for_the_identification_of_activities_of_daily_living].

1.5. Structure of this Study

This section presents our motivation and related work for the development of an Android library for the recognition of ADL and their environments. The following sections present the methods used and results obtained with the developed Android library, including their discussion and comparison with research studies available in the literature with a similar purpose.

2. METHODS

2.1. Study Design and Participants

Off-the-shelf mobile devices are equipped with several sensors, e.g., accelerometer, gyroscope, magnetometer, microphone and GPS receiver, which allow the acquisition of several physical and physiological parameters.

This study is focused on the development of an Android library [71Bojinov H, Michalevsky Y, Nakibly G, Boneh D. Mobile device identification via sensor fingerprinting arXiv preprint arXiv 2014; 1408-16., 72Katevas K, Haddadi H, Tokarchuk L. Sensingkit: Evaluating the sensor power consumption in iOS devices. arxiv: 160.05576, 2016
[http://dx.doi.org/10.1109/IE.2016.50]
] for the processing of the sensors' data and the recognition of ADL and their environments. The mobile device used for the tests of the Android library was a BQ Aquarius device [73Smartphones. 2017. Available from: https://www.bq.com/pt/smartphones]; however, the sampling frequency of the sensors' data acquisition is difficult to control because, as the operating system works in a time-sharing manner, it cannot guarantee steady microprocessor attention to the sensor-related processes.

Subjects (15 males and 10 females) aged between 16 and 60 years were selected for the data acquisition with a mobile application that uses the developed Android library. Independently of gender, we selected people with distinct lifestyles: 10 subjects self-assessed their lifestyle as mainly active, and the other 15 subjects as mainly sedentary. During the use of the mobile application, the mobile device performed other tasks, such as receiving calls, using the Internet connection and running other mobile applications. However, most of the time the mobile device was kept in the front pocket of the user's pants to ensure correct data acquisition.

The developed mobile application acquired the sensors' data as a background service, acquiring and processing slots of 5 seconds of data; the ADL and environments are recognized from these 5 seconds of data alone. The data acquisition is performed every 5 minutes, and 2000 captures were collected for each ADL (i.e., walking, running, going upstairs, standing/watching TV, going downstairs, driving and sleeping) and environment (i.e., bedroom, bar, classroom, gym, kitchen, living room, hall, street and library), which corresponds to 10000 seconds (approximately 2.7 hours) for each ADL and environment. During the experiments, the user labelled the ADL and environment acquired, because these data were used to train and test the classification methods implemented in the Android library.
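The 5-second slot processing described above can be sketched as segmentation of the sample stream into fixed-length windows. The nominal sampling rate used here is an assumption for illustration, since the actual acquisition frequency of the device is not steady, as noted elsewhere in this paper.

```java
import java.util.Arrays;

// Sketch of the 5-second windowing applied to the acquired sensor stream.
// The nominal sampling rate passed in is an assumption for illustration;
// the paper notes that the real acquisition frequency is not constant.
public class WindowSegmenter {
    // Splits the sample stream into consecutive windows of windowSeconds,
    // discarding any incomplete trailing window.
    public static double[][] segment(double[] samples, int rateHz, int windowSeconds) {
        int windowLength = rateHz * windowSeconds;
        int windows = samples.length / windowLength;
        double[][] out = new double[windows][];
        for (int i = 0; i < windows; i++) {
            out[i] = Arrays.copyOfRange(samples, i * windowLength, (i + 1) * windowLength);
        }
        return out;
    }
}
```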

2.2. Description of the Framework

Following the previous work that explains the architecture and the method with best results in the recognition of ADL and their environments [18Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. From Data Acquisition to Data Fusion: A Comprehensive Review and a Roadmap for the Identification of Activities of Daily Living Using Mobile Devices. Sensors (Basel) 2016; 16(2): 184.
[http://dx.doi.org/10.3390/s16020184] [PMID: 26848664]
-20Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. Identification of Activities of Daily Living Using Sensors Available in off-the-shelf Mobile Devices: Research and Hypothesis 2016.
[http://dx.doi.org/10.1007/978-3-319-40114-0_14]
, 65Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review-68Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Spinsante S. Data Fusion on Motion and Magnetic Sensors embedded on Mobile Devices for the Identification of Activities of Daily Living engrxiv.org/x4r5z2017. In Review], the framework implemented is a collection of the methods that, according to the literature, provide the best performance in the recognition of ADL and environments, subject to the sensors available in the mobile device where the framework is executed. Thus, its development is separated into several modules: data acquisition, data processing, data fusion and classification methods.

Data acquisition methods are not provided by the Android library, which performs only the data processing, data fusion and classification. Data acquisition from the accelerometer, gyroscope, magnetometer, microphone and GPS receiver should be performed by the mobile application using the standard methods of the Android platform. For the remaining modules, the Android library adapts to the number of sensors available in the mobile device.

Data processing methods implemented in the Android library consist of data cleaning and feature extraction. The data cleaning method depends on the type of sensor: for accelerometer, gyroscope and magnetometer data, a low-pass filter is applied, and, for the acoustic data, the FFT is applied for the extraction of the relevant frequencies. The feature extraction also depends on the type of sensor: for accelerometer, gyroscope and magnetometer data, the features extracted are the five greatest distances between the maximum peaks, combined with the average, standard deviation, variance and median of the maximum peaks, and the standard deviation, average, maximum, minimum, variance and median of the raw signal. For the acoustic data, the features extracted are the 26 Mel-Frequency Cepstral Coefficients (MFCC), combined with the standard deviation, average, maximum, minimum, variance and median of the raw signal. Finally, for the location data, the feature extracted is the distance travelled.
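As an illustration of the statistical features listed above (average, standard deviation, variance, median, maximum and minimum of the raw signal), the following sketch computes them for one window of samples; the peak-distance and MFCC features are omitted for brevity.

```java
import java.util.Arrays;

// Sketch of the statistical features extracted from a raw sensor window:
// average, standard deviation, variance, median, maximum and minimum.
// The peak-distance and MFCC features described above are not shown here.
public class RawSignalFeatures {
    public static double average(double[] x) {
        return Arrays.stream(x).average().orElse(0.0);
    }

    public static double variance(double[] x) {
        double m = average(x);
        double ss = 0.0;
        for (double v : x) ss += (v - m) * (v - m);
        return ss / x.length; // population variance
    }

    public static double standardDeviation(double[] x) {
        return Math.sqrt(variance(x));
    }

    public static double median(double[] x) {
        double[] sorted = x.clone();
        Arrays.sort(sorted);
        int mid = sorted.length / 2;
        return sorted.length % 2 == 0 ? (sorted[mid - 1] + sorted[mid]) / 2.0 : sorted[mid];
    }

    public static double max(double[] x) { return Arrays.stream(x).max().orElse(0.0); }

    public static double min(double[] x) { return Arrays.stream(x).min().orElse(0.0); }
}
```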

After the data fusion of the features extracted from the sensors’ data, the normalization and classification methods are applied in three stages:

  • Stage 1:
    • Sensors: accelerometer, gyroscope and/or magnetometer;
    • Purpose: Recognition of walking, standing, running, going upstairs and going downstairs activities using DNN method with normalized data.
  • Stage 2:
    • Sensors: microphone;
    • Purpose: Recognition of bar, bedroom, classroom, gym, hall, kitchen, street, library and watching TV/living room environments using FNN method with non-normalized data and/or DNN method with normalized data.
  • Stage 3:
    • Sensors: accelerometer, gyroscope, magnetometer and/or GPS receiver;
    • Purpose: Recognition of watching TV, sleeping and driving activities using DNN method with normalized data.
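A minimal sketch of how the three stages above could be dispatched according to the sensors available on the device follows; the sensor flags and method names are illustrative, not the library's actual API.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch (not the library's actual API) of selecting which of
// the three classification stages can run, given the available sensors.
public class StagePlanner {
    public static List<Integer> runnableStages(boolean hasMotionSensor, boolean hasMicrophone) {
        List<Integer> stages = new ArrayList<>();
        // Stage 1: common ADL from accelerometer/gyroscope/magnetometer data.
        if (hasMotionSensor) stages.add(1);
        // Stage 2: environments from the microphone data.
        if (hasMicrophone) stages.add(2);
        // Stage 3: ADL without body motion; depends on the outcomes of
        // stages 1 and 2, optionally refined with GPS features.
        if (hasMotionSensor && hasMicrophone) stages.add(3);
        return stages;
    }
}
```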

Regarding stage 2, the previous work [65Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review] recommends the use of the FNN method with non-normalized data. However, as will be shown in the following sections, we verified that the accuracies could be improved by combining different methodologies:

  1. Use of the FNN method with non-normalized data and, when the results are not clearly identified, of the DNN method with normalized data, which may perform a better recognition of the environments in these cases (Fig. 1A);
  2. Use of only the DNN method with normalized data (Fig. 1B).
Fig. (1)
Schema of the classification stage of the framework for the recognition of ADL and their environments.
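Strategy 1 above can be sketched as a confidence-based fallback: the FNN output is accepted only when its best score is clearly identified; otherwise the DNN is consulted. The classifier interfaces and the 0.5 confidence threshold are assumptions for illustration, not the library's actual API.

```java
import java.util.function.Function;

// Illustrative sketch of the FNN-then-DNN fallback (strategy 1 above).
// Both classifiers are modeled as functions from a feature vector to one
// score per environment; the confidence threshold is an assumed value.
public class EnvironmentClassifier {
    private static final double CONFIDENCE_THRESHOLD = 0.5; // assumed

    public static int classify(double[] features,
                               Function<double[], double[]> fnn,
                               Function<double[], double[]> dnn) {
        double[] scores = fnn.apply(features);
        int best = argMax(scores);
        if (scores[best] < CONFIDENCE_THRESHOLD) {
            // FNN result not clearly identified: fall back to the DNN.
            scores = dnn.apply(features);
            best = argMax(scores);
        }
        return best;
    }

    static int argMax(double[] a) {
        int best = 0;
        for (int i = 1; i < a.length; i++) {
            if (a[i] > a[best]) best = i;
        }
        return best;
    }
}
```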


The FNN method with non-normalized data was implemented, trained and tested with the Encog framework [74Encog Machine Learning Framework 2017. Available from: http://www.heatonresearch.com/encog/] in a Java desktop application, creating a neural network for later use in the mobile application. The DNN methods with normalized data were implemented, trained and tested with the Deeplearning4j framework [75Deeplearning4j: Open-source, Distributed Deep Learning for the JVM 2017. Available from: https://deeplearning4j.org/] in a Java desktop application, likewise creating a neural network for later use in the mobile application. Training the neural networks requires significant processing effort, so it is performed beforehand on a desktop computer, and only the trained network is used in the mobile application. Several parameters are used for the creation and training of the neural networks, including the activation function and the learning rate and momentum values, which are presented in Table 8. These configurations may influence the speed of the training stage as well as the results obtained with the trained network file.

Table 8
Configurations of the classification methods.


2.3. Test of the Method

We created a set of methods to test the correct identification of the ADL and environments using the Android library we developed. For this purpose, we acquired with a mobile device around 2.7 hours of sensors' signals for each ADL and environment, corresponding to 2000 records with a length of 5 seconds, labelled in the mobile application to allow the measurement of the accuracy of the developed Android library. We implemented unit tests that evaluate the previously acquired records, reporting the ADL or environment recognized for each record, in order to measure the overall recognition accuracy (percentage) of the Android library as well as the recognition accuracy (percentage) of each ADL and environment.
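The unit-test evaluation described above amounts to comparing each record's recognized label with its ground-truth label; a minimal sketch of the accuracy computation (with hypothetical integer label encodings) follows.

```java
// Sketch of the accuracy measurement used in the unit tests described above:
// the percentage of labelled records whose recognized ADL or environment
// matches the ground-truth label. The label encodings are hypothetical.
public class AccuracyEvaluator {
    public static double accuracyPercent(int[] predicted, int[] actual) {
        if (predicted.length != actual.length || predicted.length == 0) {
            throw new IllegalArgumentException("label arrays must be non-empty and equal-length");
        }
        int correct = 0;
        for (int i = 0; i < predicted.length; i++) {
            if (predicted[i] == actual[i]) correct++;
        }
        return 100.0 * correct / predicted.length;
    }
}
```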

3. RESULTS

3.1. Prototype

The main result of this study is an Android library implementing the framework for the recognition of ADL and their environments. In addition, a mobile application with the graphical user interface presented in Fig. (2), which embeds the developed Android library, was implemented for its validation.

Fig. (2)
Design of the interface of the Android application developed. Figures a) and b) present the main screen. Figure c) presents the calendar with the daily planning. Figure d) presents the form for the creation of a planning record. Figure e) presents a calendar that highlights the days that have ADL and environments recognized. Figure f) presents a list of ADL and environments performed each day.


The main functionality of the developed mobile application consists in the presentation of the current or last recognized ADL, environment or location (Fig. 2A and B). Based on the calendar, the mobile application has a module to manage the planned ADL and environments (Fig. 2C and D). Finally, the mobile application includes a module showing the list of ADL and environments recognized on each calendar day (Fig. 2F), highlighting the calendar days with recognized ADL and environments (Fig. 2E).

In the following sections, the validation and results of the Android library embedded in the mobile application are presented, demonstrating the reliability of the framework for the recognition of ADL and their environments.

3.2. Requirements

This study relies on the sensors available in consumer mobile devices. Due to the different manufacturers in the market, these devices have a different number of sensors, and the developed Android library should be flexible and adaptable to the number of sensors effectively available, allowing its use with all devices available on the market.

To use our Android library, the mobile device should have a minimum set of sensors (i.e., accelerometer and microphone); however, depending on the sensors included in each mobile device, the final accuracy of the recognition of ADL and their environments will differ. Thus, if the mobile device includes a GPS receiver, the mobile application will increase the number of ADL correctly recognized. If the mobile device has a network connection, the user's location will be shown in the mobile application.

In conclusion, depending on the number of sensors available in the mobile device, the Android library is able to recognize between 5 and 7 ADL, and 9 environments, as shown in the next section.
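The dependency between available sensors and recognizable ADL can be sketched as follows. The mapping is our illustrative reading of the requirements above (accelerometer and microphone as the minimum, GPS adding further recognizable ADL), not the library's actual API, and the exact increments are assumptions.

```java
// Illustrative sketch (an assumed mapping, not the library's actual API) of
// the requirement above: accelerometer and microphone are the minimum, and a
// GPS receiver increases the number of ADL correctly recognized.
public class SensorRequirements {
    public static boolean meetsMinimum(boolean hasAccelerometer, boolean hasMicrophone) {
        return hasAccelerometer && hasMicrophone;
    }

    // The paper reports between 5 and 7 recognizable ADL depending on the
    // available sensors; the specific increments below are assumptions.
    public static int recognizableAdls(boolean hasAccelerometer, boolean hasMicrophone,
                                       boolean hasGps) {
        if (!meetsMinimum(hasAccelerometer, hasMicrophone)) return 0;
        int adls = 5; // walking, running, going up/downstairs, standing (assumed)
        if (hasGps) adls += 2; // e.g., driving and sleeping via GPS features (assumed)
        return adls;
    }
}
```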

3.3. Validation

Once the framework for the recognition of ADL and environments in the Android library was implemented, we tested both the different stages of the framework independently and the overall execution of the framework. For these tests, we took into account the different combinations of sensors possibly available in the mobile devices. These tests were performed with the dataset previously acquired and available at [70August 2017- Multi-sensor data fusion in mobile devices for the identification of activities of daily living. Available from: https://allab.di.ubi.pt/ mediawiki/index.php/August_2017-_ Multi-sensor_data_fusion_in_mobile_devices_for_the_identification_of_activities_of_daily_living].

Based on the implementation of stage 1 for the recognition of common ADL (i.e., standing, walking, running, going upstairs and going downstairs) using the DNN method with normalized data, we achieved an average accuracy of 86.39% with the accelerometer sensor (Table 9), 86.49% with the accelerometer and magnetometer sensors (Table 10), and 89.15% with the accelerometer, magnetometer and gyroscope sensors (Table 11).

Table 9
Analysis of the results of stage 1 with the accelerometer sensor.


Table 10
Analysis of the results of stage 1 with the accelerometer and magnetometer sensors.


During the implementation of stage 2 for the recognition of environments (i.e., bedroom, bar, classroom, gym, kitchen, watching TV/living room, hall, street and library), we verified that the FNN method with Backpropagation used in the previous study [65Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review] reported confusing results. In fact, using the FNN method with Backpropagation, the framework reports an average accuracy of 22.68% in the recognition of the environments, recognizing only 4 out of 9 environments (Table 12). As these results were not good, we created a new method in the framework that first tries the recognition of the environments using the FNN method with Backpropagation and, when the results are not clearly identified, automatically applies the DNN method with normalized data. This way, we obtained an average accuracy of 30.27%, recognizing 8 out of 9 environments (Table 13), at the cost of slowing down the mobile device. Finally, we implemented only the DNN method with normalized data for the recognition of the environments, increasing the performance of the Android library and reporting an average accuracy of 45.68%, recognizing all the environments (Table 14). In the following analyses related to the recognition of the environments, the results obtained with the DNN-only method will be taken into account.

Table 11
Analysis of the results of stage 1 with the accelerometer, magnetometer and gyroscope sensors.


Table 12
Analysis of the results of stage 2 using FNN method with Backpropagation.


Table 13
Analysis of the results of stage 2 using FNN method with Backpropagation and DNN method.


Based on the implementation of stage 3 for the recognition of ADL without body motion (i.e., watching TV, sleeping and driving), the results obtained are always 100%; however, they depend on the outcomes of the previous stages of the framework, namely the correct recognition of the environment (stage 2), the correct recognition of the ADL as standing (stage 1), and the correct extraction of the GPS features.
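The GPS feature mentioned above, the distance travelled, can be computed from consecutive latitude/longitude fixes. A common way is the haversine formula, sketched below; the paper does not state which distance formula is used, so this choice is an assumption.

```java
// Sketch of computing the distance travelled from consecutive GPS fixes
// using the haversine (great-circle) formula. The paper does not specify
// the formula used for this feature, so this implementation is an assumption.
public class DistanceTravelled {
    private static final double EARTH_RADIUS_M = 6371000.0; // mean Earth radius

    // Great-circle distance in metres between two latitude/longitude points.
    public static double haversine(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    // Sums the distances between consecutive fixes [lat, lon] of a data slot.
    public static double total(double[][] fixes) {
        double sum = 0.0;
        for (int i = 1; i < fixes.length; i++) {
            sum += haversine(fixes[i - 1][0], fixes[i - 1][1], fixes[i][0], fixes[i][1]);
        }
        return sum;
    }
}
```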

Finally, we tested the overall recognition of the ADL (i.e., walking, running, going upstairs, going downstairs, watching TV, sleeping and driving) with the sequential execution of the three stages using only the DNN method with normalized data, reporting an average accuracy of 58.02% with the use of the accelerometer and the environment recognized (Table 15); 58.34% with the use of the accelerometer, the magnetometer and the environment recognized (Table 16); 60.04% with the use of the accelerometer, the magnetometer, the gyroscope and the environment recognized (Table 17); 67.05% with the use of the accelerometer, the GPS receiver and the environment recognized (Table 18); 67.26% with the use of the accelerometer, the magnetometer, the GPS receiver and the environment recognized (Table 19), and, finally, 69.43% with the use of the accelerometer, the magnetometer, the gyroscope, the GPS receiver and the environment recognized (Table 20).

Table 14
Analysis of the results of stage 2 using the DNN method.


Table 15
Analysis of the overall recognition of the ADL with accelerometer and environment recognized.


Table 16
Analysis of the overall recognition of the ADL with accelerometer, magnetometer and environment recognized.


Table 17
Analysis of the overall recognition of the ADL with accelerometer, magnetometer, gyroscope and environment recognized.


Table 18
Analysis of the overall recognition of the ADL with an accelerometer, GPS receiver and environment recognized.


Table 19
Analysis of the overall recognition of the ADL with an accelerometer, magnetometer, GPS receiver and environment recognized.


Table 20
Analysis of the overall recognition of the ADL with an accelerometer, magnetometer, gyroscope, GPS receiver and environment recognized.


4. DISCUSSION

4.1. Main Findings

Based on the partial results obtained in the previous work [65Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review-68Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Spinsante S. Data Fusion on Motion and Magnetic Sensors embedded on Mobile Devices for the Identification of Activities of Daily Living engrxiv.org/x4r5z2017. In Review], we present a summary of the accuracies obtained in each stage of the framework for the recognition of ADL and their environments (Tables 21 to 23), for further comparison with the results obtained with the Android library developed for this study (Tables 24 to 26), which combines the different stages all together.

Table 21
Analysis of the accuracies reported in the studies [65Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review-68Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Spinsante S. Data Fusion on Motion and Magnetic Sensors embedded on Mobile Devices for the Identification of Activities of Daily Living engrxiv.org/x4r5z2017. In Review] using the accelerometer sensor.


Table 22
Analysis of the accuracies reported in the studies [65Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review-68Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Spinsante S. Data Fusion on Motion and Magnetic Sensors embedded on Mobile Devices for the Identification of Activities of Daily Living engrxiv.org/x4r5z2017. In Review] using the accelerometer and the magnetometer sensors.


Table 23
Analysis of the accuracies reported in the studies [65Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review-68Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Spinsante S. Data Fusion on Motion and Magnetic Sensors embedded on Mobile Devices for the Identification of Activities of Daily Living engrxiv.org/x4r5z2017. In Review] using the accelerometer, the magnetometer and the gyroscope sensors.


Table 24
Analysis of the accuracies reported by the Android library using the accelerometer sensor.


Table 25
Analysis of the accuracies reported by the Android library using the accelerometer and the magnetometer sensors.


Table 26
Analysis of the accuracies reported by the Android library using the accelerometer, the magnetometer and the gyroscope sensors.


Following the results obtained in stage 1, related to the recognition of the common ADL with the DNN method with normalized data, we verified that the developed Android library recognizes the ADL with higher accuracy (+0.50%) with the accelerometer sensor, with the same accuracy with the combined accelerometer and magnetometer sensors, and with lower accuracy (-0.36%) with the accelerometer, magnetometer and gyroscope sensors, compared with the accuracies obtained in the previous work [65Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review-68Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Spinsante S. Data Fusion on Motion and Magnetic Sensors embedded on Mobile Devices for the Identification of Activities of Daily Living engrxiv.org/x4r5z2017. In Review]. Table 27 shows the values of the accuracy (A), the specificity (S), the recall (R) and the precision (P) of the recognition of each ADL, as well as the averages of these parameters, for the developed Android library. The ADL with the highest recognition accuracy is standing, with a precision of 97.11% and a recall of 99.25%, which improves the overall results of the framework because it influences the results of stage 3.
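The per-class metrics reported in the tables follow the usual definitions from the confusion counts of each class; a short sketch of those definitions (in percentage form, as used throughout the tables):

```java
// Sketch of the per-class metrics reported in the tables: accuracy,
// specificity, recall and precision, computed as percentages from the
// true/false positive/negative counts of one ADL or environment class.
public class ClassMetrics {
    public static double accuracy(int tp, int tn, int fp, int fn) {
        return 100.0 * (tp + tn) / (tp + tn + fp + fn);
    }

    public static double specificity(int tn, int fp) {
        return 100.0 * tn / (tn + fp);
    }

    public static double recall(int tp, int fn) {
        return 100.0 * tp / (tp + fn);
    }

    public static double precision(int tp, int fp) {
        return 100.0 * tp / (tp + fp);
    }
}
```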

Table 27
Recognition of each ADL in stage 1: mean values (%).


Following the results obtained in stage 2, related to the recognition of the environments, we tried to use the same method implemented in the previous works [65Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review-68Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Spinsante S. Data Fusion on Motion and Magnetic Sensors embedded on Mobile Devices for the Identification of Activities of Daily Living engrxiv.org/x4r5z2017. In Review], i.e., the FNN method with Backpropagation and non-normalized data; however, instead of the 86.50% recognition accuracy previously reported, the Android library achieved an accuracy of 22.68%, and only 4 environments were recognized. Given the limited capabilities of the mobile devices, we then combined the FNN method with Backpropagation and non-normalized data with the DNN method with normalized data, which increased the accuracy to 30.27%, recognizing 8 of the 9 proposed environments. As the DNN method with normalized data increased the recognition accuracy of the environments, we implemented only the DNN method with normalized data, reporting an accuracy of 45.68% and recognizing all 9 environments with better performance, because the execution of the DNN method requires fewer resources than the execution of the FNN method with Backpropagation. Table 28 shows the values of the accuracy (A), the specificity (S), the recall (R) and the precision (P) of the recognition of each environment, as well as the averages of these parameters; the highest precision was achieved in the street and bedroom environments, and the highest recall in the recognition of the bedroom environment.

Table 28
Recognition of each environment in stage 2: mean values (%).


Following the results obtained in stage 3, related to the recognition of the ADL without body motion, with the developed Android library the ADL without body motion are recognized with 100% accuracy, which is the same accuracy reported by the previous works [65Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review-68Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Spinsante S. Data Fusion on Motion and Magnetic Sensors embedded on Mobile Devices for the Identification of Activities of Daily Living engrxiv.org/x4r5z2017. In Review]. Tables 29 and 30 show the values of the accuracy (A), the specificity (S), the recall (R) and the precision (P) of the recognition of each ADL without body motion, as well as the averages of these parameters; the values of accuracy, recall, specificity and precision are always 100%.


Table 29
Statistical analysis of the recognition of each ADL in stage 3.


Table 30
Statistical analysis of the recognition of each ADL in stage 3 (cont.).


Following the overall results obtained by the framework in the previous works [65Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review-68Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Spinsante S. Data Fusion on Motion and Magnetic Sensors embedded on Mobile Devices for the Identification of Activities of Daily Living engrxiv.org/x4r5z2017. In Review], for the measurement of the overall accuracy we took into account that when the ADL was recognized as an activity without motion, it was always correctly classified. Thus, using the same dataset, we tested the flow of the framework for the recognition of ADL in the Android library, reporting accuracies between 58.02% and 86.49% with the accelerometer sensor, between 58.34% and 86.49% with the accelerometer and magnetometer sensors, and between 60.04% and 89.15% with the accelerometer, magnetometer and gyroscope sensors, depending on the number of ADL recognized, which varies between 5 and 7. Tables 31 and 32 show the values of the accuracy (A), the specificity (S), the recall (R) and the precision (P) of the recognition of each ADL, as well as the averages of these parameters; in general, the results reported high precision and specificity, and the lowest accuracies were obtained for the going upstairs and going downstairs activities.

Table 31
Statistical analysis of the recognition of each ADL in the framework.


Table 32
Statistical analysis of the recognition of each ADL in the framework (cont.).


One of the possible causes of the variability of the results between the previous works and the Android library developed may be the hardware and software restrictions of the mobile devices, which are analyzed in the next section.

4.2. Limitations

Due to the software and hardware limitations of the mobile devices, their use for the recognition of ADL and environments has several challenges, which are evidenced in several studies [76-81]. These are:

  • The use of several types of connections (e.g., Wi-Fi and Bluetooth) decreases the usable battery time;
  • During the processing of the sensors’ data, the performance of the mobile device may decrease;
  • Due to the execution of several tasks at the same time and the reduced resources of these devices, the frequency of the data acquisition is not constant over time;
  • The placement of the mobile device is one of the most important problems, because it strongly affects the acquisition of valid data; no studies provide an absolute optimal position suitable for every type of recognition, because the best position depends on the purpose of the study;
  • If the processing of the sensors’ data is performed at a remote server, a permanent data connection should be available, which further decreases the usable battery time;
  • The number of sensors available in consumer mobile devices differs between hardware versions and manufacturers, but the developed methods should be a function of the number of sensors available on these devices; moreover, different manufacturers may report different error margins and even different output value ranges;
  • Regarding the use of multiple devices, the security, privacy and resilience of the data exchanged over the different types of network connections should be taken into account for the correct use of the mobile systems.

For the minimization of the effects of the aforementioned limitations, our study proposes several strategies:

  • Research studies may use only the sensors available in consumer mobile devices, reducing the constraints related to the connection to external sensors and to the battery lifetime;
  • The use of lightweight methods may reduce the problems caused by the limited processing power of these devices;
  • Another possible solution is to enable and disable the acquisition of the sensors’ data over time, because continuous real-time sensor data acquisition is not necessary in order to recognize the ADL and their environments;
  • The creation of methods adapted to the different sets of sensors available in consumer mobile devices makes the methods available to a larger group of people;
  • The use of local data processing reduces the time needed to give feedback to the user, and it also reduces the problems with the security and privacy of the personal data, despite increasing the computational burden of the device.
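The duty-cycling strategy above (enabling and disabling the acquisition over time) can be sketched with a simple scheduler. This is a minimal illustration, not the library's implementation: `SensorSource` is a hypothetical callback, and the 5-second window every 5 minutes follows the values discussed later in the paper.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Sketch of duty-cycled sensor acquisition: acquire a short window of data
// periodically instead of continuously, to preserve battery. SensorSource is
// a hypothetical abstraction over the platform sensor APIs.
final class DutyCycledAcquisition {

    public interface SensorSource {
        void startAcquisition();
        void stopAcquisition();
    }

    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    // Every periodSec seconds, acquire data for windowSec seconds.
    public void start(SensorSource source, long windowSec, long periodSec) {
        scheduler.scheduleAtFixedRate(() -> {
            source.startAcquisition();
            // Close the acquisition window after windowSec seconds.
            scheduler.schedule(source::stopAcquisition, windowSec, TimeUnit.SECONDS);
        }, 0, periodSec, TimeUnit.SECONDS);
    }

    public void stop() {
        scheduler.shutdownNow();
    }

    // Fraction of time the sensors are active, e.g. 5 s every 300 s is about 1.7%.
    public static double dutyCycle(long windowSec, long periodSec) {
        return (double) windowSec / periodSec;
    }
}
```

With the 5 s / 5 min schedule, the sensors are active less than 2% of the time, which is consistent with the battery observations reported below.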

4.3. Comparison with Prior Work

Based on the previous research studies available in the literature, Table 33 presents the comparison between the minimum, maximum and average of the accuracies achieved in the literature and the results reported by the Android library developed. The comparable results are highlighted in Table 33, verifying that the highest improvement was achieved for the running activity when we compared our results with the average accuracy achieved in prior work. For that activity, the minimum and maximum accuracies found in the literature are 68.70% and 100%, respectively, corresponding to a positive (27.15%) and a negative (-4.15%) improvement.

Fig. (3)
Comparison between the minimum, maximum and average of the accuracies obtained in the literature review with the common ADL and environments recognized by the Android library developed.


Table 33
Comparison between the minimum, maximum and average of the accuracies obtained in the literature review and the accuracy of our results in the recognition of the selected ADL and environments.


Following the comparable results presented in Fig. (3), we verified that our results are always higher than the average of the results obtained in the previous research studies in the literature.

However, the significance of the improvements (see OR-Avg in Table 33) was evaluated with the one-sample Student’s t-test. As presented in Table 34, the normality assumption was met, as assessed by the Shapiro-Wilk test (TSW(8) = 0.889; p-value = 0.228). SPSS Statistics Software (v. 24; IBM SPSS, Chicago, IL) [82] was used to perform the statistical tests. The improvements for which the p-value of the test is lower than 0.05 are considered statistically significant, at a 95% level of confidence.
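The one-sample Student's t-test used here reduces to t = (x̄ - μ₀) / (s / √n) with n - 1 degrees of freedom. A minimal sketch of this computation (the sample values in the test are made up for illustration; the paper itself used SPSS on the eight per-activity improvements):

```java
// Illustrative one-sample Student's t-test statistic: tests H0: mean == mu0
// for a small sample, with df = n - 1. The p-value lookup (done by SPSS in
// the paper) is omitted here.
final class OneSampleTTest {

    public static double tStatistic(double[] sample, double mu0) {
        int n = sample.length;
        double mean = 0;
        for (double x : sample) mean += x;
        mean /= n;
        double ss = 0;
        for (double x : sample) ss += (x - mean) * (x - mean);
        double sd = Math.sqrt(ss / (n - 1));       // sample standard deviation
        return (mean - mu0) / (sd / Math.sqrt(n)); // standard error in denominator
    }
}
```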

Table 34
Normality tests.


The results of the Student’s t-test for the sample used are presented in Table 35, reporting an average improvement equal to 2.93%. According to the Student’s t-test, this value is statistically significant (t(7) = 3.062; p-value = 0.018).

Table 35
Values of the Student's t-test for a sample mean for comparing our results and average accuracy.


In addition, the same procedure was applied to analyze the improvements obtained when comparing our results with the minimum and maximum results found in the literature (see OR-Min and OR-Max in Table 33, respectively).

In Table 36, the normality assumption was met (Shapiro-Wilk test for OR-Min and OR-Max, respectively: TSW(8) = 0.914; p-value = 0.381, and TSW(8) = 0.968; p-value = 0.880).

Table 36
Normality tests.


Based on Table 37, the results of the Student’s t-test show, on the one hand, that our results and the minimum accuracies found in the literature are statistically different (t(7) = 3.563; p-value = 0.009), and that our results are higher (the average of the difference between OR and Min is equal to 21.0775%). On the other hand, when comparing our results with the maximum accuracies achieved in the literature, we cannot reject that the negative average of the difference between them (the average of the difference between OR and Max is equal to -6.2825%) is equal to zero (t(7) = -2.076; p-value = 0.077).

Table 37
Values of the Student's t-test for a sample mean for comparing our results with minimum, and our results with maximum accuracy.


Finally, based on the accuracies presented in Table 33 and the number of ADL/Environments recognized in the literature, presented in Table 38, we can conclude that our method compares favorably with the others available in the literature, because we recognize a larger number of ADL/Environments than the previous works while obtaining statistically significant improvements.

Table 38
Number of ADL/Environments recognized by the studies, where the minimum, maximum and average of the accuracies obtained in the literature review were verified.


In addition, we compared our number of ADL/Environments recognized with the mean number of ADL/Environments recognized in the literature, evaluating its significance with the one-sample Student’s t-test. As presented in Table 39, the normality assumption was met, as assessed by the Shapiro-Wilk test (TSW(8) = 0.889; p-value = 0.228). The differences for which the p-value of the test is lower than 0.05 are considered statistically significant, at a 95% level of confidence.

Table 39
Normality tests.


The results of the Student’s t-test for the sample used are presented in Table 40, reporting a number of ADL/Environments recognized equal to 16. According to the Student’s t-test, this value is statistically significant (t(8) = 0.339; p-value = 0.000).

Table 40
Values of the Student's t-test for a sample mean for comparing our number of ADL/Environments with the average number of ADL/Environments recognized in the literature.


The development of the library has several challenges, including the limited battery lifetime and the reduced processing power of the mobile devices [47]. However, the acquisition of 5 seconds of raw data every 5 minutes on the oldest devices tested showed a minimum of 16 hours of battery life under normal usage; as current mobile devices require a daily recharge, it is possible to consider that the acquisition of the sensors’ data with this method can be implemented. The performance of the mobile device is only strictly affected during a small interval every 5 minutes, and the effects on performance are not significant. This sampling strategy will probably be sufficient for the characterization of lifestyles, but it may miss some important events, including falls. The development of a method that implements a more significant sampling strategy, without decreasing the performance and the availability of resources of the mobile device, while identifying a wider number of ADL, requires additional research.

The data processing is executed after the data acquisition to prepare the data for the classification performed in the last stage. The data processing calculates the features from the raw data; after that, if the minimum required data is available (i.e., the accelerometer data is correctly acquired), the calculated features are sent to the classification module, where the verification of the ADL executed is performed. If a similar pattern was not already trained in the ANN created for the development of the method in [65-68], the data is discarded. In any case, the data is only stored until the classification, after which it is automatically discarded, avoiding problems related to the storage capacity of these devices.
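The data-processing step above (filtering the raw motion data and extracting features for classification) can be illustrated as follows. This is a simplified sketch under stated assumptions: the filter constant and the mean/standard-deviation features are illustrative, while the actual library uses the filters and feature sets defined in the referenced framework papers.

```java
// Sketch of the data-processing stage: a simple exponential low-pass filter
// on raw motion-sensor samples, followed by basic feature extraction over an
// acquisition window. Illustrative only; not the library's exact feature set.
final class MotionDataProcessing {

    // Exponential low-pass filter: out[i] = out[i-1] + alpha * (in[i] - out[i-1]).
    public static double[] lowPass(double[] raw, double alpha) {
        double[] out = new double[raw.length];
        if (raw.length == 0) return out;
        out[0] = raw[0];
        for (int i = 1; i < raw.length; i++) {
            out[i] = out[i - 1] + alpha * (raw[i] - out[i - 1]);
        }
        return out;
    }

    // Example features over one window: mean and (population) standard deviation.
    public static double[] meanAndStd(double[] window) {
        double mean = 0;
        for (double v : window) mean += v;
        mean /= window.length;
        double ss = 0;
        for (double v : window) ss += (v - mean) * (v - mean);
        return new double[] { mean, Math.sqrt(ss / window.length) };
    }
}
```

The resulting feature vector would then be normalized and forwarded to the classification module.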

CONCLUSION

The automatic recognition of ADL and their environments may be performed with the sensors available in consumer mobile devices, including the accelerometer, the gyroscope, the magnetometer, the microphone and the GPS receiver. This paper describes the creation of an Android library based on a framework for the recognition of ADL and their environments, created by properly merging the methods and tools available in [18-20, 65-68]. The library developed should be a function of the number of sensors available in the mobile devices, and able to provide rapid feedback to the user, thanks to the local processing of the sensors’ data.

The Android library implemented includes the stages of the framework that have already been proposed, including the data processing, data fusion and classification methods. The data acquisition stage has to be performed according to the methods supported by the Android Operating System on each device. Regarding the data processing, proper data cleaning methods should be applied: a low-pass filter for the motion and magnetic sensors’ data, the FFT for the acoustic sensors’ data, and the extraction of the correct features for the further fusion and classification of the sensors’ data. After the data processing step, the features should be fused and normalized, and the DNN method should be applied. The structure of the framework for the classification is separated into three stages: the recognition of common ADL, the recognition of environments, and the recognition of ADL without body motion. This structure is possible because each stage uses different sensors’ data. Firstly, the recognition of common ADL may use the accelerometer, magnetometer and gyroscope sensors’ data with the DNN method, reporting an average accuracy between 86.39% and 89.15% for the recognition of going upstairs and downstairs, walking, running and standing. Secondly, the recognition of the environments may use the microphone data with the DNN method, reporting an accuracy of 45.68% for the recognition of different sound environments, including bedroom, cooking/kitchen, gym, classroom, hall, watching TV/living room, bar, library and street. Finally, the recognition of ADL without body motion may be performed with the environment recognized, together with the accelerometer, magnetometer, gyroscope and GPS receiver sensors’ data, reporting an accuracy of 100% for the recognition of watching TV/living room, sleeping, driving and other undifferentiated standing activities.
In summary, the overall accuracy of the Android library is between 58.02% and 89.15%, depending on the number of sensors used and the number of ADL and environments recognized.

There are several limitations related to the use of mobile devices that can be minimized with several techniques, including the use of lightweight methods and local processing. There are no other datasets available that can be used to simulate the results obtained with the framework we developed, because the results available in the literature are provided separately for ADL and environments. Therefore, we compared the accuracies obtained, verifying that our results exhibit an overall improvement equal to 2.93% when compared with the average accuracy found in the literature. We also compared our results with the minimum and maximum accuracies available in the literature, which permits concluding that our results do not differ significantly from the maximum, but are statistically higher than the minimum found in the literature.

To date, there are different types of mobile devices available in the market. Therefore, this study was focused on the implementation of an Android library for the recognition of ADL and their environments with the use of an Android smartphone. The several types of mobile devices have different architectures and sensors that may return different types of sensors’ data. Our method is only applicable when the mobile device is in the front pocket of the user’s pants; other positions may return incorrect results. For the use of different devices (e.g., a smartwatch), the method should be adapted, because a smartwatch is normally worn on the arm of the user and the acquired patterns of the sensors’ data differ according to the position and the hardware of these devices. The creation of a library that is a function of the number of sensors, the positioning of the device and the different architectures existing in the market is research that should be done in the future.

In conclusion, the Android library developed demonstrates the reliability of mobile devices for the recognition of ADL and their environments with high precision. The library may be used in future studies combining the recognition of the ADL and their environments with other purposes for the development of a personal digital life coach.

ETHICS APPROVAL AND CONSENT TO PARTICIPATE

Not applicable.

HUMAN AND ANIMAL RIGHTS

No animals/humans were used for studies that are the basis of this research.

CONSENT FOR PUBLICATION

Not applicable.

CONFLICTS OF INTEREST

The authors declare no conflict of interest, financial or otherwise.

ACKNOWLEDGEMENTS

This work was supported by FCT project UID/EEA/50008/2013 (Este trabalho foi suportado pelo projecto FCT UID/EEA/50008/2013).

The authors would also like to acknowledge the contribution of the COST Action IC1303 – AAPELE – Architectures, Algorithms and Protocols for Enhanced Living Environments.

REFERENCES

[1] Salazar LHA, Lacerda T, Nunes JV, Gresse von Wangenheim C. A systematic literature review on usability heuristics for mobile phones. Int J Mobile Hum Comput Interact 2013; 5: 50-61.
[http://dx.doi.org/10.4018/jmhci.2013040103]
[2] Foti D, Koketsu JS. Activities of daily living. Pedretti’s occupational therapy: Practical skills for physical dysfunction 2013; 7: 157-232.
[3] Trajkovik V, Vlahu-Gjorgievska E, Koceski S, Kulev I. General assisted living system architecture model International Conference on Mobile Networks and Management 2014; 329-43.
[4] Siegel C, Hochgatterer A, Dorner TE. Contributions of ambient assisted living for health and quality of life in the elderly and care services--a qualitative analysis from the experts’ perspective of care service professionals. BMC Geriatr 2014; 14: 112.
[http://dx.doi.org/10.1186/1471-2318-14-112] [PMID: 25326149]
[5] Botia JA, Villa A, Palma J. Ambient Assisted Living system for in-home monitoring of healthy independent elders. Expert Syst Appl 2012; 39: 8136-48.
[http://dx.doi.org/10.1016/j.eswa.2012.01.153]
[6] Garcia NM, Rodrigues JJPC, Elias DC, Dias MS. Ambient Assisted Living 2014.
[7] Dimitrievski A, Zdravevski E, Lameski P, Trajkovik V. A survey of Ambient Assisted Living systems: Challenges and opportunities Intelligent Computer Communication and Processing (ICCP), 2016 IEEE 12th International Conference on 2016; 49-53.
[http://dx.doi.org/10.1109/ICCP.2016.7737121]
[8] Sousa P, Sabugueiro D, Felizardo V, Couto R, Pires I, Garcia N. mHealth Sensors and Applications for Personal Aid Mobile Health 2015; 265-81.
[http://dx.doi.org/10.1007/978-3-319-12817-7_12]
[9] Zdravevski E, Lameski P, Trajkovik V, Kulakov A, Chorbev I, Goleva R, et al. Improving Activity Recognition Accuracy in Ambient-Assisted Living Systems by Automated Feature Engineering. IEEE Access 2017; 5: 5262-80.
[http://dx.doi.org/10.1109/ACCESS.2017.2684913]
[10] Cippitelli E, Gasparrini S, Gambi E, Spinsante S, Wåhslény J, Orhany I, et al. Time synchronization and data fusion for RGB-depth cameras and inertial sensors in AAL applications Communication Workshop (ICCW) 2015; 265-70.
[http://dx.doi.org/10.1109/ICCW.2015.7247189]
[11] Dobre C, Mavromoustakis Cx, Garcia N, Goleva RI, Mastorakis G. Ambient Assisted Living and Enhanced Living Environments: Principles, Technologies and Control 2016.
[12] Goleva RI, Garcia NM, Mavromoustakis CX, Dobre C, Mastorakis G, Stainov R, et al. AAL and ELE Platform Architecture 2017.
[http://dx.doi.org/10.1016/B978-0-12-805195-5.00008-9]
[13] Augusto JC, Huch M, Kameas A, et al. Eds. Handbook of Ambient Assisted Living: Technology for Healthcare, Rehabilitation and Well-being, Amsterdam, Netherlands: IOS Press 2012.
[14] Garcia NM. A Roadmap to the Design of a Personal Digital Life Coach 2016.
[http://dx.doi.org/10.1007/978-3-319-25733-4_3]
[15] Kara M, Lamouchi O, Ramdane-Cherif A. A Quality Model for the Evaluation AAL Systems 2017.
[http://dx.doi.org/10.1016/j.procs.2017.08.354]
[16] Spinsante S, Gambi E, Raffaeli L, Montanini L, Paciello L, Bevilacqua R, et al. Technology-based assistance of people with dementia: state of the art, open challenges, and future developments. Human Monitoring, Smart Health and Assisted Living: Techniques and Technologies 2017; 9: 55.
[17] Smartphone OS market share forecast 2014-2021 | Statistic (accessed January 2nd, 2018). Available: https://www.statista.com/statistics/272307/market-share-forecast-for-smartphone-operating-systems/
[18] Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. From Data Acquisition to Data Fusion: A Comprehensive Review and a Roadmap for the Identification of Activities of Daily Living Using Mobile Devices. Sensors (Basel) 2016; 16(2): 184.
[http://dx.doi.org/10.3390/s16020184] [PMID: 26848664]
[19] Pires IM, Garcia NM, Flórez-Revuelta F. Multi-sensor data fusion techniques for the identification of activities of daily living using mobile devices Proceedings of the ECMLPKDD 2015 Doctoral Consortium, European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases Porto, Portugal. 2015.
[20] Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. Identification of Activities of Daily Living Using Sensors Available in off-the-shelf Mobile Devices: Research and Hypothesis 2016.
[http://dx.doi.org/10.1007/978-3-319-40114-0_14]
[21] Ivascu T, Cincar K, Dinis A, Negru V. Activities of daily living and falls recognition and classification from the wearable sensors data 2017.
[http://dx.doi.org/10.1109/EHB.2017.7995502]
[22] Kilinc O, Dalzell A, Uluturk I, Uysal I. Inertia Based Recognition of Daily Activities with ANNs and Spectrotemporal Features 2015.
[http://dx.doi.org/10.1109/ICMLA.2015.220]
[23] Prabowo OM, Mutijarsa K, Supangkat SH. Missing data handling using machine learning for human activity recognition on mobile device 2016 International Conference on ICT For Smart Society (ICISS) 2016; 59-62.
[http://dx.doi.org/10.1109/ICTSS.2016.7792849]
[24] Bujari A, Licar B, Palazzi CE. Movement pattern recognition through smartphone’s accelerometer Consumer Communications and Networking Conference (CCNC) 2012; 502-6.
[http://dx.doi.org/10.1109/CCNC.2012.6181029]
[25] Vallabh P, Malekian R, Ye N, Bogatinoska DC. Fall detection using machine learning algorithms 2016.
[http://dx.doi.org/10.1109/SOFTCOM.2016.7772142]
[26] Pombo N, Garcia N, Bousson K. Classification techniques on computerized systems to predict and/or to detect Apnea: A systematic review. Comput Methods Programs Biomed 2017; 140: 265-74.
[http://dx.doi.org/10.1016/j.cmpb.2017.01.001] [PMID: 28254083]
[27] Drucker H, Cortes C, Jackel LD, LeCun Y, Vapnik V. Boosting and Other Ensemble Methods. Neural Comput 1994; 6: 1289-301.
[http://dx.doi.org/10.1162/neco.1994.6.6.1289]
[28] Lorenzi P, Rao R, Romano G, Kita A, Irrera F. Mobile Devices for the Real-Time Detection of Specific Human Motion Disorders. IEEE Sens J 2016; 16: 8220-7.
[http://dx.doi.org/10.1109/Jsen.2016.2530944]
[29] Bayat A, Pomplun M, Tran DA. A Study on Human Activity Recognition Using Accelerometer Data from Smartphones 9th International Conference on Future Networks and Communications (Fnc’14) / the 11th International Conference on Mobile Systems and Pervasive Computing (Mobispc’14) / Affiliated Workshops 2014; 34: 450-7.
[http://dx.doi.org/10.1016/j.procs.2014.07.009]
[30] Khalifa S, Hassan M, Seneviratne A. Feature selection for floor-changing activity recognition in multi-floor pedestrian navigation 2014.
[http://dx.doi.org/10.1109/ICMU.2014.6799049]
[31] Piyare R, Lee SR. Mobile Sensing Platform for Personal Health Management 18th Ieee International Symposium on Consumer Electronics (Isce 2014) 2014; 1-2.
[http://dx.doi.org/10.1109/ISCE.2014.6884300]
[32] Kwon Y, Kang K, Bae C. Analysis and Evaluation of Smartphone-based Human Activity Recognition Using a Neural Network Approach. 2015 International Joint Conference on Neural Networks (IJCNN) 2015; 1-5.
[http://dx.doi.org/10.1109/IJCNN.2015.7280494]
[33] Wang C, Xu Y, Zhang J, Yu W. SW-HMM: A Method for Evaluating Confidence of Smartphone-Based Activity Recognition 2016.
[http://dx.doi.org/10.1109/TrustCom.2016.0320]
[34] Hsu Y-W, Chen K-H, Yang J-J, Jaw F-S. Smartphone-based fall detection algorithm using feature extraction 2016.
[http://dx.doi.org/10.1109/CISP-BMEI.2016.7852959]
[35] Vavoulas G, Chatzaki C, Malliotakis T, Pediaditis M, Tsiknakis M. The MobiAct Dataset: Recognition of Activities of Daily Living using Smartphones Proceedings of the International Conference on Information and Communication Technologies for Ageing Well and E-Health (Ict4awe) 2016; 143-51.
[http://dx.doi.org/10.5220/0005792401430151]
[36] Bujari A, Licar B, Palazzi CE. Movement Pattern Recognition through Smartphone’s Accelerometer, 2012 IEEE Consumer Communications and Networking Conference 2012; 502-6.
[37] Torres-Huitzil C, Nuno-Maganda M. Robust smartphone-based human activity recognition using a tri-axial accelerometer 2015.
[http://dx.doi.org/10.1109/LASCAS.2015.7250435]
[38] Bajpai A, Jilla V, Tiwari VN, Venkatesan SM, Narayanan R. Quantifiable fitness tracking using wearable devices. Conf Proc IEEE Eng Med Biol Soc 2015; 2015: 1633-7.
[http://dx.doi.org/10.1109/EMBC.2015.7318688] [PMID: 26736588]
[39] Zainudin MNS, Sulaiman MN, Mustapha N, Perumal T. Activity Recognition based on Accelerometer Sensor using Combinational Classifiers 2015 IEEE Conference on Open Systems (Icos) 2015; 68-73.
[http://dx.doi.org/10.1109/ICOS.2015.7377280]
[40] Sen S, Rachuri K K, Mukherji A, Misra A. Did you take a break today? Detecting playing foosball using your smartwatch 2016.
[http://dx.doi.org/10.1109/PERCOMW.2016.7457165]
[41] Kwapisz JR, Weiss GM, Moore SA. Activity recognition using cell phone accelerometers. SIGKDD Explor 2011; 12: 74.
[http://dx.doi.org/10.1145/1964897.1964918]
[42] Mitchell E, Monaghan D, O’Connor NE. Classification of sporting activities using smartphone accelerometers. Sensors (Basel) 2013; 13(4): 5317-37.
[http://dx.doi.org/10.3390/s130405317] [PMID: 23604031]
[43] Li P, Wang Y, Tian Y, Zhou TS, Li JS. An Automatic User-Adapted Physical Activity Classification Method Using Smartphones. IEEE Trans Biomed Eng 2017; 64(3): 706-14.
[http://dx.doi.org/10.1109/TBME.2016.2573045] [PMID: 27249822]
[44] Zhang L, Wu X, Luo D. Real-Time Activity Recognition on Smartphones Using Deep Neural Networks 2015.
[http://dx.doi.org/10.1109/UIC-ATC-ScalCom-CBDCom-IoP.2015.224]
[45] Zhao KL, Du JZ, Li CQ, Zhang CL, Liu H, Xu C. Healthy: A Diary System Based on Activity Recognition Using Smartphone 2013.
[http://dx.doi.org/10.1109/MASS.2013.14]
[46] Dernbach S, Das B, Krishnan NC, Thomas BL, Cook DJ. Simple and Complex Activity Recognition through Smart Phones 2012.
[http://dx.doi.org/10.1109/IE.2012.39]
[47] Shen C, Chen YF, Yang GS. On Motion-Sensor Behavior Analysis for Human-Activity Recognition via Smartphones 2016 Ieee International Conference on Identity, Security and Behavior Analysis (Isba) Sendai, Japan. 2016; 1-6.
[http://dx.doi.org/10.1109/ISBA.2016.7477231]
[48] Vallabh P, Malekian R, Ye N, Bogatinoska DC. Fall Detection Using Machine Learning Algorithms 2016.
[http://dx.doi.org/10.1109/SOFTCOM.2016.7772142]
[49] Wu W, Dasgupta S, Ramirez EE, Peterson C, Norman GJ. Classification accuracies of physical activities using smartphone motion sensors. J Med Internet Res 2012; 14(5): e130.
[http://dx.doi.org/10.2196/jmir.2208] [PMID: 23041431]
[50] Fang L, Yishui S, Wei C. Up and down buses activity recognition using smartphone accelerometer Electronic and Automation Control Conference 2016; 761-5.
[http://dx.doi.org/10.1109/ITNEC.2016.7560464]
[51] Zdravevski E, Lameski P, Mingov R, Kulakov A, Gjorgjevikj D. Robust histogram-based feature engineering of time series data 2015.
[http://dx.doi.org/10.15439/2015F420]
[52] Zdravevski E, Risteska Stojkoska B, Standl M, Schulz H. Automatic machine-learning based identification of jogging periods from accelerometer measurements of adolescents under field conditions. PLoS One 2017; 12(9): e0184216.
[http://dx.doi.org/10.1371/journal.pone.0184216] [PMID: 28880923]
[53] Tang CX, Phoha VV. An Empirical Evaluation of Activities and Classifiers for User Identification on Smartphones 2016.
[http://dx.doi.org/10.1109/BTAS.2016.7791159]
[54] Bloch A, Erdin R, Meyer S, Keller T, de Spindler A. Battery-Efficient Transportation Mode Detection on Mobile Devices 2015.
[http://dx.doi.org/10.1109/MDM.2015.16]
[55] Shoaib M, Scholten H, Havinga PJM. Towards Physical Activity Recognition Using Smartphone Sensors 2013.
[http://dx.doi.org/10.1109/UIC-ATC.2013.43]
[56] Mielke M, Brück R. Smartphone application for automatic classification of environmental sound Proceedings of the 20th International Conference Mixed Design of Integrated Circuits and Systems - MIXDES 2013 2013; 512-5.
[57] Guo X, Toyoda Y, Li H, Huang J, Ding S, Liu Y. Environmental sound recognition using time-frequency intersection patterns 2011.
[58] Pillos A, Alghamidi K, Alzamel N, Pavlov V, Machanavajhala S. A real-time environmental sound recognition system for the Android OS 2016.
[59] Bountourakis V, Vrysis L, Papanikolaou G. Machine Learning Algorithms for Environmental Sound Recognition: Towards Soundscape Semantics Proceedings of the Audio Mostly 2015 on Interaction With Sound Thessaloniki, Greece. 2015.
[http://dx.doi.org/10.1145/2814895.2814905]
[60] Mielke M, Brueck R. Design and evaluation of a smartphone application for non-speech sound awareness for people with hearing loss. Conf Proc IEEE Eng Med Biol Soc 2015; 2015: 5008-11.
[http://dx.doi.org/10.1109/EMBC.2015.7319516]
[61] Sigtia S, Stark AM, Krstulovic S, Plumbley MD. Automatic Environmental Sound Recognition: Performance Versus Computational Cost. IEEE/ACM Trans Audio Speech Lang Process 2016; 24: 2096-107.
[http://dx.doi.org/10.1109/TASLP.2016.2592698]
[62] Dubey H, Mehl MR, Mankodiya K. BigEAR: Inferring the ambient and emotional correlates from smartphone-based acoustic big data 2016 IEEE 1st International Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE) 2016; 78-83.
[http://dx.doi.org/10.1109/CHASE.2016.46]
[63] Lane ND, Georgiev P, Qendro L. DeepEar: robust smartphone audio sensing in unconstrained acoustic environments using deep learning Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing Osaka, Japan. 2015.
[http://dx.doi.org/10.1145/2750858.2804262]
[64] Cheffena M. Fall Detection Using Smartphone Audio Features. IEEE J Biomed Health Inform 2016; 20(4): 1073-80.
[http://dx.doi.org/10.1109/JBHI.2015.2425932] [PMID: 25915965]
[65] Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. User Environment Detection with Acoustic Sensors Embedded on Mobile Devices for the Recognition of Activities of Daily Living 2017. In Review
[66] Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F. A Multiple Source Framework for the Identification of Activities of Daily Living Based on Mobile Device Data 2017. In Review
[67] Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Spinsante S. Pattern Recognition Techniques for the Identification of Activities of Daily Living using Mobile Device Accelerometer 2017. In Review
[68] Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Spinsante S. Data Fusion on Motion and Magnetic Sensors embedded on Mobile Devices for the Identification of Activities of Daily Living. engrxiv.org/x4r5z, 2017. In Review
[69] Pires IM, Garcia NM, Pombo N, Flórez-Revuelta F, Rodríguez ND. Validation Techniques for Sensor Data in Mobile Health Applications. J Sens 2016; 2016: 2839372.
[http://dx.doi.org/10.1155/2016/2839372]
[70] Multi-sensor data fusion in mobile devices for the identification of activities of daily living. August 2017. Available from: https://allab.di.ubi.pt/mediawiki/index.php/August_2017-_Multi-sensor_data_fusion_in_mobile_devices_for_the_identification_of_activities_of_daily_living
[71] Bojinov H, Michalevsky Y, Nakibly G, Boneh D. Mobile device identification via sensor fingerprinting. arXiv preprint arXiv:1408.1416, 2014.
[72] Katevas K, Haddadi H, Tokarchuk L. SensingKit: Evaluating the sensor power consumption in iOS devices. arXiv preprint arXiv:1607.05576, 2016.
[http://dx.doi.org/10.1109/IE.2016.50]
[73] BQ Smartphones. 2017. Available from: https://www.bq.com/pt/smartphones
[74] Encog Machine Learning Framework. 2017. Available from: http://www.heatonresearch.com/encog/
[75] Deeplearning4j: Open-source, Distributed Deep Learning for the JVM. 2017. Available from: https://deeplearning4j.org/
[76] Santos J, Rodrigues JJPC, Silva BMC, Casal J, Saleem K, Denisov V. An IoT-based mobile gateway for intelligent personal assistants on mobile health environments. J Netw Comput Appl 2016; 71: 194-204.
[http://dx.doi.org/10.1016/j.jnca.2016.03.014]
[77] Bert F, Giacometti M, Gualano MR, Siliquini R. Smartphones and health promotion: a review of the evidence. J Med Syst 2014; 38(1): 9995.
[http://dx.doi.org/10.1007/s10916-013-9995-7] [PMID: 24346929]
[78] Poslad S. Ubiquitous Computing: Smart Devices, Environments and Interactions. 2011.
[79] Choudhury T, Borriello G, Consolvo S, Haehnel D, Harrison B, Hemingway B, et al. The Mobile Sensing Platform: An Embedded Activity Recognition System. IEEE Pervasive Comput 2008; 7: 32-41.
[http://dx.doi.org/10.1109/MPRV.2008.39]
[80] Montoro-Manrique G, Haya-Coll P, Schnelle-Walka D. Internet of Things: From RFID Systems to Smart Applications. 2011.
[81] Arif MJ, El Emary IM, Koutsouris DD. A review on the technologies and services used in the self-management of health and independent living of elderly. Technol Health Care 2014; 22(5): 677-87.
[http://dx.doi.org/10.3233/THC-140851] [PMID: 25134962]
[82] Field A. Discovering Statistics Using IBM SPSS Statistics. 2013.
