Android Library for Recognition of Activities of Daily Living: Implementation Considerations, Challenges, and Solutions
Ivan Miguel Pires1, 2, 3, *, Maria Canavarro Teixeira4, 5, Nuno Pombo1, Nuno M. Garcia1, Francisco Flórez-Revuelta6, Susanna Spinsante7, Rossitza Goleva8, Eftim Zdravevski9
1 Instituto de Telecomunicações, Universidade da Beira Interior, Covilhã, Portugal
2 Altran Portugal, Lisbon, Portugal
3 ALLab - Assisted Living Computing and Telecommunications Laboratory, Computing Science Department, Universidade da Beira Interior, Covilhã, Portugal
4 UTC de Recursos Naturais e Desenvolvimento Sustentável, Polytechnic Institute of Castelo Branco, Castelo Branco, Portugal
5 CERNAS - Research Centre for Natural Resources, Environment and Society, Polytechnic Institute of Castelo Branco, Castelo Branco, Portugal
6 Department of Computer Technology, Universidad de Alicante, Alicante, Spain
7 Department of Information Engineering, Marche Polytechnic University, Ancona, Italy
8 Department of Informatics, New Bulgarian University, Sofia, Bulgaria
9 Faculty of Computer Science and Engineering, Ss. Cyril and Methodius University, Skopje, Macedonia
Off-the-shelf mobile devices have several sensors available onboard that may be used for the recognition of Activities of Daily Living (ADL) and the environments in which they are performed. This research focuses on the development of Ambient Assisted Living (AAL) systems that use mobile devices to acquire different types of data related to the physical and physiological conditions of the subjects and their environments. Mobile devices running the Android operating system are the least expensive and have the largest market share, while providing a wide variety of models and onboard sensors.
This paper describes the implementation considerations, challenges, and solutions related to a framework for the recognition of ADL and their environments, provided as an Android library. The framework adapts to the number of sensors available on different mobile devices and utilizes a variety of activity recognition algorithms to provide rapid feedback to the user.
The Android library includes data fusion, data processing, feature engineering, and classification methods. The sensors that may be used are the accelerometer, the gyroscope, the magnetometer, the Global Positioning System (GPS) receiver, and the microphone. The data processing includes the application of data cleaning methods and the extraction of features, which are used with Deep Neural Networks (DNN) for the classification of ADL and environments. Throughout this work, the limitations of the mobile devices were explored and their effects minimized.
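As an illustration of the feature extraction step described above, the following is a minimal sketch (not the library's actual API; the class and method names are hypothetical) of computing simple statistical features, such as the mean and standard deviation, over a window of accelerometer magnitude samples, a typical input representation for a DNN classifier.

```java
// Hypothetical sketch of the feature extraction step: statistical features
// computed over a fixed window of accelerometer magnitude samples.
public class FeatureSketch {

    // Arithmetic mean of a window of sensor samples.
    static double mean(double[] window) {
        double sum = 0.0;
        for (double v : window) sum += v;
        return sum / window.length;
    }

    // Population standard deviation of the window.
    static double std(double[] window) {
        double m = mean(window);
        double sq = 0.0;
        for (double v : window) sq += (v - m) * (v - m);
        return Math.sqrt(sq / window.length);
    }

    public static void main(String[] args) {
        // Example window of accelerometer magnitude samples (m/s^2).
        double[] window = {9.6, 9.9, 10.4, 9.7, 10.1};
        System.out.printf("mean=%.3f std=%.3f%n", mean(window), std(window));
    }
}
```

In practice, such features would be computed per sliding window on each sensor stream, concatenated, and fed to the classifier.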
The implementation of the Android library achieved an overall accuracy between 58.02% and 89.15%, depending on the number of sensors used and the number of ADL and environments recognized. Compared with the results available in the literature, the library shows a mean improvement of 2.93%, and its results do not differ significantly from the best reported in prior work, according to Student's t-test.
This study shows that ADL such as walking, going upstairs and downstairs, running, watching TV, driving, sleeping, and standing, as well as the bedroom, cooking/kitchen, gym, classroom, hall, living room, bar, library, and street environments, may be recognized with the sensors available in off-the-shelf mobile devices. Finally, these results may serve as preliminary research towards the development of a personal digital life coach based on a multi-sensor mobile device carried daily.
Keywords: Activities of daily living, Sensors, Mobile devices, Pattern recognition, Data fusion, Android library, Artificial neural networks, Recognition.
open-access license: This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International Public License (CC-BY 4.0), a copy of which is available at: (https://creativecommons.org/licenses/by/4.0/legalcode). This license permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
* Address correspondence to this author at the Computer Science Department, Universidade da Beira Interior, Rua Marquês d’Ávila e Bolama, 6201-001 Covilhã, Portugal; Tel: +351 966379785; E-mail: email@example.com