
Please use this identifier to cite or link to this item: http://hdl.handle.net/10651/45034

Title: Multi sensor system for pedestrian tracking and activity recognition in indoor environments
Author(s): Marrón, Juan José
Labrador, Miguel A.
Menéndez Valle, Adrián
Fernández Lanvin, Daniel
González Rodríguez, Bernardo Martín
Keywords: Ubiquitous localization
Smartphones
Sensor fusion
Pervasive computing
Issue date: Jun-2016
Publisher: Inderscience
Publisher version: http://dx.doi.org/10.1504/IJAHUC.2016.10000202
Citation: International Journal of Ad Hoc and Ubiquitous Computing, 23(1/2), p. 3-23 (2016); doi:10.1504/IJAHUC.2016.10000202
Format extent: P. 3-23
Abstract: The widespread use of mobile devices and the rise of Global Navigation Satellite Systems (GNSS) have allowed mobile tracking applications to become very popular and valuable in outdoor environments. However, tracking pedestrians in indoor environments with Global Positioning System (GPS)-based schemes is still very challenging. Along with indoor tracking, the ability to recognize pedestrian behavior and activities can lead to considerable growth in location-based applications, including pervasive healthcare, leisure and guide services (such as hospitals, museums, airports, etc.), and emergency services, among the most important ones. This paper presents a system for pedestrian tracking and activity recognition in indoor environments using exclusively common off-the-shelf sensors embedded in smartphones (accelerometer, gyroscope, magnetometer and barometer). The proposed system combines the knowledge found in biomechanical patterns of the human body while accomplishing basic activities, such as walking or climbing stairs up and down, along with identifiable signatures that certain indoor locations (such as turns or elevators) introduce on sensing data. The system was implemented and tested on Android-based mobile phones. The system detects and counts steps with an accuracy of 97% and 96.67% on flat floor and stairs, respectively; detects user changes of direction and altitude with 98.88% and 96.66% accuracy, respectively; and recognizes the proposed human activities with 95% accuracy. All modules combined lead to a total tracking accuracy of 91.06% in common human motion indoor displacements.
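
Note: the record does not include implementation details beyond the abstract. As a rough illustration of the kind of accelerometer-based step counting the abstract describes, the minimal sketch below detects steps by peak detection on the acceleration magnitude; the threshold, minimum step gap and sampling rate are illustrative assumptions, not values from the paper.

    # Illustrative sketch only: a naive peak-detection step counter on the
    # accelerometer magnitude signal. Thresholds and sampling rate are
    # assumptions for demonstration; the paper's actual algorithm (based on
    # biomechanical gait patterns) is described in the full text.
    import math

    def count_steps(samples, threshold=11.0, min_gap=0.3, rate_hz=50):
        """Count steps from raw accelerometer samples.

        samples: iterable of (ax, ay, az) readings in m/s^2.
        threshold: magnitude (m/s^2) a peak must exceed to count as a step.
        min_gap: minimum time (s) allowed between consecutive steps.
        rate_hz: assumed accelerometer sampling rate.
        """
        steps = 0
        last_step_t = -min_gap
        prev_mag = 0.0
        rising = False
        for i, (ax, ay, az) in enumerate(samples):
            mag = math.sqrt(ax * ax + ay * ay + az * az)
            t = i / rate_hz
            if mag > prev_mag:
                rising = True
            elif rising:
                # A local maximum was just passed; count it as a step if it
                # is strong enough and far enough from the previous step.
                if prev_mag > threshold and (t - last_step_t) >= min_gap:
                    steps += 1
                    last_step_t = t
                rising = False
            prev_mag = mag
        return steps
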
URI: http://hdl.handle.net/10651/45034
ISSN: 1743-8233
1743-8225
Sponsored: This work has been partially funded by the Department of Science and Technology (Spain) under the National Program for Research, Development and Innovation (projects TIN2011-25978 entitled Obtaining Adaptable, Robust and Efficient Software by including Structural Reflection to Statically Typed Programming Languages and TIN2009-12132 entitled SHUBAI: Augmented Accessibility for Handicapped Users in Ambient Intelligence and in Urban computing environments) and by the Principality of Asturias to support the Computational Reflection research group, project code GRUPIN14-100
Project id.: TIN2011-25978
TIN2009-12132
Principado de Asturias/GRUPIN14-100
Appears in Collections: Artículos
Informática
Investigaciones y Documentos OpenAIRE

Files in This Item:

File: paper Draft.pdf
Description: Preprint
Size: 1.38 MB
Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.