Show simple item record

Multi sensor system for pedestrian tracking and activity recognition in indoor environments

dc.contributor.author: Marrón, Juan José
dc.contributor.author: Labrador, Miguel A.
dc.contributor.author: Menéndez Valle, Adrián
dc.contributor.author: Fernández Lanvin, Daniel
dc.contributor.author: González Rodríguez, Bernardo Martín
dc.date.accessioned: 2018-01-05T08:26:33Z
dc.date.available: 2018-01-05T08:26:33Z
dc.date.issued: 2016-06
dc.identifier.citation: International Journal of Ad Hoc and Ubiquitous Computing, 23(1/2), p. 3-23 (2016); doi:10.1504/IJAHUC.2016.10000202
dc.identifier.issn: 1743-8233
dc.identifier.issn: 1743-8225
dc.identifier.uri: http://hdl.handle.net/10651/45034
dc.description.abstract: The widespread use of mobile devices and the rise of Global Navigation Satellite Systems (GNSS) have allowed mobile tracking applications to become very popular and valuable in outdoor environments. However, tracking pedestrians in indoor environments with Global Positioning System (GPS)-based schemes is still very challenging. Along with indoor tracking, the ability to recognize pedestrian behavior and activities can lead to considerable growth in location-based applications including pervasive healthcare, leisure and guide services (such as hospitals, museums, airports, etc.), and emergency services, among the most important ones. This paper presents a system for pedestrian tracking and activity recognition in indoor environments using exclusively common off-the-shelf sensors embedded in smartphones (accelerometer, gyroscope, magnetometer and barometer). The proposed system combines the knowledge found in biomechanical patterns of the human body while accomplishing basic activities, such as walking or climbing stairs up and down, along with identifiable signatures that certain indoor locations (such as turns or elevators) introduce on sensing data. The system was implemented and tested on Android-based mobile phones. The system detects and counts steps with an accuracy of 97% and 96.67% on flat floors and stairs, respectively; detects user changes of direction and altitude with 98.88% and 96.66% accuracy, respectively; and recognizes the proposed human activities with a 95% accuracy. All modules combined lead to a total tracking accuracy of 91.06% in common human motion indoor displacements.
dc.description.sponsorship: This work has been partially funded by the Department of Science and Technology (Spain) under the National Program for Research, Development and Innovation (project TIN2011-25978, Obtaining Adaptable, Robust and Efficient Software by including Structural Reflection to Statically Typed Programming Languages, and project TIN2009-12132, SHUBAI: Augmented Accessibility for Handicapped Users in Ambient Intelligence and in Urban computing environments) and by the Principality of Asturias in support of the Computational Reflection research group, project code GRUPIN14-100.
dc.format.extent: P. 3-23
dc.language.iso: eng
dc.publisher: Inderscience
dc.relation.ispartof: International Journal of Ad Hoc and Ubiquitous Computing, 23(1/2)
dc.rights: © Inderscience
dc.subject: Ubiquitous localization
dc.subject: Smartphones
dc.subject: Sensor fusion
dc.subject: Pervasive computing
dc.title: Multi sensor system for pedestrian tracking and activity recognition in indoor environments
dc.type: info:eu-repo/semantics/article
dc.identifier.doi: 10.1504/IJAHUC.2016.10000202
dc.type.dcmi: text
dc.relation.projectID: TIN2011-25978
dc.relation.projectID: TIN2009-12132
dc.relation.projectID: Principado de Asturias/GRUPIN14-100
dc.relation.publisherversion: http://dx.doi.org/10.1504/IJAHUC.2016.10000202
dc.rights.accessRights: info:eu-repo/semantics/openAccess
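The abstract describes step detection from biomechanical accelerometer patterns. As an illustrative sketch only (not the authors' implementation, whose details are in the paper), a minimal threshold-and-peak step counter over accelerometer magnitude samples could look like this; all names, thresholds, and sampling parameters below are assumptions chosen for the example.

```python
import math

def count_steps(magnitudes, threshold=11.0, min_gap=20):
    """Count local maxima above `threshold` (m/s^2) that are at least
    `min_gap` samples apart, as a crude proxy for steps."""
    steps = 0
    last_peak = -min_gap  # allow a peak at the very start
    for i in range(1, len(magnitudes) - 1):
        m = magnitudes[i]
        if (m > threshold
                and m >= magnitudes[i - 1] and m > magnitudes[i + 1]
                and i - last_peak >= min_gap):
            steps += 1
            last_peak = i
    return steps

# Synthetic signal: gravity baseline (9.8 m/s^2) plus a walking
# oscillation at ~2 Hz sampled at 50 Hz, i.e., one peak per step.
fs, step_freq, seconds = 50, 2.0, 5
signal = [9.8 + 2.5 * math.sin(2 * math.pi * step_freq * i / fs)
          for i in range(fs * seconds)]
print(count_steps(signal))  # 2 steps/s over 5 s -> 10
```

Real signals would additionally need low-pass filtering and an adaptive threshold; the `min_gap` refractory period here is a simple guard against counting one stride twice.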

