A signal processing perspective on human gait: Decoupling walking oscillations and gestures

Adrien Gregorj, Zeynep Yucel, Sunao Hara, Akito Monden, Masahiro Shiomi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This study focuses on gesture recognition in mobile interaction settings, i.e., when the interacting partners are walking. This kind of interaction requires a particular coordination, e.g., staying in the field of view of the partner, avoiding obstacles without disrupting the group composition, and sustaining joint attention during motion. In the literature, various studies have shown that gestures are closely involved in achieving such goals. Thus, a mobile robot moving in a group with human pedestrians has to identify such gestures to sustain group coordination. However, decoupling the inherent walking oscillations from gestures is a major challenge for the robot. To that end, we employ video data recorded in uncontrolled settings and detect arm gestures performed by human-human pedestrian pairs by adopting a signal processing approach. Namely, we exploit the fact that there is an inherent oscillatory motion at the upper limbs arising from the gait, independent of the viewing angle or the distance of the subject to the camera. We identify arm gestures as disturbances on these oscillations. To do so, we use a simple pitch detection method from speech processing and assume that data involving a low-frequency periodicity are free of gestures. In testing, we employ a video data set recorded in uncontrolled settings and show that we achieve a detection rate of 0.80.
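The abstract describes the method only at a high level. As a rough, illustrative sketch of the underlying idea, treating gait as a low-frequency periodic signal and gestures as disturbances detected through a pitch-style periodicity check, consider the Python fragment below. Everything in it is an assumption for illustration: the sampling rate, the gait-frequency band, the window length, the threshold, and the function names are not taken from the paper, and windowed autocorrelation is only one of several standard pitch detection techniques the authors could have used.

```python
import numpy as np

def dominant_period(window, fs, fmin=0.5, fmax=3.0):
    """Estimate the dominant oscillation period of a 1-D signal with
    normalized autocorrelation (a simple pitch detection technique),
    searching only lags inside an assumed gait-frequency band.
    Returns (period_in_seconds, peak_strength)."""
    x = np.asarray(window, dtype=float)
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..N-1
    if ac[0] <= 0:                     # flat window: no energy at all
        return None, 0.0
    ac = ac / ac[0]                    # normalize so ac[0] == 1
    lo = max(1, int(fs / fmax))        # shortest lag (highest frequency)
    hi = min(len(ac) - 1, int(fs / fmin))  # longest lag (lowest frequency)
    if lo >= hi:
        return None, 0.0
    lag = lo + int(np.argmax(ac[lo:hi]))
    return lag / fs, float(ac[lag])

def flag_gestures(trajectory, fs, win_s=2.0, hop_s=0.5, min_strength=0.4):
    """Slide a window over a 1-D limb trajectory (e.g. the vertical wrist
    coordinate from a pose tracker) and flag windows whose gait-band
    periodicity is weak, i.e. the walking oscillation is disturbed,
    as candidate gestures. Returns a list of (time_s, is_gesture)."""
    win, hop = int(win_s * fs), int(hop_s * fs)
    flags = []
    for start in range(0, len(trajectory) - win + 1, hop):
        _, strength = dominant_period(trajectory[start:start + win], fs)
        flags.append((start / fs, strength < min_strength))
    return flags

# Hypothetical usage on synthetic data: a 1.8 Hz arm-swing oscillation
# sampled at 30 fps, with a noisy burst standing in for a gesture.
if __name__ == "__main__":
    fs = 30.0
    t = np.arange(0, 10, 1 / fs)
    y = np.sin(2 * np.pi * 1.8 * t)
    y[150:210] += 2.0 * np.random.randn(60)
    for time_s, is_gesture in flag_gestures(y, fs):
        print(f"t={time_s:4.1f}s  gesture={is_gesture}")
```

In this sketch, a window that still shows a strong autocorrelation peak in the assumed gait band is treated as gesture-free, mirroring the paper's assumption that data involving a low-frequency periodicity contain no gestures; the reported detection rate of 0.80 refers to the paper's own method and data, not to this sketch.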

Original language: English
Title of host publication: Interactive Collaborative Robotics - 4th International Conference, ICR 2019, Proceedings
Editors: Gerhard Rigoll, Andrey Ronzhin, Roman Meshcheryakov
Publisher: Springer Verlag
Pages: 75-85
Number of pages: 11
ISBN (Print): 9783030261177
DOI: 10.1007/978-3-030-26118-4_8
Publication status: Published - Jan 1 2019
Event: 4th International Conference on Interactive Collaborative Robotics, ICR 2019 - Istanbul, Turkey
Duration: Aug 20 2019 - Aug 25 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11659 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 4th International Conference on Interactive Collaborative Robotics, ICR 2019
Country: Turkey
City: Istanbul
Period: 8/20/19 - 8/25/19

Keywords

  • Gesture
  • Group coordination
  • Pedestrian
  • Social robot

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

Cite this

Gregorj, A., Yucel, Z., Hara, S., Monden, A., & Shiomi, M. (2019). A signal processing perspective on human gait: Decoupling walking oscillations and gestures. In G. Rigoll, A. Ronzhin, & R. Meshcheryakov (Eds.), Interactive Collaborative Robotics - 4th International Conference, ICR 2019, Proceedings (pp. 75-85). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11659 LNAI). Springer Verlag. https://doi.org/10.1007/978-3-030-26118-4_8
