Walk the Talk: Gestures in Mobile Interaction

Zeynep Yucel, Francesco Zanlungo, Masahiro Shiomi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

This study aims to describe navigation guidelines, and the corresponding analytic motion models, for a mobile interaction robot that moves together with a human partner. We address in particular the impact of gestures on the coupled motion of this human-robot pair. We posit that the robot needs to adjust its navigation to its gestures in a natural manner, mimicking human-human locomotion. To justify this suggestion, we first examine the motion patterns of real-world pedestrian dyads with respect to four affective components of interaction (i.e. gestures). Three benchmark variables are derived from the pedestrian trajectories, and their behavior is investigated under three conditions: (i) presence/absence of isolated gestures, (ii) varying number of simultaneously performed (i.e. concurring) gestures, and (iii) varying size of the environment. It is observed empirically and shown quantitatively that the benchmark variables differ significantly between the presence and absence of gestures, whereas no prominent variation exists with regard to the type of gesture or the number of concurring gestures. Moreover, the size of the environment is shown to be a crucial factor in sustaining the group structure. Subsequently, we propose analytic models to represent these behavioral variations and show that they reflect the observed distinctions with significant accuracy. Finally, we propose a scheme for integrating the analytic models into practical applications. Our results can serve as navigation guidelines for the robot, providing a more natural interaction experience for the human counterpart of a robot-pedestrian group on the move.
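The abstract does not name the three benchmark variables or the statistical procedure, so the Python sketch below is illustrative only: it derives three plausible dyad-level variables from a pair of trajectories (interpersonal distance, group speed, and the angle between the pair axis and the walking direction) and compares one of them between gesture-present and gesture-absent dyads with a two-sample test. All function names, variable choices, and the test itself are assumptions, not the paper's actual method.

    import numpy as np
    from scipy import stats

    def benchmark_variables(traj_a, traj_b, dt=0.5):
        """Derive three illustrative benchmark variables for a pedestrian dyad.

        traj_a, traj_b: (T, 2) arrays of planar positions sampled every dt
        seconds. The variable choices below are assumptions for illustration;
        the paper defines its own three benchmark variables.
        """
        sep = traj_b - traj_a
        distance = np.linalg.norm(sep, axis=1)        # interpersonal distance
        center = 0.5 * (traj_a + traj_b)              # dyad midpoint
        vel = np.gradient(center, dt, axis=0)         # midpoint velocity
        speed = np.linalg.norm(vel, axis=1)           # group speed
        # Angle between the pair axis and the walking direction, wrapped to
        # [0, pi]: ~pi/2 means walking abreast, ~0 means single file.
        heading = np.arctan2(vel[:, 1], vel[:, 0])
        pair_axis = np.arctan2(sep[:, 1], sep[:, 0])
        rel_angle = np.abs(np.angle(np.exp(1j * (pair_axis - heading))))
        return distance, speed, rel_angle

    def compare_conditions(values_gesture, values_no_gesture, alpha=0.05):
        """Hypothetical test for condition (i): does a benchmark variable
        differ between gesture-present and gesture-absent dyads?"""
        stat, p = stats.mannwhitneyu(values_gesture, values_no_gesture)
        return stat, p, p < alpha

In such an analysis, the per-frame variables would be pooled over observation windows labeled for gesture presence; the reported finding (significant differences with gesture presence, but not with gesture type or count) would correspond to the test rejecting the null hypothesis for condition (i) while failing to reject it for condition (ii).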

Original language: English
Title of host publication: Social Robotics - 9th International Conference, ICSR 2017, Proceedings
Publisher: Springer Verlag
Pages: 220-230
Number of pages: 11
Volume: 10652 LNAI
ISBN (Print): 9783319700212
DOIs: https://doi.org/10.1007/978-3-319-70022-9_22
Publication status: Published - Jan 1 2017
Event: 9th International Conference on Social Robotics, ICSR 2017 - Tsukuba, Japan
Duration: Nov 22 2017 - Nov 24 2017

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10652 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 9th International Conference on Social Robotics, ICSR 2017
Country: Japan
City: Tsukuba
Period: 11/22/17 - 11/24/17

Keywords

  • Affective communication
  • Gestures

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

Cite this

Yucel, Z., Zanlungo, F., & Shiomi, M. (2017). Walk the Talk: Gestures in Mobile Interaction. In Social Robotics - 9th International Conference, ICSR 2017, Proceedings (Vol. 10652 LNAI, pp. 220-230). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 10652 LNAI). Springer Verlag. https://doi.org/10.1007/978-3-319-70022-9_22
