Attention allocation for multi-modal perception of human-friendly robot partners

Yuichiro Toda, Naoyuki Kubota

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

This paper proposes a method of attention allocation for multi-modal perception of human-friendly robot partners based on the various types of sensors built into a smartphone. First, we propose a human and object detection method using octagonal templates based on evolutionary robot vision. Next, we propose an integration method for estimating human behaviors that combines color-image-based human detection with a multi-layered spiking neural network applied to the time series of human and object positions. Furthermore, we propose a method of attention allocation based on the time series of human behavior recognition. Finally, we show several experimental results of the proposed method and discuss the future direction of this research.
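
A concrete reading of the behavior-estimation step, as a minimal sketch only (not the authors' implementation): one leaky integrate-and-fire style neuron per candidate behavior is driven by the time series of detected human and object positions, and an attention weight is derived from its recent firing history. The class, function, and parameter names below, the exponential proximity stimulus, and the specific update rule are all assumptions made for illustration.

import math

class BehaviorNeuron:
    """One spiking unit associated with a candidate human behavior (illustrative only)."""

    def __init__(self, decay=0.9, threshold=1.0, history=20):
        self.decay = decay          # leak applied to the membrane potential each step
        self.threshold = threshold  # potential at which the neuron fires
        self.history = history      # number of recent steps kept for the firing rate
        self.potential = 0.0
        self.spikes = []

    def step(self, stimulus):
        # Leaky integration of the external stimulus; reset after a spike.
        self.potential = self.decay * self.potential + stimulus
        fired = self.potential >= self.threshold
        if fired:
            self.potential = 0.0
        self.spikes = (self.spikes + [1 if fired else 0])[-self.history:]
        return fired

    def firing_rate(self):
        # Fraction of recent steps in which the neuron fired.
        return sum(self.spikes) / max(len(self.spikes), 1)

def proximity_stimulus(human_pos, object_pos, scale=100.0):
    # Stimulus grows as the detected human approaches the detected object (pixel coordinates).
    return math.exp(-math.dist(human_pos, object_pos) / scale)

# Example: a "reaching for the object" neuron observing a short position time series.
neuron = BehaviorNeuron()
object_pos = (200, 210)
human_track = [(300, 200), (280, 200), (250, 205), (225, 210), (205, 212)]

for human_pos in human_track:
    neuron.step(proximity_stimulus(human_pos, object_pos))

# A simple attention-allocation signal: behaviors that fire often get more attention.
print(f"attention weight: {neuron.firing_rate():.2f}")

In the same spirit, the firing rates of several such neurons (one per behavior) could be normalized to decide which sensor stream or image region the robot partner attends to next; the paper itself should be consulted for the actual formulation.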

Original language: English
Title of host publication: 12th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human - Machine Systems, HMS 2013 - Proceedings
Pages: 324-329
Number of pages: 6
Volume: 12
Edition: PART 1
ISBN (Print): 9783902823410
DOI: 10.3182/20130811-5-US-2037.00054
Publication status: Published - Oct 22 2013
Externally published: Yes
Event: 12th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human - Machine Systems, HMS 2013 - Las Vegas, NV, United States
Duration: Aug 11 2013 - Aug 14 2013



Keywords

  • Computational Intelligence
  • Human Robot Interaction
  • Multi-modal Perception
  • Robot Partners
  • Robot Vision

ASJC Scopus subject areas

  • Control and Systems Engineering

Cite this

Toda, Y., & Kubota, N. (2013). Attention allocation for multi-modal perception of human-friendly robot partners. In 12th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human - Machine Systems, HMS 2013 - Proceedings (PART 1 ed., Vol. 12, pp. 324-329) https://doi.org/10.3182/20130811-5-US-2037.00054

