Multimodal communication for human-friendly robot partners in informationally structured space

Naoyuki Kubota, Yuichiro Toda

Research output: Contribution to journal › Article › peer-review

20 Citations (Scopus)


This paper proposes a multimodal communication method for human-friendly robot partners based on various types of sensors. First, we explain the informationally structured space that extends the cognitive capabilities of robot partners through environmental systems. Next, we discuss the suitable measurement ranges for recognition technologies such as touch interfaces, voice recognition, human detection, and gesture recognition. Based on these measurement ranges, we propose an integration method that estimates human behaviors by combining human detection, using color images and 3-D distance information, with gesture recognition by a multilayered spiking neural network applied to the time series of human-hand positions. Furthermore, we propose a conversation system that realizes multimodal communication with a person. Finally, we present several experimental results of the proposed method and discuss the future direction of this research.
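The abstract mentions gesture recognition by a multilayered spiking neural network driven by a time series of hand positions. As an illustration only, and not the authors' actual network, the following minimal sketch shows the core idea of a spiking unit: a leaky integrate-and-fire neuron that accumulates input over time and emits a spike when its membrane potential crosses a threshold. The function name, parameters, and the mapping of hand motion to input currents are all assumptions for this sketch.

```python
def lif_spikes(inputs, threshold=1.0, decay=0.9):
    """Leaky integrate-and-fire neuron (illustrative sketch).

    inputs: sequence of input currents, e.g. hand-displacement
            magnitudes sampled over time (hypothetical encoding).
    Returns a 0/1 spike train of the same length.
    """
    potential = 0.0
    spikes = []
    for current in inputs:
        # Leak part of the previous potential, then add new input.
        potential = decay * potential + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset after firing
        else:
            spikes.append(0)
    return spikes


# Weak, sustained input only fires once enough charge accumulates.
print(lif_spikes([0.5, 0.5, 0.5]))  # → [0, 0, 1]
```

In a multilayered arrangement, the spike trains of one layer would feed the next, and the temporal spike pattern, rather than a single static feature vector, would characterize the gesture.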

Original language: English
Article number: 6392476
Pages (from-to): 1142-1151
Number of pages: 10
Journal: IEEE Transactions on Systems, Man and Cybernetics Part C: Applications and Reviews
Issue number: 6
Publication status: Published - 2012
Externally published: Yes


Keywords

  • Computational intelligence
  • human-robot interaction
  • information services
  • intelligent robots
  • ubiquitous computing

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Information Systems
  • Human-Computer Interaction
  • Computer Science Applications
  • Electrical and Electronic Engineering


