Abstract
This paper proposes a multimodal communication method for human-friendly robot partners based on various types of sensors. First, we explain the informationally structured space used to extend the cognitive capabilities of robot partners through environmental systems. Next, we discuss the suitable measurement ranges for recognition technologies such as touch interfaces, voice recognition, human detection, and gesture recognition. Based on these measurement ranges, we propose an integration method that estimates human behaviors by combining human detection using color images and 3-D distance information with gesture recognition by a multilayered spiking neural network applied to the time series of human-hand positions. Furthermore, we propose a conversation system that realizes multimodal communication with a person. Finally, we present several experimental results of the proposed method and discuss future directions of this research.
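The gesture-recognition step mentioned in the abstract uses a multilayered spiking neural network driven by the time series of human-hand positions. The paper itself provides no code; the sketch below is a minimal illustration only, assuming a single layer of leaky integrate-and-fire (LIF) neurons whose spike counts are matched against stored gesture templates. All names and parameters here (`lif_layer_spikes`, `tau`, `v_th`, the weight matrix, the synthetic gestures) are hypothetical choices for this example and are not taken from the paper.

```python
# Illustrative sketch only (not the authors' implementation): a small layer of
# leaky integrate-and-fire neurons driven by a time series of 3-D hand
# positions, whose spike counts serve as a feature vector for simple
# template-based gesture matching.
import numpy as np

def lif_layer_spikes(hand_positions, weights, tau=20.0, v_th=1.0, dt=1.0):
    """Run a layer of LIF neurons over a (T, 3) series of hand positions.

    Returns the per-neuron spike counts as a crude temporal feature vector.
    """
    # Use frame-to-frame displacement as the input signal (motion energy).
    velocity = np.vstack([np.zeros((1, 3)), np.diff(hand_positions, axis=0)])
    n_neurons = weights.shape[0]
    v = np.zeros(n_neurons)          # membrane potentials
    counts = np.zeros(n_neurons)     # spike counts
    for t in range(hand_positions.shape[0]):
        current = weights @ velocity[t]        # weighted input current
        v += dt * (-v / tau + current)         # leaky integration
        fired = v >= v_th
        counts += fired
        v[fired] = 0.0                         # reset after spiking
    return counts

def classify_gesture(hand_positions, templates, weights):
    """Nearest-template classification on LIF spike-count features."""
    feat = lif_layer_spikes(hand_positions, weights)
    labels = list(templates.keys())
    dists = [np.linalg.norm(feat - templates[k]) for k in labels]
    return labels[int(np.argmin(dists))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.5, size=(8, 3))     # hypothetical input weights
    # Two synthetic gestures: a horizontal wave and a vertical raise.
    t = np.linspace(0, 2 * np.pi, 60)
    wave = np.stack([np.sin(3 * t), np.zeros_like(t), np.zeros_like(t)], axis=1)
    lift = np.stack([np.zeros_like(t), np.zeros_like(t), t / (2 * np.pi)], axis=1)
    templates = {"wave": lif_layer_spikes(wave, W),
                 "lift": lif_layer_spikes(lift, W)}
    print(classify_gesture(wave + 0.01 * rng.normal(size=wave.shape), templates, W))
```

A nearest-template match on spike counts stands in here for the paper's multilayered network and its integration with human detection and the conversation system, which would require the full sensor pipeline.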
| Original language | English |
|---|---|
| Article number | 6392476 |
| Pages (from-to) | 1142-1151 |
| Number of pages | 10 |
| Journal | IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews |
| Volume | 42 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 2012 |
| Externally published | Yes |
Keywords
- Computational intelligence
- human-robot interaction
- information services
- intelligent robots
- ubiquitous computing
ASJC Scopus subject areas
- Control and Systems Engineering
- Software
- Information Systems
- Human-Computer Interaction
- Computer Science Applications
- Electrical and Electronic Engineering