Abstract
In this paper, voice commands given in natural spoken language are used to make a robot manipulator track two or more objects of the same color. More precisely, after receiving a voice command specifying the color, the end-effector of the robot is controlled by visual feedback to approach the desired object among the candidates; the visual information is also presented to the human operator when a more accurate robot motion is required. The objective is thus a smoother cooperative system between human and robot, achieved by coordinating voice and visual information.
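The abstract gives no implementation details, so the following is only a minimal illustrative sketch of the kind of pipeline it describes: a spoken color keyword selects which objects to segment in the camera image, and the pixel offset between the chosen object's centroid and the image center drives a simple proportional end-effector correction. The OpenCV-based segmentation, the HSV color ranges, the gain, and all function names are assumptions for illustration, not the authors' method.

```python
import cv2
import numpy as np

# Hypothetical HSV ranges for colors a voice command might name.
COLOR_RANGES = {
    "red":   ((0, 120, 70), (10, 255, 255)),
    "green": ((40, 70, 70), (80, 255, 255)),
    "blue":  ((100, 150, 0), (140, 255, 255)),
}

def find_color_centroids(frame_bgr, color_name):
    """Return image centroids of all regions matching the spoken color."""
    lower, upper = COLOR_RANGES[color_name]
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 1e-3:  # ignore tiny blobs
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

def visual_servo_step(centroid, image_size, gain=0.002):
    """Proportional correction: pixel error from image center -> small end-effector move."""
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    ex, ey = centroid[0] - cx, centroid[1] - cy
    return (-gain * ex, -gain * ey)  # commanded x/y velocity, arbitrary units
```

In such a scheme the human would re-issue or refine the voice command after seeing the camera view, which corresponds to the human-in-the-loop correction the abstract mentions.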
Original language | English
---|---
Pages | 1779-1783
Number of pages | 5
Publication status | Published - Dec 1 2004
Externally published | Yes
Event | IECON 2004 - 30th Annual Conference of IEEE Industrial Electronics Society, Busan, Korea, Republic of (Nov 2 2004 → Nov 6 2004)
Other
Other | IECON 2004 - 30th Annual Conference of IEEE Industrial Electronics Society
---|---
Country/Territory | Korea, Republic of
City | Busan
Period | 11/2/04 → 11/6/04
ASJC Scopus subject areas
- Control and Systems Engineering
- Electrical and Electronic Engineering