Understanding user commands by evaluating fuzzy linguistic information based on visual attention

Buddhika Jayasekara, Keigo Watanabe, Kiyotaka Izumi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

This paper proposes a method for understanding user commands based on visual attention. A visual attention system is implemented to evaluate fuzzy linguistic information based on environmental conditions. It is assumed that the distance value corresponding to a particular fuzzy linguistic command depends on the spatial arrangement of the surrounding objects. A fuzzy-logic-based voice command evaluation system (VCES) is proposed to assess the uncertain information in user commands. A situation of object manipulation for rearranging the user's working space is simulated to illustrate the system, and the approach is demonstrated with a PA-10 robot manipulator.
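
The abstract gives no implementation details, so the following minimal Python sketch is not taken from the paper: the linguistic terms, membership parameters, and free-space scaling rule are all assumptions chosen for illustration. It shows one way a fuzzy distance term in a voice command (e.g. "move it a little to the left") could be mapped to a crisp displacement that depends on the clear space around the manipulated object, which is the behaviour the abstract attributes to the VCES.

# Minimal sketch, not the paper's VCES: maps a fuzzy linguistic distance term
# to a crisp displacement whose range is scaled by the free space measured in
# the commanded direction. Terms, membership parameters, and the scaling rule
# are illustrative assumptions.
import numpy as np

# Triangular membership functions (a, b, c) on a normalised axis [0, 1],
# where 1.0 corresponds to the full free space in the commanded direction.
TERMS = {
    "very little": (0.0, 0.0, 0.25),
    "little":      (0.0, 0.25, 0.5),
    "moderate":    (0.25, 0.5, 0.75),
    "far":         (0.5, 0.75, 1.0),
    "very far":    (0.75, 1.0, 1.0),
}

def tri(x, a, b, c):
    """Triangular membership value of x for parameters (a, b, c)."""
    if x <= a or x >= c:
        return 1.0 if (a == b == x) or (b == c == x) else 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def evaluate_distance(term, free_space_m):
    """Return a displacement in metres for a fuzzy distance term, given the
    free space (metres) between the object and the nearest obstacle in the
    commanded direction."""
    a, b, c = TERMS[term]
    xs = np.linspace(0.0, 1.0, 201)
    mu = np.array([tri(x, a, b, c) for x in xs])
    # Centroid defuzzification on the normalised axis, then scale by the free
    # space, so the same command yields a smaller motion in a cluttered workspace.
    crisp = float((mu * xs).sum() / mu.sum())
    return crisp * free_space_m

# "Move the cup a little to the left" with 0.4 m of clear table space to the
# left gives a smaller displacement than the same command with 0.8 m free.
print(evaluate_distance("little", 0.4))   # ~0.10 m
print(evaluate_distance("little", 0.8))   # ~0.20 m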

Original language: English
Title of host publication: Proceedings of the 14th International Symposium on Artificial Life and Robotics, AROB 14th'09
Pages: 720-723
Number of pages: 4
Publication status: Published - 2009
Externally published: Yes
Event: 14th International Symposium on Artificial Life and Robotics, AROB 14th'09 - Oita, Japan
Duration: Feb 5 2009 - Feb 7 2009

Other

Other: 14th International Symposium on Artificial Life and Robotics, AROB 14th'09
Country: Japan
City: Oita
Period: 2/5/09 - 2/7/09

Fingerprint

Linguistics
Fuzzy logic
Manipulators
Robots

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Human-Computer Interaction

Cite this

Jayasekara, B., Watanabe, K., & Izumi, K. (2009). Understanding user commands by evaluating fuzzy linguistic information based on visual attention. In Proceedings of the 14th International Symposium on Artificial Life and Robotics, AROB 14th'09 (pp. 720-723).

@inproceedings{0d5ed59d812843dba66db8c3b209b4cd,
title = "Understanding user commands by evaluating fuzzy linguistic information based on visual attention",
abstract = "This paper proposes a method for understanding the user commands based on visual attention. Visual attention system is implemented to evaluate the fuzzy linguistic information based on the environmental conditions. It is assumed that the corresponding distance value for a particular fuzzy linguistic command depends on the spatial arrangement of the surrounding objects. A fuzzy logic based voice command evaluation system (VCES) is proposed to assess the uncertain information in user commands. A situation of object manipulation for rearranging the users working space is simulated to illustrate the system. It is demonstrated with PA-10 robot manipulator.",
author = "Buddhika Jayasekara and Keigo Watanabe and Kiyotaka Izumi",
year = "2009",
language = "English",
isbn = "9784990288037",
pages = "720--723",
booktitle = "Proceedings of the 14th International Symposium on Artificial Life and Robotics, AROB 14th'09",

}

TY - GEN

T1 - Understanding user commands by evaluating fuzzy linguistic information based on visual attention

AU - Jayasekara, Buddhika

AU - Watanabe, Keigo

AU - Izumi, Kiyotaka

PY - 2009

Y1 - 2009

N2 - This paper proposes a method for understanding user commands based on visual attention. A visual attention system is implemented to evaluate fuzzy linguistic information based on environmental conditions. It is assumed that the distance value corresponding to a particular fuzzy linguistic command depends on the spatial arrangement of the surrounding objects. A fuzzy-logic-based voice command evaluation system (VCES) is proposed to assess the uncertain information in user commands. A situation of object manipulation for rearranging the user's working space is simulated to illustrate the system, and the approach is demonstrated with a PA-10 robot manipulator.

AB - This paper proposes a method for understanding user commands based on visual attention. A visual attention system is implemented to evaluate fuzzy linguistic information based on environmental conditions. It is assumed that the distance value corresponding to a particular fuzzy linguistic command depends on the spatial arrangement of the surrounding objects. A fuzzy-logic-based voice command evaluation system (VCES) is proposed to assess the uncertain information in user commands. A situation of object manipulation for rearranging the user's working space is simulated to illustrate the system, and the approach is demonstrated with a PA-10 robot manipulator.

UR - http://www.scopus.com/inward/record.url?scp=78149316909&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=78149316909&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:78149316909

SN - 9784990288037

SP - 720

EP - 723

BT - Proceedings of the 14th International Symposium on Artificial Life and Robotics, AROB 14th'09

ER -