Modular Fuzzy-Neuro Controller Driven by Spoken Language Commands

Koliya Pulasinghe, Keigo Watanabe, Kiyotaka Izumi, Kazuo Kiguchi

Research output: Contribution to journal › Article

47 Citations (Scopus)

Abstract

We present a methodology for controlling machines using spoken language commands. Two major problems relating to speech interfaces for machines are investigated: the interpretation of words with fuzzy implications, and out-of-vocabulary (OOV) words in natural conversation. The system proposed in this paper is designed to overcome both problems. It consists of a hidden Markov model (HMM) based automatic speech recognizer (ASR) with a keyword-spotting system, which captures the machine-sensitive words from running utterances, and a fuzzy-neural network (FNN) based controller, which represents the words with fuzzy implications in spoken language commands. The significance of a word, i.e., its contextual meaning given the machine's current state, is introduced so that the system's output more closely matches the user's intent. Modularity is also considered, so that the methodology generalizes to systems with heterogeneous functions without diminishing performance. The proposed system is experimentally tested by navigating a mobile robot in real time using spoken language commands.
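The pipeline the abstract describes (keyword spotting to filter out-of-vocabulary words, then fuzzy interpretation of the spotted words blended with the machine's current state) can be sketched in miniature. Everything below — the keyword list, the membership values, and the blending/defuzzification scheme — is an illustrative assumption, not the paper's actual HMM recognizer or FNN controller:

```python
# Toy sketch of the architecture's data flow, not the authors' method:
# (1) keyword spotting drops OOV words, (2) fuzzy sets interpret a speed
# word, (3) the result is blended with the machine's current state.

KEYWORDS = {"go", "stop", "slow", "fast", "left", "right"}

# Hypothetical fuzzy sets: membership of each speed word in low/medium/high.
SPEED_MEMBERSHIP = {
    "slow": {"low": 0.9, "medium": 0.3, "high": 0.0},
    "go":   {"low": 0.2, "medium": 0.8, "high": 0.3},
    "fast": {"low": 0.0, "medium": 0.3, "high": 0.9},
    "stop": {"low": 0.0, "medium": 0.0, "high": 0.0},
}

# Representative crisp speeds (m/s) for each fuzzy set, chosen for illustration.
SET_CENTERS = {"low": 0.2, "medium": 0.5, "high": 1.0}

def spot_keywords(utterance):
    """Keyword-spotting stand-in: keep only machine-sensitive words,
    silently dropping out-of-vocabulary words (the OOV problem)."""
    return [w for w in utterance.lower().split() if w in KEYWORDS]

def interpret_speed(word, current_speed):
    """Centroid defuzzification of the word's fuzzy sets, blended with the
    machine's current state (a crude stand-in for word 'significance')."""
    sets = SPEED_MEMBERSHIP.get(word)
    if sets is None:
        return current_speed
    total = sum(sets.values())
    if total == 0.0:                    # e.g. "stop": no active speed set
        return 0.0
    target = sum(mu * SET_CENTERS[s] for s, mu in sets.items()) / total
    # Move partway from the current speed toward the fuzzy target,
    # so the same word means different things in different states.
    return 0.5 * current_speed + 0.5 * target

speed = 0.5
for word in spot_keywords("please move fast past the um doorway"):
    speed = interpret_speed(word, speed)
print(speed)
```

In this sketch, "fast" uttered at 0.5 m/s and at 0.2 m/s yields different commanded speeds, which is the flavor of the state-dependent word significance the abstract mentions; the real system learns this mapping with a fuzzy-neural network rather than fixed tables.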

Original language: English
Pages (from-to): 293-302
Number of pages: 10
Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Volume: 34
Issue number: 1
DOI: 10.1109/TSMCB.2003.811511
Publication status: Published - Feb 2004
Externally published: Yes

Keywords

  • Fuzzy-neural network
  • Hidden Markov models
  • Robot control
  • Speech interfaces
  • Spoken language commands

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Artificial Intelligence
  • Human-Computer Interaction

Cite this

Modular Fuzzy-Neuro Controller Driven by Spoken Language Commands. / Pulasinghe, Koliya; Watanabe, Keigo; Izumi, Kiyotaka; Kiguchi, Kazuo.

In: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, Vol. 34, No. 1, 02.2004, p. 293-302.

Research output: Contribution to journal › Article

@article{a9a43a33f53b45c5aedbbb495f90ccdc,
title = "Modular Fuzzy-Neuro Controller Driven by Spoken Language Commands",
abstract = "We present a methodology of controlling machines using spoken language commands. The two major problems relating to the speech interfaces for machines, namely, the interpretation of words with fuzzy implications and the out-of-vocabulary (OOV) words in natural conversation, are investigated. The system proposed in this paper is designed to overcome the above two problems in controlling machines using spoken language commands. The present system consists of a hidden Markov model (HMM) based automatic speech recognizer (ASR), with a keyword spotting system to capture the machine sensitive words from the running utterances and a fuzzy-neural network (FNN) based controller to represent the words with fuzzy implications in spoken language commands. Significance of the words, i.e., the contextual meaning of the words according to the machine's current state, is introduced to the system to obtain more realistic output equivalent to users' desire. Modularity of the system is also considered to provide a generalization of the methodology for systems having heterogeneous functions without diminishing the performance of the system. The proposed system is experimentally tested by navigating a mobile robot in real time using spoken language commands.",
keywords = "Fuzzy-neural network, Hidden Markov models, Robot control, Speech interfaces, Spoken language commands",
author = "Koliya Pulasinghe and Keigo Watanabe and Kiyotaka Izumi and Kazuo Kiguchi",
year = "2004",
month = feb,
doi = "10.1109/TSMCB.2003.811511",
language = "English",
volume = "34",
pages = "293--302",
journal = "IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics",
issn = "1083-4419",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "1",

}

TY - JOUR

T1 - Modular Fuzzy-Neuro Controller Driven by Spoken Language Commands

AU - Pulasinghe, Koliya

AU - Watanabe, Keigo

AU - Izumi, Kiyotaka

AU - Kiguchi, Kazuo

PY - 2004/2

Y1 - 2004/2

N2 - We present a methodology of controlling machines using spoken language commands. The two major problems relating to the speech interfaces for machines, namely, the interpretation of words with fuzzy implications and the out-of-vocabulary (OOV) words in natural conversation, are investigated. The system proposed in this paper is designed to overcome the above two problems in controlling machines using spoken language commands. The present system consists of a hidden Markov model (HMM) based automatic speech recognizer (ASR), with a keyword spotting system to capture the machine sensitive words from the running utterances and a fuzzy-neural network (FNN) based controller to represent the words with fuzzy implications in spoken language commands. Significance of the words, i.e., the contextual meaning of the words according to the machine's current state, is introduced to the system to obtain more realistic output equivalent to users' desire. Modularity of the system is also considered to provide a generalization of the methodology for systems having heterogeneous functions without diminishing the performance of the system. The proposed system is experimentally tested by navigating a mobile robot in real time using spoken language commands.

AB - We present a methodology of controlling machines using spoken language commands. The two major problems relating to the speech interfaces for machines, namely, the interpretation of words with fuzzy implications and the out-of-vocabulary (OOV) words in natural conversation, are investigated. The system proposed in this paper is designed to overcome the above two problems in controlling machines using spoken language commands. The present system consists of a hidden Markov model (HMM) based automatic speech recognizer (ASR), with a keyword spotting system to capture the machine sensitive words from the running utterances and a fuzzy-neural network (FNN) based controller to represent the words with fuzzy implications in spoken language commands. Significance of the words, i.e., the contextual meaning of the words according to the machine's current state, is introduced to the system to obtain more realistic output equivalent to users' desire. Modularity of the system is also considered to provide a generalization of the methodology for systems having heterogeneous functions without diminishing the performance of the system. The proposed system is experimentally tested by navigating a mobile robot in real time using spoken language commands.

KW - Fuzzy-neural network

KW - Hidden Markov models

KW - Robot control

KW - Speech interfaces

KW - Spoken language commands

UR - http://www.scopus.com/inward/record.url?scp=0842290820&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0842290820&partnerID=8YFLogxK

U2 - 10.1109/TSMCB.2003.811511

DO - 10.1109/TSMCB.2003.811511

M3 - Article

VL - 34

SP - 293

EP - 302

JO - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics

JF - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics

SN - 1083-4419

IS - 1

ER -