Mechanisms of visual-auditory temporal processing for artificial intelligence

Jingjing Yang, Qi Li, Xiujun Li, Jinglong Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In everyday life, our brains integrate various kinds of information from different modalities to perceive our complex environment. Temporal synchrony of audiovisual stimuli is required for audiovisual integration. Many studies have shown that temporal asynchrony of visual-auditory stimuli can influence the interaction between visual and auditory stimuli; however, the multisensory mechanisms underlying asynchronous inputs are not well understood. In the present study, visual and auditory stimuli were presented with varying stimulus onset asynchrony (SOA = ±250 ms, ±200 ms, ±150 ms, ±100 ms, ±50 ms, 0 ms), and only the auditory stimulus was attended. The behavioral results showed that responses to temporally asynchronous audiovisual stimuli were more accurate than responses to unimodal auditory stimuli. The greatest enhancement occurred in the SOA = -100 ms condition (visual stimulus preceding), for which reaction times were fastest. These results reveal the basis of audiovisual interaction when audiovisual stimuli are presented at different SOAs. Temporal alignment of visual-auditory stimuli can enhance auditory detection. The study offers a basic theory for artificial intelligence.
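The experimental design above varies stimulus onset asynchrony symmetrically around synchrony. A minimal sketch of how the condition set could be enumerated (the function name and sign convention — negative SOA meaning the visual stimulus precedes the auditory one — are illustrative assumptions, not from the paper):

```python
def soa_conditions():
    """Enumerate the SOA conditions from the abstract, in milliseconds.

    Negative values: visual stimulus precedes auditory; positive values:
    auditory precedes visual; 0 is the synchronous condition.
    """
    magnitudes = [250, 200, 150, 100, 50]
    soas = {m for m in magnitudes} | {-m for m in magnitudes} | {0}
    return sorted(soas)

print(soa_conditions())
# 11 conditions: [-250, -200, -150, -100, -50, 0, 50, 100, 150, 200, 250]
```

This yields the 11 conditions (±250, ±200, ±150, ±100, ±50, and 0 ms) described in the abstract.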

Original language: English
Title of host publication: Proceedings - 2014 7th International Conference on BioMedical Engineering and Informatics, BMEI 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 724-728
Number of pages: 5
ISBN (Electronic): 9781479958382
DOIs: 10.1109/BMEI.2014.7002868
Publication status: Published - 2014
Event: 2014 7th International Conference on BioMedical Engineering and Informatics, BMEI 2014 - Dalian, China
Duration: Oct 14, 2014 to Oct 16, 2014

Keywords

  • Audiovisual integration
  • Multimodal
  • Temporal alignment

ASJC Scopus subject areas

  • Signal Processing
  • Health Information Management
  • Information Systems
  • Biomedical Engineering
  • Health Informatics

Cite this

Yang, J., Li, Q., Li, X., & Wu, J. (2014). Mechanisms of visual-auditory temporal processing for artificial intelligence. In Proceedings - 2014 7th International Conference on BioMedical Engineering and Informatics, BMEI 2014 (pp. 724-728). [7002868] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/BMEI.2014.7002868

Scopus record: http://www.scopus.com/inward/record.url?scp=84988268372&partnerID=8YFLogxK (AN: SCOPUS:84988268372)