Enhancement of visual detection by temporal alignment of visual-auditory stimuli: A behavioral and event-related potential study

Jingjing Yang, Qi Li, Weiping Yang, Jinglong Wu

Research output: Contribution to journal › Article

Abstract

Many studies have shown that temporal asynchrony between visual and auditory stimuli can influence visual-auditory interaction; however, the multisensory mechanisms underlying asynchronous inputs are not well understood. In this study, visual and auditory stimuli were presented at several stimulus onset asynchronies (SOA = ±400 ms, ±150 ms, 0 ms), and only the visual stimulus was attended. Behavioral data and event-related potentials (ERPs) were recorded. We found that at an SOA of -150 ms, reaction times were fastest and the hit rate was highest, and the N1 latency of the bimodal AV response was earlier than that of the sum of the unimodal A and unimodal V responses in the auditory-preceding condition.

Original language: English
Pages (from-to): 527-534
Number of pages: 8
Journal: Information (Japan)
Volume: 16
Issue number: 1 A
Publication status: Published - Jan 2013

Keywords

  • Attention
  • Audio-visual integration
  • Event-related potential (ERP)
  • Temporal alignment

ASJC Scopus subject areas

  • Information Systems
