In everyday life, our brains integrate information from different sensory modalities to perceive our complex environment. Temporal synchrony of audiovisual stimuli is required for audiovisual integration. Many studies have shown that temporal asynchrony between visual and auditory stimuli can influence audiovisual interaction; however, the multisensory mechanisms underlying asynchronous inputs are not well understood. In the present study, visual and auditory stimuli were presented at varying stimulus onset asynchronies (SOA = 0, ±50, ±100, ±150, ±200, ±250 ms), and only the auditory stimulus was attended. The behavioral results showed that responses to temporally asynchronous audiovisual stimuli were more accurate than responses to unimodal auditory stimuli. The largest enhancement occurred at SOA = -100 ms (visual stimulus leading), which also yielded the fastest reaction times. These results reveal a basis of audiovisual interaction when audiovisual stimuli are presented at different SOAs: appropriate temporal alignment of visual and auditory stimuli can enhance auditory detection. The study may also provide a theoretical basis for artificial intelligence applications.