Auralizations have become more prevalent in architectural acoustics. In listening tests, auralizations are typically presented in a uni-modal fashion (audio only), whereas in everyday life one perceives complex multi-modal information. Multi-sensory research has shown that visual cues can influence auditory perception, as in the McGurk and ventriloquist effects. However, few studies have investigated the influence of visual cues on room acoustic perception, and in the majority of previous studies the visual stimuli were photographs, presented either with or without a visible source. Previously, a virtual reality framework combining a visible animated source in a virtual room with auralizations was conceived, enabling multi-modal assessments. The framework is based on BlenderVR for scene-graph management and visual rendering, with MaxMSP handling real-time audio rendering of 3rd-order higher-order Ambisonics (HOA) room impulse responses (RIRs) with tracked binaural reproduction. CATT-Acoustic TUCT was used to generate the HOA RIRs. Using this framework…
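The audio stage described above (HOA RIRs from CATT-Acoustic TUCT, rotated to follow the listener's head and decoded to binaural) can be illustrated with a minimal offline sketch. This is not the authors' MaxMSP implementation: the Python code, the function names, the decoder matrix, and the HRIR data are assumptions introduced only to show the general rotate-decode-convolve technique for head-tracked binaural rendering of 3rd-order Ambisonic RIRs.

```python
# Hypothetical sketch of head-tracked binaural rendering of a 3rd-order
# Ambisonic (HOA) room impulse response.  Decoder matrix and HRIRs are
# assumed to be supplied by the caller; ACN channel ordering is assumed.
import numpy as np
from scipy.signal import fftconvolve

ORDER = 3
N_CH = (ORDER + 1) ** 2          # 16 HOA channels for 3rd order


def acn(l, m):
    """ACN channel index for spherical-harmonic degree l, order m."""
    return l * (l + 1) + m


def rotate_yaw(hoa, yaw):
    """Rotate an ACN-ordered HOA block (n_samples, 16) about the vertical
    axis by `yaw` radians (e.g. minus the tracked head yaw)."""
    out = hoa.copy()
    for l in range(1, ORDER + 1):
        for m in range(1, l + 1):
            c, s = np.cos(m * yaw), np.sin(m * yaw)
            pos, neg = acn(l, m), acn(l, -m)
            out[:, pos] = c * hoa[:, pos] - s * hoa[:, neg]
            out[:, neg] = s * hoa[:, pos] + c * hoa[:, neg]
    return out


def binaural_render(dry, hoa_rir, decoder, hrirs, head_yaw):
    """dry: (n,) anechoic source signal; hoa_rir: (m, 16) HOA RIR
    (e.g. exported from a room-acoustic simulation); decoder: (n_spk, 16)
    matrix mapping HOA to virtual loudspeakers; hrirs: (n_spk, 2, k)
    left/right HRIR pairs for the virtual loudspeaker directions."""
    # 1. Convolve the dry signal with every channel of the HOA RIR.
    wet = np.stack(
        [fftconvolve(dry, hoa_rir[:, ch]) for ch in range(N_CH)], axis=1)
    # 2. Counter-rotate the sound field so it stays world-fixed as the head turns.
    wet = rotate_yaw(wet, -head_yaw)
    # 3. Decode to virtual loudspeakers, convolve with HRIRs, and sum.
    feeds = wet @ decoder.T                       # (n_samples, n_spk)
    left = sum(fftconvolve(feeds[:, s], hrirs[s, 0])
               for s in range(hrirs.shape[0]))
    right = sum(fftconvolve(feeds[:, s], hrirs[s, 1])
                for s in range(hrirs.shape[0]))
    return np.stack([left, right], axis=1)
```

In a real-time renderer such as the MaxMSP patch mentioned in the abstract, the rotation and decode steps would run block-wise with the yaw updated from the head tracker each audio block; the offline form above only illustrates the signal flow.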