Paper Detail

Paper: PS-2A.18
Session: Poster Session 2A
Location: Symphony/Overture
Session Time: Friday, September 7, 17:15 - 19:15
Presentation Time: Friday, September 7, 17:15 - 19:15
Presentation: Poster
Publication: 2018 Conference on Cognitive Computational Neuroscience, 5-8 September 2018, Philadelphia, Pennsylvania
Paper Title: Intuitive Physical Inference from Sound
Authors: James Traer, Josh McDermott, Massachusetts Institute of Technology, United States
Abstract: When objects collide they produce sound. Just by listening, humans can estimate object material, size, and impact force. The physics of how these parameters interact to generate sound is well established, yet the inverse problem faced by listeners – inferring physical parameters from sound – remains poorly understood. One challenge is that every impact sound is created jointly by two objects, leaving the inference of any single object's properties ill-posed. Here we show that judgments of physical variables exhibit the phenomenon of explaining away, as might be expected if listeners perform inference in a generative model of acoustical physics. We presented listeners with recordings of a ball hitting a board. Listeners could identify the heavier of two balls, even when the two balls were dropped on boards differing in material – an ill-posed inference. To test whether listeners were implicitly estimating and discounting the board material, we altered the decay rates of the boards to imply a harder or softer material. These alterations affected mass comparisons despite not being directly informative about mass, suggesting that listeners factorize the acoustic contributions of the two objects. The results indicate that humans have internalized a generative model of impact sounds and use it to perform intuitive physical inference.