TY - JOUR
T1 - Crossmodal integration improves sensory detection thresholds in the ferret
JF - bioRxiv
DO - 10.1101/014407
SP - 014407
AU - Karl J. Hollensteiner
AU - Florian Pieper
AU - Gerhard Engler
AU - Peter König
AU - Andreas K. Engel
Y1 - 2015/01/01
UR - http://biorxiv.org/content/early/2015/01/26/014407.abstract
N2 - During the last decade, ferrets (Mustela putorius) were established as a highly efficient animal model in different fields of neuroscience. Here we asked whether ferrets merge sensory information according to the same principles established for other species. Since only a few methods and protocols exist for behaving ferrets, we developed a head-free, body-restrained approach that allows a standardized stimulation position and the utilization of the ferret’s natural response behaviour. We established a behavioural paradigm to test audiovisual integration in the ferret. Animals had to detect a brief auditory and/or visual stimulus presented either left or right of their midline. We first determined detection thresholds for auditory amplitude and visual contrast. In a second step, we combined both modalities and compared psychometric fits and reaction times across all conditions. We employed Maximum Likelihood Estimation (MLE) to model bimodal psychometric curves and to investigate whether ferrets integrate modalities in an optimal manner. Furthermore, to test for a redundant signal effect, we pooled the reaction times of all animals to calculate a race model. We observed that bimodal detection thresholds were reduced and reaction times were faster in the bimodal compared to the unimodal conditions. The race model and MLE modeling showed that ferrets integrate modalities in a statistically optimal fashion. Taken together, the data indicate that principles of multisensory integration previously demonstrated in other species also apply to crossmodal processing in the ferret.
ER -