Dataset for: Separable neurocomputational mechanisms underlying multisensory learning

Authors: Saurabh Bedi, Ella Casimiro, Gilles de Hollander, Nina Raduner, Fritjof Helmchen, Silvia Brem, Arkady Konovalov, Christian C. Ruff

DOI: 10.1101/2025.11.18.688925
This dataset contains neuroimaging and behavioral data from a study investigating the distinct but interacting neurocomputational mechanisms that support learning of multisensory associations. The study addresses how the brain efficiently controls behavior by integrating information across multiple senses, moving beyond the traditional focus on unisensory signals. The dataset includes:
- Behavioral data: Task performance metrics, response times, and trial-level details.
- Neuroimaging data: fMRI scans (BOLD signals) acquired during multisensory learning tasks.
- Stimuli: Audiovisual stimuli used in the experimental paradigm.
- Metadata: Participant demographics, experimental conditions, and task parameters.
This dataset is intended for researchers interested in multisensory integration, computational neuroscience, and learning mechanisms. It is formatted according to the BIDS (Brain Imaging Data Structure) standard for easy integration with analysis pipelines.
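Because the data follow the BIDS standard, file names encode subject, task, and run as underscore-separated key-value entities (e.g. `sub-01_task-X_run-1_bold.nii.gz`). A minimal sketch of parsing such a filename into its entities is shown below; the task label `multisensory` is a hypothetical placeholder, not necessarily the label used in this dataset.

```python
import re

# A BIDS filename is a chain of key-value entities separated by underscores,
# ending in a suffix (bold, events, ...) and a file extension.
ENTITY = re.compile(r"(?P<key>[a-zA-Z0-9]+)-(?P<value>[a-zA-Z0-9]+)")

def parse_bids_filename(filename: str) -> dict:
    """Split a BIDS filename into an entity dictionary plus its suffix."""
    stem = filename.split(".")[0]      # drop the extension (.nii.gz, .tsv, ...)
    parts = stem.split("_")
    entities = {}
    for part in parts[:-1]:            # all parts but the last are key-value pairs
        m = ENTITY.fullmatch(part)
        if m:
            entities[m.group("key")] = m.group("value")
    entities["suffix"] = parts[-1]     # the last part is the suffix
    return entities

# Hypothetical example filename, for illustration only:
print(parse_bids_filename("sub-01_task-multisensory_run-1_bold.nii.gz"))
# → {'sub': '01', 'task': 'multisensory', 'run': '1', 'suffix': 'bold'}
```

For real analyses, a BIDS-aware library (such as pybids in Python) handles this indexing automatically across the whole dataset.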
If you use this dataset, please cite the original paper:
Bedi, S., Casimiro, E., de Hollander, G., Raduner, N., Helmchen, F., Brem, S., Konovalov, A., & Ruff, C. C. (2025). Separable neurocomputational mechanisms underlying multisensory learning. bioRxiv, 2025.11.18.688925. https://doi.org/10.1101/2025.11.18.688925
This dataset is shared under a CC-BY 4.0 license.
For questions or collaborations, please contact: Gilles de Hollander