This repository contains the code and data associated with our preprint "Fusing Multisensory Signals Across Channels and Time" (Anil, Ghosh & Goodman, 2024). The work investigates how temporal structure in multisensory signals can be leveraged in integration strategies, introducing novel computational models for analyzing sensory fusion across multiple timescales.
We extend previous work on multisensory integration by examining how temporal dependencies affect signal-processing strategies. The repository includes implementations of:
- Time-dependent detection tasks with variable burst lengths
- Linear and nonlinear fusion algorithms (LF, NLF)
- Sliding window integration models (NLFw)
- Recurrent neural network architectures
- Lévy flight-based signal generation
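To make the distinction between the fusion strategies concrete, here is a minimal sketch of linear versus nonlinear fusion of two noisy channels. The function names and the specific multiplicative interaction term are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def linear_fusion(x_a, x_v, w_a=0.5, w_v=0.5):
    """LF sketch: a weighted sum of the two channel signals."""
    return w_a * x_a + w_v * x_v

def nonlinear_fusion(x_a, x_v):
    """NLF sketch: adds a multiplicative interaction term, so coincident
    activity in both channels is weighted supra-linearly. (Illustrative
    choice of nonlinearity, assumed for this example.)"""
    return x_a + x_v + x_a * x_v

# Two noisy channels carrying a common burst
rng = np.random.default_rng(0)
t = np.arange(200)
burst = ((t > 80) & (t < 120)).astype(float)
x_a = burst + 0.5 * rng.standard_normal(t.size)
x_v = burst + 0.5 * rng.standard_normal(t.size)

lf = linear_fusion(x_a, x_v)
nlf = nonlinear_fusion(x_a, x_v)
```

A sliding-window variant (NLFw) would apply the same nonlinearity to signals averaged over a moving time window rather than to instantaneous samples.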
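The Lévy-flight-based signal generation can be sketched as drawing heavy-tailed inter-event intervals, which produces bursty temporal structure across multiple timescales. The function names, the tail exponent, and the Pareto-tail approximation below are assumptions for illustration, not the repository's exact generator:

```python
import numpy as np

def levy_steps(n, alpha=1.5, rng=None):
    """Draw n heavy-tailed step sizes by inverse-transform sampling a
    Pareto-like tail, P(S > x) ~ x**(-alpha). Small alpha -> heavier tail."""
    rng = rng or np.random.default_rng()
    u = rng.uniform(1e-9, 1.0, size=n)  # avoid u == 0 (infinite step)
    return u ** (-1.0 / alpha)

def levy_burst_times(t_max, alpha=1.5, rng=None):
    """Event times whose inter-event intervals are heavy-tailed, giving
    clustered bursts separated by occasional long gaps."""
    rng = rng or np.random.default_rng()
    times, t = [], 0.0
    while t < t_max:
        t += levy_steps(1, alpha, rng)[0]
        times.append(t)
    return np.array(times[:-1])  # drop the event past t_max
```

For example, `levy_burst_times(1000.0)` returns a strictly increasing array of event times within the interval, which could then be convolved with a burst waveform to form a stimulus channel.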
A complete environment specification is provided in `environment.yml`. Key dependencies include:
- NumPy
- PyTorch
- scikit-learn
- Pandas
- Matplotlib
To reproduce the main results from our paper:
- Install the dependencies: `conda env create -f environment.yml`
- Run the experiment scripts in `scripts/`
- Generate the figures using the notebooks in `Plotter/`
This project is licensed under the MIT License; see the LICENSE file for details.
For questions about the code or paper, please contact:
- Neural Reckoning Group (https://neural-reckoning.org/)