ReWaRD: Retinal Waves for Pre-Training Artificial Neural Networks Mimicking Real Prenatal Development (2311.17232v1)
Abstract: Computational models trained on large amounts of natural images are the state of the art for studying human vision, usually adult vision. Computational models of infant vision and its further development are gaining increasing attention in the community. In this work we target the very beginning of visual experience: pre- and post-natal retinal waves, which are thought to act as a pre-training mechanism for the primate visual system at a very early stage of development. We see this approach as an instance of a biologically plausible, data-driven inductive bias introduced through pre-training. We build a computational model that mimics this developmental mechanism by pre-training different artificial convolutional neural networks with simulated retinal wave images. The features resulting from this biologically plausible pre-training closely match the V1 features of the primate visual system. We show that the performance gain from pre-training with retinal waves is similar to that of a state-of-the-art pre-training pipeline. Our framework contains the retinal wave generator as well as a training strategy, which can serve as a first step in a curriculum-learning-based training diet for various models of development. We release code, data, and trained networks to lay the groundwork for future work on visual development based on a curriculum learning approach that includes prenatal development, supporting studies of innate vs. learned properties of the primate visual system. An additional benefit of our pre-trained networks for neuroscience and computer vision applications is the absence of biases inherited from datasets such as ImageNet.
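To make the pre-training idea concrete, the core of a retinal wave generator could be sketched as follows. This is a minimal, illustrative toy model (not the paper's actual simulator): activity fronts expand from random seed points across a simulated retina, loosely mimicking the propagating waves of ganglion-cell bursts; the resulting frames would then serve as pre-training inputs for a convolutional network. The function name and all parameters are assumptions introduced here for illustration.

```python
import numpy as np

def simulate_retinal_wave(size=64, steps=20, n_seeds=3, speed=1.5,
                          front_width=3.0, rng=None):
    """Toy retinal-wave frame generator (illustrative sketch only).

    Activity fronts expand from random seed points; each frame shows a
    narrow ring of activity around each expanding wave front.
    """
    rng = np.random.default_rng(rng)
    centers = rng.uniform(0, size, size=(n_seeds, 2))      # wave origins
    starts = rng.integers(0, steps // 2, size=n_seeds)     # onset times
    yy, xx = np.mgrid[0:size, 0:size]
    frames = np.zeros((steps, size, size), dtype=np.float32)
    for t in range(steps):
        for (cy, cx), t0 in zip(centers, starts):
            if t < t0:
                continue
            r = speed * (t - t0)                 # current front radius
            d = np.hypot(yy - cy, xx - cx)       # distance from origin
            # activity is concentrated in a ring around the wave front
            frames[t] += np.exp(-((d - r) ** 2) / (2 * front_width ** 2))
        np.clip(frames[t], 0.0, 1.0, out=frames[t])
    return frames

waves = simulate_retinal_wave(rng=0)
print(waves.shape)  # (20, 64, 64)
```

A pre-training run would treat such frames like any other image dataset (e.g. feeding them to a ResNet with a self-supervised or reconstruction objective) before fine-tuning on natural images; the paper's released code and rwave datasets provide the actual pipeline.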