This is the repo for the following paper:
Lu, Q., Chen, P.H., Pillow, J. W., Ramadge, P. J., Norman, K. A., & Hasson, U. (2018).
Shared Representational Geometry Across Neural Networks. arXiv [cs.LG].
Retrieved from http://arxiv.org/abs/1811.11684
One-sentence summary: different neural networks with the same learning experience acquire representations of the same "shape".
Here's a figure showing the activity trajectories from 5 ResNets as they view the same sequence of images.
- Fig left: before alignment; activity trajectories in their native spaces
- Fig right: after alignment (a minimal alignment sketch is given below); it is clear that the geometry of their representations is highly similar
- Here are more animations.
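For reference, here's a minimal sketch of how activity matrices from several networks could be aligned into a shared space with BrainIAK's SRM [4]. The array shapes, unit counts, and number of shared features are illustrative assumptions, not the exact settings used in the paper.

```python
import numpy as np
from brainiak.funcalign.srm import SRM

# Illustrative shapes: 5 networks, each with a (n_units x n_images) activity
# matrix for the same image sequence; unit counts can differ across networks.
rng = np.random.RandomState(0)
n_images = 200
acts = [rng.randn(n_units, n_images) for n_units in (64, 64, 128, 128, 256)]

# Fit a shared response model: it learns one orthogonal basis per network that
# maps that network's native space into a common k-dimensional shared space.
srm = SRM(n_iter=20, features=10)   # k = 10 shared dimensions (an assumption)
srm.fit(acts)

# Project each network's activity into the shared space; after this step the
# trajectories can be plotted together (e.g. with hypertools [3]).
aligned = srm.transform(acts)       # list of (10 x n_images) matrices
```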
- run_sim.ipynb: run the simulation described in the paper (a toy sketch of the data/model pieces follows the note below)
- data_gen.py: make a toy data set to train NNs
- models.py: define a simple neural network
*The notebooks are not runnable yet, since they depend on some pre-computed data. I'm working on an easy way to host the data publicly; re-running the whole analysis from scratch should nonetheless be possible.
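As a rough illustration of what data_gen.py and models.py provide, here's a self-contained sketch; the function names and the toy task are made up for this example, so see the actual scripts for the real setup.

```python
import numpy as np
from tensorflow import keras

def make_toy_data(n_per_class=500, n_classes=4, dim=2, seed=0):
    """Hypothetical stand-in for data_gen.py: sample points from Gaussian blobs."""
    rng = np.random.RandomState(seed)
    centers = 3.0 * rng.randn(n_classes, dim)
    X = np.vstack([c + rng.randn(n_per_class, dim) for c in centers])
    y = np.repeat(np.arange(n_classes), n_per_class)
    return X, y

def build_simple_net(dim=2, n_hidden=32, n_classes=4):
    """Hypothetical stand-in for models.py: a small fully connected classifier."""
    model = keras.Sequential([
        keras.layers.Dense(n_hidden, activation='relu', input_shape=(dim,)),
        keras.layers.Dense(n_classes, activation='softmax'),
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    return model

# Train one simple network on the toy data; repeating this with different seeds
# gives the multiple networks whose representations are compared in the paper.
X, y = make_toy_data()
model = build_simple_net()
model.fit(X, y, epochs=10, batch_size=32, verbose=0)
```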
- show_*.ipynb: load some pre-computed data (e.g. activity from some pre-trained neural networks), apply certain analyses (e.g. SRM), then plot the results
- train_*.py: train some models (e.g. conv nets) on some dataset (e.g. cifar10)
- save_acts_cifar.py: test and save neural network activity matrices
- run_analyses.py: run SRM, RSA, etc. (an RSA sketch is given after this list)
- models.py: some models (e.g. conv nets)
- resnet.py: resnets from raghakot/keras-resnet [2]
- config.py: define some constants, such as how to re-arrange the ordering of the images in cifar
- data_loader.py: util for loading data
- qmvpa: contains some analysis util functions
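To give a feel for the RSA part of run_analyses.py, here's a minimal, self-contained sketch of comparing two networks' representational geometry. The random activity matrices are placeholders, and the helper name `rdm` is made up for this example; the actual script also runs SRM and other analyses.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# In the repo, activity matrices come from save_acts_cifar.py; layer activations
# can be pulled out of a trained Keras model with keract [1], roughly:
#   acts_by_layer = keract.get_activations(model, images)
# Here we use random matrices as placeholders: rows are images, columns are units.
rng = np.random.RandomState(0)
acts_net1 = rng.randn(100, 64)    # 100 images x 64 units
acts_net2 = rng.randn(100, 128)   # 100 images x 128 units

def rdm(acts):
    """Representational dissimilarity matrix (condensed form):
    pairwise (1 - Pearson correlation) between image-evoked activity patterns."""
    return pdist(acts, metric='correlation')

# Second-order similarity: Spearman correlation between the two networks' RDMs.
rho, _ = spearmanr(rdm(acts_net1), rdm(acts_net2))
print('RSA similarity between the two networks:', rho)
```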
[1] philipperemy/keract
[2] raghakot/keras-resnet
[3] hypertools
[4] BrainIAK