Mapping the Structure of Thought

Understanding the structure and function of the nervous system is an exceptionally complex task: the system consists of thousands of cells connected to thousands of other cells in microscopic networks that extend over large volumes and exhibit a seemingly endless variety of behaviors. We believe that mapping such networks at the level of synaptic connections, and understanding the relation of their connectivity and geometry to function, will play a key role in unraveling the mystery of thought.

Our group’s goal is to create, based on such microscopic connectivity and functional data, new mathematical models explaining how neural tissue computes. Our modeling spans the connectomics gamut, from the behavior of individual neurons in small circuits to collections of neurons in increasingly complex networks. We collaborate with neurobiologists to design experiments based on our theoretical models, and work extensively to analyze the resulting data in order to confirm or refute our theoretical predictions.


March 27, 2018: Blog post “Deep Learning to Study the Brain to Improve Deep Learning” is live.

January 2017: The Shavit Lab’s PPoPP 2017 paper, “A Multicore Path to Connectomics-on-Demand,” was selected as a Best Paper nominee.

June 3, 2016, MIT Commencement: Congratulations to new graduates Gergely Odor and Hayk Saribekyan! (Pictured: Hayk Saribekyan and Professor Nir Shavit.)

February 2016: The Shavit Lab has been awarded research funding under the IARPA Machine Intelligence from Cortical Networks (MICrONS) project.

October 2015: Brain-like chip.


High Throughput Connectomics

The current design trend in large-scale machine learning is to use distributed clusters of CPUs and GPUs with MapReduce-style programming. Some have been led to believe that this type of horizontal scaling can reduce or even eliminate the need for traditional algorithm development, careful parallelization, and performance engineering. This paper is a case study showing the contrary: that the benefits of algorithms, parallelization, and performance engineering can sometimes be so vast that it is possible to solve “cluster-scale” problems on a single commodity multicore machine. Connectomics is an emerging area of neurobiology that uses cutting-edge machine learning and image processing…


Yaron Meirovitch

Graduate Students

Heather Berlin
Michael Coulombe
Rati Gelashvili
Justin Kopinsky
Shibani Santurkar
Hayk Saribekyan
David Rolnick
David Budden
Jonathan Stoller
Gergely Odor
Victor Jakubiuk
Quan Nguyen
Robert Radway


Witvliet, Daniel, Mulcahy, Ben, Mitchell, James K., Meirovitch, Yaron, Berger, Daniel R., Holmyard, Douglas, Schalek, Richard L., Cook, Steven J., Koh, Wan Xian, Neubauer, Marianna, Rehaluk, Christine, Wang, ZiTong, Kersen, David, Chisholm, Andrew D., Shavit, Nir, Lichtman, Jeffrey W., Samuel, Aravinthan, and Zhen, Mei. Invariant, stochastic, and developmentally regulated synapses constitute the C. elegans connectome from isogenic individuals. Poster presentation at Cosyne 2019.

Meirovitch, Yaron, Mi, Lu, Saribekyan, Hayk, Matveev, Alexander, Rolnick, David, Wierzynski, Casimir, and Shavit, Nir. Cross-Classification Clustering: An Efficient Multi-Object Tracking Technique for 3-D Instance Segmentation in Connectomics. CoRR abs/1812.01157, 2018.

Santurkar, Shibani, Budden, David M., and Shavit, Nir. Generative Compression. PCS 2018.


No resources yet, but check back soon!