Brain Comp Infra: Workshop on Research Methods and Infrastructure for Scalable Computing in Neuroscience

2016 - present

The scientific endeavor to understand the brain is fraught with apparently intractable computational problems. Neural processes involve billions of processing units (i.e., neurons) interacting across many scales of space and time, and the ideal models for describing neuroscientific data often outstrip the limits of the infrastructure commonly accessible to neuroscientists. However, neuroscientists are often unaware of the most scalable available computing approaches and the matched infrastructure that could remove these barriers to discovery. For example, methods that examine individual brain images (or their sub-components), or that repeat a calculation over a parameter space, are likely to scale well on infrastructure optimized for high-throughput computing (HTC), while being severely limited on the desktop and high-performance computing (HPC) configurations most commonly available to neuroscientists. Furthermore, neuroscientists working at this frontier typically develop lab-specific solutions that are poorly documented, leading to a balkanization of knowledge and methods with little ability to replicate or compare results across labs. To address these challenges, we are presenting a pair of workshops to establish best practices for selecting and implementing scalability approaches across prominent neuroscience research methods, and to disseminate information about new methods that make the best use of these resources.
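To illustrate why such workloads suit HTC, the following is a minimal sketch (not taken from the workshop materials): each (subject, parameter) pair is an independent task that reads its own input and writes its own output, so the sweep can fan out across many modest machines rather than requiring a tightly coupled HPC allocation. The subject IDs, kernel values, and the analyze() placeholder are hypothetical, introduced only for illustration.

    """Illustrative sketch: an embarrassingly parallel parameter sweep over
    per-subject brain images, the kind of workload that maps naturally onto
    high-throughput computing (HTC) resources."""

    import itertools
    import sys

    # Hypothetical inputs: subject identifiers and a smoothing-kernel grid.
    SUBJECTS = ["sub-01", "sub-02", "sub-03"]
    KERNELS_MM = [4.0, 6.0, 8.0]


    def analyze(subject: str, kernel_mm: float) -> str:
        """Placeholder for a single-subject analysis step. It touches only that
        subject's data and writes its own result, so tasks never communicate."""
        return f"{subject}_kernel{kernel_mm}.result"


    def main() -> None:
        # Each job instance receives an integer task index (the pattern an HTC
        # scheduler typically supplies) and processes exactly one task.
        task_id = int(sys.argv[1]) if len(sys.argv) > 1 else 0
        tasks = list(itertools.product(SUBJECTS, KERNELS_MM))
        subject, kernel = tasks[task_id]
        print(analyze(subject, kernel))


    if __name__ == "__main__":
        main()

Because the tasks share no state, an HTC scheduler can run them wherever capacity appears; the same sweep run serially on a desktop, or as a single tightly coupled HPC job, gains nothing from that independence.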