The component list of the human brain defies its own comprehension: tens of billions of neurons, each with its own complicated set of dynamics and interactions. This complexity calls for general principles that inform our understanding of how the brain processes information, makes decisions, and controls movement. Researchers in Computational Neuroscience and Behavior view these potential principles through the lens of concepts from the mathematical, physical, and computational sciences, aiming not only to use the tools developed in these fields, but also to adopt the quantitative framing of scientific inquiry that has long permeated these other areas. Common investigations include measuring the properties of single neurons and their local connections and building realistic models of them, as well as large-scale neural recording and imaging, and the study of cognition, higher-level neural processing, and the neurophysiological mechanisms underlying them. The eventual goal for many of these studies is to understand how behavior and cognition emerge from these myriad components, a process aided by making quantitative measurements of movements, language, and social interactions. Taken together, these interwoven lines of inquiry create a deeper understanding of the brain and point toward mechanisms and potential treatments for disorders of the nervous system.
A reliable quantitative description of the complex dynamics of living systems faces many peculiarities. A living system is typically multi-physics and multi-scale: the mathematical description of the human heart, for instance, requires combining solid mechanics, fluid dynamics, and electromagnetism, and the resulting models must span scales ranging from the single cell to the whole body. These challenges translate into an intrinsic computational complexity that demands appropriate numerical methods and high-performance computing resources. We address these problems with advanced methods for the solution of nonlinear differential equations, large and properly preconditioned linear systems, adaptive techniques, and reduced-order modeling on parallel architectures. One distinctive feature of the group is its attention to clinical problems that may benefit from quantitative models. The connection to clinical practice (in particular to cardiovascular disease) leads to new problems in image processing and parameter identification, combined with stochastic approaches for the quantification of uncertainty.
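To make the idea of preconditioned iterative solvers concrete, here is a minimal sketch (not the group's actual pipeline): SciPy's conjugate gradient method applied to a small symmetric positive-definite system, with a simple Jacobi (diagonal) preconditioner. The matrix, sizes, and coefficients are illustrative assumptions standing in for the far larger systems that arise from discretizing cardiac models.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, LinearOperator

# A sparse SPD system: a 1-D reaction-diffusion stencil standing in for
# the much larger matrices produced by discretizing a heart model.
n = 2000
main = 2.0 + np.linspace(0.0, 5.0, n)  # variable reaction term on the diagonal
off = -np.ones(n - 1)
A = diags([off, main, off], [-1, 0, 1], format="csr")
b = np.ones(n)

# Jacobi (diagonal) preconditioner: applying M cheaply approximates A^{-1}.
d = A.diagonal()
M = LinearOperator((n, n), matvec=lambda x: x / d)

x, info = cg(A, b, M=M)  # preconditioned conjugate gradient; info == 0 on success
residual = np.linalg.norm(A @ x - b)
```

In practice the preconditioner, not the Krylov iteration itself, usually determines whether such solves scale; Jacobi is shown here only because it is the simplest choice to write down.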
One of the primary challenges facing biological science is how to parse data of ever-increasing scope and complexity, aiming to discover its underlying structure and generate insight into fundamental questions. One can think of this challenge as a process of abstraction: taking data that contains many seemingly disparate elements and describing it using only a few numbers that can then be used to gain scientific insight. This type of theory and analysis, typically referred to as Dimensionality Reduction, has long been a hallmark of fields like statistical physics, where the behavior of myriad individual molecules is often described by a few variables like temperature and pressure. Biological systems, however, often consist of components and dynamics of sufficient complexity that we must first rely on approaches that find structure directly in the data. These methods, which Emory researchers are applying to a disparate array of biological systems, ranging from the genetic contents of a cell to the firing of neurons to the movements of a limb, rely on ideas from applied mathematics, computer science, and physics. Through discovering these relatively simple patterns in complex data, these studies are building our intuition, generating new models, and laying the basis for further theoretical insight across all of the areas of TMLS. Moreover, because animals themselves must learn to parse a wide array of sensory input from their environment, the methods developed here may also provide insight into how biological systems handle complexity, extracting essential information from vast streams of data in order to make decisions and survive.
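A toy sketch of this kind of abstraction, assuming synthetic data (all sizes and names here are illustrative): principal component analysis, computed via the singular value decomposition, recovers a three-variable description of a fifty-channel "recording" that is secretly driven by three latent signals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "recording": 500 samples of 50 channels whose variability is
# driven by only 3 latent signals, a stand-in for high-dimensional
# biological data such as population neural activity.
latents = rng.normal(size=(500, 3))
mixing = rng.normal(size=(3, 50))
data = latents @ mixing + 0.1 * rng.normal(size=(500, 50))

# PCA via the SVD of the mean-centered data matrix.
centered = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)  # fraction of variance per component

# A few numbers per sample now describe the 50-dimensional data:
low_dim = centered @ Vt[:3].T  # 500 x 3 reduced representation
```

The first three components capture nearly all of the variance here by construction; real biological data rarely yields so cleanly, which is precisely why methods that find structure directly in the data are an active research area.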