Research highlights: Professor Wins Top National Science Foundation Award, The College Today

Research Program

My research is in computational neuroscience, an interdisciplinary approach to analyzing, modeling, and understanding cells and neural networks in the brain.

Computational neuroscience is an interdisciplinary science that links neuroscience, cognitive science, and psychology with electrical engineering, computer science, mathematics, and physics. It is grounded in neurobiology and models biological systems at multiple spatial and temporal scales, from membrane currents, proteins, and chemical coupling to network oscillations, columnar and topographic architecture, and learning and memory. Because it focuses on the neurobiology of brain function, it differs significantly from engineering approaches such as machine learning, artificial neural networks, and statistical learning theory. The term "computational neuroscience" was first used by Eric L. Schwartz in 1985 to summarize work done in neural modeling, brain theory, and neural networks.

One of the main themes of my research in computational neuroscience is deriving mathematical criteria for neural network stability using the phase resetting curve method. There are billions of neurons in our brain, with different morphologies and functions and expressing different ionic channels. Neurons are connected in neural networks and communicate using electrical and chemical signals. Despite these variations, neural networks that perform the same function exist across different species, e.g., the circadian rhythm network.

Neural networks can generate stable, phase-locked rhythms that are essential for biological processes. Synchrony, the best-known example of phase-locked activity in neural networks, is the foundation of complex biological phenomena such as memory, facial recognition, circadian rhythms, and epileptic seizures.

A simplified yet robust modeling approach to phase-locked mode prediction assumes that the individual neurons are characterized only by the amount of delay or advance induced in their firing frequency by external perturbations, the so-called phase resetting curve (PRC) theory. The stable oscillatory activity of a neuron is represented in the space of its state variables, such as the membrane voltage or the ionic conductances, by a closed trajectory called a limit cycle. During one cycle of activity, the figurative point that describes the neuron's state moves along the entire limit cycle with a velocity that is always tangent to it.
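As a concrete illustration of the PRC idea, the sketch below perturbs a generic quadratic integrate-and-fire pacemaker at different phases and records the resulting advance or delay of the perturbed cycle. The model and all parameter values are illustrative assumptions, not the conductance-based neurons studied in the lab.

```python
import numpy as np

# Minimal PRC sketch: perturb a simple pacemaker at different phases and
# measure the normalized shortening/lengthening of the perturbed cycle.

def run_qif(i_ext=100.0, v_reset=-10.0, v_spike=10.0, dt=1e-4, t_max=2.0,
            pulse_time=None, pulse_amp=2000.0, pulse_width=1e-3):
    """Integrate dv/dt = v^2 + i_ext with an optional brief current pulse."""
    v, t, spikes = v_reset, 0.0, []
    while t < t_max:
        i = i_ext
        if pulse_time is not None and pulse_time <= t < pulse_time + pulse_width:
            i += pulse_amp                      # brief external perturbation
        v += dt * (v * v + i)
        t += dt
        if v >= v_spike:                        # spike and reset
            spikes.append(t)
            v = v_reset
    return np.array(spikes)

# Unperturbed period of the free-running pacemaker
free = run_qif()
T0 = np.mean(np.diff(free))

# Perturb the third cycle at different phases; the normalized change of that
# cycle's length is the PRC: positive values are advances, negative delays.
for phi in np.linspace(0.05, 0.95, 10):
    spikes = run_qif(pulse_time=free[2] + phi * T0)
    prc = (T0 - (spikes[3] - spikes[2])) / T0
    print(f"phase {phi:.2f} -> resetting {prc:+.3f}")
```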

I investigated the stability of neural networks against external perturbations and proposed new analytical and computational techniques in this field. There are many reliable PRC-based predictors for tangent perturbations to the limit cycle, which instantly move the figurative point forward or backward along the steady phase-space trajectory, including some developed by our research group. The effect of normal perturbations to the limit cycle, which move the figurative point off the unperturbed phase-space trajectory, is much harder to predict.

I applied a unitary transform to the moving reference frame of the figurative point, which, for planar limit cycles, leaves only two possible directions relative to the unperturbed trajectory, i.e., normal or tangent. As a result, there are four possible combinations of perturbations and their corresponding effects. I found that a tangent perturbation produces no normal displacement and analytically determined the other three coupling parameters. Our previous results confirm the tangential effect (speeding up or slowing down of neural oscillators) in response to tangent perturbations. Another significant result is that these findings apply to arbitrary stimulus shapes.
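The tangent/normal decomposition in the moving reference frame can be sketched numerically. The van der Pol oscillator and the particular perturbation below are generic stand-ins chosen only for illustration, not the specific neural models analyzed in my work.

```python
import numpy as np

# Moving-reference-frame sketch for a planar limit cycle: at each point the
# unit tangent and unit normal span the plane, so any perturbation splits
# uniquely into a tangent part (advancing/delaying the phase) and a normal
# part (pushing the state off the unperturbed orbit).

def vdp(state, mu=1.0):
    x, y = state
    return np.array([y, mu * (1.0 - x * x) * y - x])

def rk4_step(f, s, dt):
    k1 = f(s); k2 = f(s + 0.5 * dt * k1)
    k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

s = np.array([2.0, 0.0])
for _ in range(20000):                          # discard the transient
    s = rk4_step(vdp, s, 1e-3)

velocity = vdp(s)                               # tangent to the limit cycle
tangent = velocity / np.linalg.norm(velocity)
normal = np.array([-tangent[1], tangent[0]])    # 90-degree rotation

perturbation = np.array([0.05, -0.02])          # arbitrary external kick
print(f"tangent component: {perturbation @ tangent:+.4f}")
print(f"normal component : {perturbation @ normal:+.4f}")
```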

Recent results on improved prediction of the phase resetting curve, based on Floquet exponents computed in the limit cycle's moving reference frame, allow me to extend this approach to high-dimensional models.

The next spatial scale of my research focuses on mathematical and computational modeling of time perception. I collaborate with Dr. C. Buhusi, currently at Utah State University, on interval timing in mice to refine a neural network model of time perception called the striatal beat frequency (SBF) model.
Peak Interval Timing Procedure. Mice/rats are conditioned to press a lever and receive a reward at or shortly after a given duration, e.g., 30 s. After successful training, the animals are presented with the same conditioning stimulus (light or sound cue) but no reward. In the absence of the reward, the animals press the lever and wait for the reward. The distribution of lever presses is bell-shaped and centered on the trained duration, e.g., 30 s.
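A minimal sketch of this behavioral readout, assuming (purely for illustration) Gaussian trial-to-trial variability around the trained criterion; the values are not fitted to the animal data.

```python
import numpy as np

# Simulate peak-interval responses around a 30 s criterion with scalar
# (Weber-like) variability and locate the peak of the response curve.

rng = np.random.default_rng(0)
criterion = 30.0                    # trained duration, in seconds
n_trials = 5000

# Standard deviation proportional to the timed duration (15% illustrative CV)
peak_times = rng.normal(loc=criterion, scale=0.15 * criterion, size=n_trials)

counts, edges = np.histogram(peak_times, bins=np.arange(0, 61, 2))
centers = 0.5 * (edges[:-1] + edges[1:])
print(f"response curve peaks near {centers[np.argmax(counts)]:.0f} s "
      f"(trained at {criterion} s)")
```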

Hippocampus (HIP) Lesions and Time Scale Invariance. We proved mathematically and through numerical simulations that interval timing should remain scalar after HIP lesions, within reasonable experimental limits. However, we also predicted nonlinear responses that depend on the type and intensity of biological noise. Aft T*, Oprisan SA, Buhusi CV (2021) Is the scalar property of interval timing preserved after hippocampus lesions? Journal of Theoretical Biology, 516: 110605 (DOI: 10.1016/j.jtbi.2021.110605).

Hippocampus Time Cells and Interval Timing. We modeled the recently discovered HIP time cells and proved mathematically and through numerical simulations that a population model with diffusive coupling can accurately learn and reproduce arbitrary durations. Oprisan SA, Buhusi M, Buhusi CV (2018) A population-based model of the temporal memory in the hippocampus, Frontiers in Neuroscience, 12: 521 (DOI: 10.3389/fnins.2018.00521).

Topological Organization of Hippocampus. We proved mathematically and through numerical simulations that the peak of the interval timing response should shift according to the dorsal/ventral location of the HIP lesion. Oprisan SA, Aft T*, Buhusi M, Buhusi CV (2017) Scalar timing in memory: A temporal map in the hippocampus, Journal of Theoretical Biology, 438: 133-142 (DOI: 10.1016/j.jtbi.2017.11.012).

Critical Role of Noise in Interval Timing Network
We found theoretically and verified numerically that biological noise is essential for preserving the scalar property of interval timing. Scalar invariance means that recalling longer time intervals is proportionally less precise, a property known as Weber's law. Oprisan SA and Buhusi CV (2014) What is all the noise about in interval timing? Philosophical Transactions of the Royal Society B: Biological Sciences, 369(1637): 20120459 (DOI: 10.1098/rstb.2012.0459).
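A minimal numerical illustration of this time-scale invariance, with an arbitrary 15% Weber fraction standing in for the experimentally measured value: response curves generated with noise proportional to the timed duration superimpose when time is rescaled by the criterion.

```python
import numpy as np

# Generate noisy timing responses for three criterion durations and check
# that the histograms nearly coincide after rescaling time by the criterion.

rng = np.random.default_rng(1)
weber = 0.15                                   # illustrative Weber fraction
bins = np.linspace(0.0, 2.0, 41)               # relative time t / criterion

rescaled = {}
for T in (10.0, 30.0, 90.0):                   # criterion durations (s)
    responses = rng.normal(T, weber * T, size=50000)
    rescaled[T], _ = np.histogram(responses / T, bins=bins, density=True)

for T in (30.0, 90.0):
    diff = np.abs(rescaled[T] - rescaled[10.0]).max()
    print(f"max difference between T={T:.0f}s and T=10s rescaled curves: {diff:.3f}")
```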
We investigated two sources of noise/error that affect time perception: variability in the frequencies of the cortical oscillators (the clock) and variability in the stored criterion time (the memory).



Phase Resetting Mechanism in Interval Timing Neural Networks
We identified a cellular-level mechanism for timing with intruder events (stop/reset) that uses phase resetting and postinhibitory rebound. Oprisan SA, Dix S*, and Buhusi CV (2014) Phase resetting and its implications for interval timing with intruders, Behavioural Processes, 111: 146-153 (DOI: 10.1016/j.beproc.2013.09.005).

Noise can Identify Timing Network Structure
We proved mathematically that the memory variance produces a skewed bell-shaped response curve similar to experimental observations. This means that subjects are more likely to overestimate than underestimate the elapsed time between events. Oprisan SA and Buhusi CV (2013) How noise contributes to time-scale invariance of interval timing, Physical Review E, 87(5): 052717 (DOI: 10.1103/PhysRevE.87.052717).
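The direction of the skew can be illustrated with a toy assumption about how memory noise enters the readout; the reciprocal-Gaussian memory factor below is a hypothetical choice made only for this sketch, not the specific noise model of the paper.

```python
import numpy as np

# Toy illustration: if the produced time is the criterion divided by a
# multiplicative memory factor k ~ N(1, sigma), the response distribution
# is right-skewed, i.e., elapsed time is more often overestimated.

rng = np.random.default_rng(2)
T, sigma, n = 30.0, 0.15, 200000               # hypothetical parameters
k = rng.normal(1.0, sigma, size=n)
k = k[np.abs(k) > 0.3]                         # guard against tiny factors
responses = T / k

mean, median = responses.mean(), np.median(responses)
skew = ((responses - mean) ** 3).mean() / responses.std() ** 3
print(f"mean {mean:.2f} s > median {median:.2f} s, skewness {skew:+.2f} (right-skewed)")
```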

Modeling the Pharmacology of Interval Timing
Our SBF model correctly predicts the compressed temporal horizon induced by amphetamines, which leads, for example, to temporal discounting, i.e., the preference for smaller, sooner rewards over larger, later rewards. Oprisan SA and Buhusi CV (2011) Modeling pharmacological clock and memory patterns of interval timing in a striatal beat-frequency model with realistic, noisy neurons. Frontiers in Integrative Neuroscience, 5: 52 (DOI: 10.3389/fnint.2011.00052).

Striatal Beat Frequency Model (SBF). In the SBF model, durations are encoded by the pattern of coincidences among cortical oscillators and read out by striatal detector neurons. We proved mathematically and through numerical simulations that interval timing in the SBF model remains scalar within reasonable experimental limits, and we tested the validity of the model's predictions experimentally under normal and pharmacologically manipulated time perception conditions.
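A stripped-down sketch of the SBF coincidence-detection idea follows; the oscillator frequencies and the noise-free cosine readout are illustrative simplifications, not the realistic, noisy neurons used in our simulations.

```python
import numpy as np

# SBF sketch: a bank of cortical oscillators is reset at trial onset, the
# phase pattern present at the criterion time is stored, and an output unit
# later responds in proportion to how well the current pattern matches it.

rng = np.random.default_rng(3)
freqs = rng.uniform(5.0, 15.0, size=60)        # oscillator frequencies (Hz)
criterion = 30.0                               # trained duration (s)
t = np.linspace(0.0, 60.0, 6001)               # probe times (s)

osc = np.cos(2.0 * np.pi * freqs[:, None] * t[None, :])   # oscillator states
stored = np.cos(2.0 * np.pi * freqs * criterion)          # memorized pattern

output = stored @ osc / len(freqs)             # coincidence-detector readout
print(f"output peaks at t = {t[np.argmax(output)]:.1f} s (criterion {criterion} s)")
```

In this toy version the readout peaks sharply at the criterion; the biological variability and noise discussed above are what broaden it into the bell-shaped, scalar response curve observed experimentally.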

My theoretical/computational research at the cell and network levels of neuroscience, along with recent results from my collaboration with Dr. Buhusi, places my lab at the forefront of computational modeling of timing. However, many challenging questions persist.

In collaboration with the electrophysiology laboratory of Dr. Lavin at MUSC, we are developing a new mathematical model of pyramidal cells based on medial prefrontal cortex (mPFC) recordings.

Since 2005, the novel technique of optogenetics has given neuroscientists the ability to control the activity of neurons and neural networks with light, by inserting into neurons genes that confer light responsiveness. The mPFC contains tens of thousands of neurons, each described by complex nonlinear equations. As a result, we would expect a mathematical model with very many variables.

We used nonlinear dynamics tools and found that the complex dynamics of the mPFC in response to light stimuli can be embedded in a three-dimensional space. This unexpected result (given the vast number of degrees of freedom of the system) opens the possibility of deriving a low-dimensional phenomenological model of the mPFC.
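The flavor of this analysis can be sketched with delay-coordinate embedding. A Lorenz trace stands in for the mPFC recordings (which are not reproduced here), and the lag and embedding dimension below are illustrative choices rather than the values used in our study.

```python
import numpy as np

# Build delay coordinates from a scalar trace and inspect how many singular
# values of the trajectory matrix carry most of the variance, i.e., whether
# the signal is compatible with a low-dimensional attractor.

def lorenz_trace(n=40000, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    s = np.array([1.0, 1.0, 1.0])
    xs = np.empty(n)
    for i in range(n):                          # simple Euler integration
        x, y, z = s
        s = s + dt * np.array([sigma * (y - x), x * (rho - z) - y,
                               x * y - beta * z])
        xs[i] = s[0]
    return xs

x = lorenz_trace()
delay, dim = 20, 8                              # embedding lag and trial dimension
rows = len(x) - delay * (dim - 1)
embedded = np.column_stack([x[i * delay: i * delay + rows] for i in range(dim)])

u, sv, vt = np.linalg.svd(embedded - embedded.mean(axis=0), full_matrices=False)
explained = np.cumsum(sv ** 2) / np.sum(sv ** 2)
print("cumulative variance of the first singular directions:",
      np.round(explained[:4], 3))
```

If only a few singular directions explain most of the variance, a low-dimensional phenomenological model of the recorded dynamics is plausible.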

Our results are relevant because the gamma rhythm of the brain (25 Hz – 100 Hz) involves the reciprocal interaction between interneurons, mainly fast-spiking parvalbumin-positive (FS PV+) interneurons, and the principal cells that we record from. Gamma oscillations show strong coherence across different brain areas during associative learning and successful recollection, and they are also essential for memory maintenance.

Our study could also improve treatments for neuropathologies related to gamma rhythm dysfunction, such as cognitive inflexibility, schizophrenia, and autism.

Several challenges remain open in this direction.

In collaboration with the laboratory of Drs. Priyattam J. Shiromani and Carlos Blanco-Centurion at MUSC, we explored the architecture of neural networks involved in the wake-sleep cycle using calcium imaging.

Hypothalamus Subnetworks from Correlated Calcium Fluorescence. We found that the cross-correlation of calcium fluorescence signals reveals distinct neural subnetworks active during exploration and during REM sleep. Some neurons are active during both states, which indicates a common/overlapping function for those neurons. Blanco-Centurion C, Luo S, Spergel DJ, Vidal-Ortiz A, Oprisan SA, Van den Pol AN, Liu M, and Shiromani PJ (2019) Dynamic Network Activation of Hypothalamic MCH Neurons in REM Sleep and Exploratory Behavior, Journal of Neuroscience (DOI: 10.1523/JNEUROSCI.0305-19.2019).
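A minimal sketch of the correlation analysis, with synthetic traces standing in for the calcium imaging data: pairwise correlations of fluorescence traces define a functional adjacency matrix, and thresholding it groups neurons into putative subnetworks.

```python
import numpy as np

# Correlate synthetic dF/F-like traces, threshold the correlation matrix,
# and group neurons into connected components of the resulting graph.

rng = np.random.default_rng(4)
n_neurons, n_frames = 30, 2000
shared = rng.standard_normal((2, n_frames))          # two hidden "behaviors"
membership = rng.integers(0, 2, size=n_neurons)      # driver each cell follows
traces = shared[membership] + 0.8 * rng.standard_normal((n_neurons, n_frames))

corr = np.corrcoef(traces)                           # pairwise correlations
adjacency = (corr > 0.3) & ~np.eye(n_neurons, dtype=bool)

labels = -np.ones(n_neurons, dtype=int)              # connected components
for seed in range(n_neurons):
    if labels[seed] >= 0:
        continue
    group, stack = labels.max() + 1, [seed]
    while stack:
        i = stack.pop()
        if labels[i] < 0:
            labels[i] = group
            stack.extend(np.flatnonzero(adjacency[i] & (labels < 0)))

print("subnetwork sizes:", np.bincount(labels))
```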

Biomedical Imaging. Bio-signal processing and brain-computer interfaces have recently moved to the forefront of biomedical research as effective tools for interfacing prosthetic devices controlled by our brains. My computational lab uses a series of biomedical devices for current research projects: we explore computational algorithms that use EEG signals to control, for example, the movement of a cursor on a computer screen and to type text. I intend to use the insight gained from the large-scale modeling of interval timing neural networks to investigate the feasibility of a hardware implementation of dedicated neural networks that could be interfaced with our brain.
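As one hedged example of the kind of signal processing involved, the sketch below computes band-power features from a short EEG window; the synthetic trace, band limits, and the mapping to cursor control are illustrative assumptions, not the lab's specific devices or algorithms.

```python
import numpy as np

# Compute power in standard EEG bands from a 2-second window; such band
# powers are a common feature for driving a cursor in a brain-computer
# interface.

fs = 256.0                                     # sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)                # 2-second analysis window
rng = np.random.default_rng(5)
eeg = (20e-6 * np.sin(2 * np.pi * 10.0 * t)    # strong alpha component
       + 5e-6 * rng.standard_normal(t.size))   # broadband noise

def band_power(signal, fs, lo, hi):
    """Integrate the windowed periodogram of `signal` between lo and hi Hz."""
    window = np.hanning(signal.size)
    spectrum = np.abs(np.fft.rfft(signal * window)) ** 2
    freqs = np.fft.rfftfreq(signal.size, 1.0 / fs)
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].sum()

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 80)}
powers = {name: band_power(eeg, fs, lo, hi) for name, (lo, hi) in bands.items()}
strongest = max(powers, key=powers.get)
print(f"dominant band in this window: {strongest}")  # could be mapped to a control signal
```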

Professional Development

Research Funding for Undergraduate Computational Neuroscience at the College of Charleston

I received a CAREER award from the National Science Foundation
"CAREER: The Faculty Early Career Development (CAREER) Program is a Foundation-wide activity that offers the National Science Foundation's most prestigious awards in support of junior faculty who exemplify the role of teacher-scholars through outstanding research, excellent education and the integration of education and research within the context of the mission of their organizations."
Award #1054914
CAREER: Prediction of Synchrony and Phase-Locked Modes in Neural Networks based on Stimulus Time Resetting Curve
ABSTRACT
Nervous systems are composed of complex networks of neurons, which involve chemical and electrical signal lines. These neural networks send signals within the brain and to muscles, controlling bodily actions such as walking and breathing. Although each neuron is itself complex, a given cell can be modeled as a pacemaker that fires at regular intervals in order to study firing patterns in large neural networks. When a neuron receives signals or inputs from other neurons, it maps them into measurable changes of its firing activity or signaling rate and passes this output to the next neuron. This research aims to understand how the input-output relation, or resetting curve, of individual neurons and their coupling can generate complex firing patterns in large networks. New computer algorithms will be developed for classifying and storing resetting curves generated by model and experimental neurons. The resetting curves will be used to predict the spectrum of possible firing patterns of neural networks made of such neurons. The stability of the numerically predicted firing patterns and the mechanisms leading to firing mode switches will be tested experimentally.
The broader impacts of this project include possible new solutions for predicting the emergent coherent firing pattern and its stability, at the network level, based on the resetting curves of individual neurons. Studying how the nervous system processes and responds to external stimuli will aid our understanding of how neural networks function, including the human nervous system and its related disorders. Educationally, the project will attract undergraduate students into the interdisciplinary field of computational neuroscience. A seminar in Computational Biology for incoming first-year students coupled with an upper-level class exploring computational models of neurons and their networks will offer a mentored pathway to related careers. Undergraduate students will also benefit from in-depth research experience throughout the project.