My research is in computational neuroscience, an interdisciplinary approach to analyzing, modeling, and understanding cells and neural networks in the brain.
Computational neuroscience is an interdisciplinary science that links neuroscience, cognitive science, and psychology with electrical engineering, computer science, mathematics, and physics. It is grounded in neurobiology and models biological systems at multiple spatio-temporal scales, from membrane currents, protein and chemical coupling, to network oscillations, columnar and topographic architecture, and learning and memory. The field focuses on the neurobiology of brain function and differs significantly from engineering approaches such as machine learning, neural networks, and statistical learning theory. Eric L. Schwartz first used the term "computational neuroscience" in 1985 to summarize work done in neural modeling, brain theory, and neural networks.
One of the main themes of my research in computational neuroscience is deriving mathematical criteria for neural network stability using the phase resetting curve method. There are billions of neurons in our brain, with different morphologies and functions, expressing different ionic channels. Neurons are connected in neural networks and communicate using electrical and chemical signals. Despite this variation, neural networks that perform the same function, e.g., the circadian rhythm network, exist across different species.
Neural networks can generate stable, phase-locked rhythms that are essential for biological processes. Synchrony, the best-known example of phase-locked activity in neural networks, is the foundation of complex biological phenomena such as memory, facial recognition, circadian rhythms, and epileptic seizures.
A simplified yet robust modeling approach to predicting phase-locked modes assumes that individual neurons are characterized only by the amount of delay or advance induced in their firing frequency by external perturbations, the so-called phase resetting curve (PRC) theory. The stable oscillatory activity of a neuron is represented in the space of its state variables, such as the membrane voltage or the ionic conductances, by a closed trajectory called a limit cycle. During one cycle of activity, the figurative point that describes the neuron's state moves along the entire limit cycle with a velocity that is always tangent to the trajectory.
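The PRC concept can be illustrated numerically. The sketch below is my own illustrative example, not a model from this research: it estimates the PRC of a leaky integrate-and-fire neuron by kicking the membrane voltage at different phases of the cycle and measuring the resulting advance of the next spike.

```python
import numpy as np

def lif_prc(I=1.5, dV=0.05, n_phases=20, dt=1e-4):
    """Estimate the phase resetting curve (PRC) of a leaky
    integrate-and-fire neuron dV/dt = I - V (threshold 1, reset 0).
    A brief depolarizing kick dV is applied at each test phase and
    the normalized advance of the next spike is recorded."""
    T = np.log(I / (I - 1.0))  # unperturbed period: V(t) = I(1 - e^{-t}) hits 1 at T

    def next_spike_time(phase_fraction):
        v, t, kicked = 0.0, 0.0, False
        while v < 1.0:
            if not kicked and t >= phase_fraction * T:
                v += dV            # instantaneous voltage perturbation
                kicked = True
            v += dt * (I - v)      # Euler step of the membrane equation
            t += dt
        return t

    # stay below phase ~0.9 so the kick never crosses threshold directly
    phases = np.linspace(0.05, 0.85, n_phases)
    prc = np.array([(T - next_spike_time(p)) / T for p in phases])
    return phases, prc

phases, prc = lif_prc()
```

For this model the PRC is everywhere positive (a depolarizing kick always advances the next spike) and grows with the phase at which the kick arrives.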
I investigated the stability of neural networks against external perturbations and proposed new analytical and computational techniques in this field. There are many reliable PRC predictors for perturbations tangent to the limit cycle, which instantly move the figurative point forward or backward along the steady phase-space trajectory, including some developed by our research group. The effect of perturbations normal to the limit cycle, which move the figurative point off the unperturbed phase-space trajectory, is much harder to predict.
I applied a unitary transform to the moving reference frame of the figurative point; for planar limit cycles, this leaves only two possible directions relative to the unperturbed trajectory, i.e., normal or tangent. As a result, there are four possible combinations of perturbation direction and effect. I found that a tangent perturbation produces no normal displacement and analytically determined the other three coupling parameters. Our previous results confirm the tangential effect (speeding up or slowing down of neural oscillators) in response to tangent perturbations. Significantly, these results apply to arbitrary stimulus shapes.
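The tangent/normal decomposition for a planar limit cycle can be sketched as follows; the van der Pol oscillator and the projection code are illustrative choices of mine, not the models used in the published work.

```python
import numpy as np

def decompose_perturbation(x, y, perturbation, mu=1.0):
    """Project a perturbation applied at point (x, y) of a planar limit
    cycle onto the moving (tangent, normal) reference frame. The tangent
    is the unit vector of the flow; the normal is its 90-degree rotation."""
    f = np.array([y, mu * (1.0 - x**2) * y - x])  # van der Pol vector field
    t_hat = f / np.linalg.norm(f)                 # unit tangent to the trajectory
    n_hat = np.array([-t_hat[1], t_hat[0]])       # unit normal (rotate +90 deg)
    p = np.asarray(perturbation, dtype=float)
    return p @ t_hat, p @ n_hat                   # tangent and normal components
```

A perturbation directed along the flow has zero normal component, and the two components always recover the perturbation's full magnitude, since the moving frame is orthonormal.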
Recent results on improved prediction of the phase resetting curve, based on the Floquet exponents of the limit cycle in a moving reference frame, allow me to explore high-dimensional models.
Could the coupling matrix be reduced to a diagonal form in higher dimensions, as we showed is the case in two dimensions?
How can the mathematical/analytical prediction be extended to arbitrary stimulus shapes?
Can the convolution-based approach to the PRC and the infinitesimal PRC be merged into a coherent mathematical framework for all types of stimuli?
The next spatial scale of my research focuses on mathematical and computational modeling of time perception. I collaborate with Dr. C. Buhusi, currently at Utah State University, on interval timing in mice to refine a neural network model of time perception called the striatal beat frequency (SBF) model.
Peak Interval Timing Procedure.
Mice/rats are conditioned to press a lever and receive a reward at or shortly after a given duration, e.g., 30 s. After successful training, the animals are presented with the same conditioning stimulus (light or sound cue) but no reward. In the absence of the reward, the mice/rats press the lever and wait for the reward. The distribution of lever presses is bell-shaped and centered on the trained duration, e.g., 30 s.
Striatal Beat Frequency Model (SBF).
We proved mathematically and through numerical simulations that interval timing should remain scalar after hippocampus (HIP) lesions within reasonable experimental limits. We tested the validity of our SBF model's predictions experimentally under normal and pharmacologically manipulated time perception conditions.
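The core idea behind the SBF model, namely oscillators of different frequencies read out by a coincidence detector tuned to the reinforced time, can be sketched in a few lines. The frequency band, oscillator count, and cosine readout below are illustrative assumptions of mine, not the calibrated published model.

```python
import numpy as np

def sbf_output(t, criterion=30.0, n_osc=200, seed=0):
    """Toy striatal beat frequency (SBF) readout: cortical oscillators
    with random frequencies are monitored by a coincidence detector
    whose weights store each oscillator's phase at the reinforced
    (criterion) time, so the readout peaks near the criterion."""
    rng = np.random.default_rng(seed)
    freqs = rng.uniform(5.0, 12.0, n_osc)            # Hz, illustrative band
    weights = np.cos(2 * np.pi * freqs * criterion)  # phases at criterion time
    phases_now = np.cos(2 * np.pi * np.outer(freqs, t))
    return weights @ phases_now / n_osc              # normalized coincidence score

t = np.linspace(0.0, 60.0, 6001)
out = sbf_output(t)
```

All oscillators realign with their stored phases only near the criterion time, so the coincidence score forms a single bump centered on the trained duration, qualitatively matching the bell-shaped lever-press distribution from the peak interval procedure.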
My theoretical/computational research at the cell and network levels of neuroscience, along with recent results from my collaboration with Dr. Buhusi, places my lab at the forefront of computational modeling of timing. However, many challenging questions persist.
How are resources allocated in a finite system (the brain)?
How are the resources allocated between sub-networks?
Is there a limit to the number of time intervals one can discriminate?
Is temporal discrimination based on a fixed number of neural oscillators for the entire cortico-striatal network, or is that number determined dynamically by the required quality of temporal discrimination?
In collaboration with the electrophysiology laboratory of Dr. Lavin at MUSC, we are developing a new mathematical model of pyramidal cells using medial prefrontal cortex (mPFC) recordings.
Since 2005, the novel technique of optogenetics has given neuroscientists control over the activity of neurons and neural networks using light, by inserting into neurons genes that confer light responsiveness. The mPFC contains tens of thousands of neurons, each described by complex nonlinear equations. As a result, we would expect a mathematical model with many variables.
We used nonlinear dynamics tools and found that the complex dynamics of mPFC in response to light stimuli could be embedded in a three-dimensional space. This unexpected result (given the vast number of degrees of freedom of the system) opens the possibility of deriving a low-dimensional phenomenological model of mPFC.
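A standard tool for probing such low-dimensional structure is delay-coordinate (Takens) embedding of a single recorded variable. The sketch below is illustrative, not our analysis pipeline; the embedding dimension and delay are assumed values.

```python
import numpy as np

def delay_embed(x, dim=3, tau=25):
    """Delay-coordinate (Takens) embedding of a scalar time series:
    row t of the result is (x[t], x[t + tau], ..., x[t + (dim-1)*tau]).
    This reconstructs a low-dimensional attractor from one recorded
    variable, e.g. a single membrane-potential trace."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# example: a noiseless sine embeds as a closed loop in three dimensions
time = np.linspace(0.0, 20.0 * np.pi, 4000)
emb = delay_embed(np.sin(time), dim=3, tau=25)
```

If the underlying dynamics are genuinely low-dimensional, the embedded trajectory fills only a thin subset of the reconstruction space, which is the kind of evidence that supports a three-dimensional phenomenological model.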
Our results are relevant because the gamma rhythm of the brain (25–100 Hz) involves the reciprocal interaction between interneurons, mainly fast-spiking parvalbumin-positive (FS PV+) interneurons, and the principal cells that we record from. Gamma oscillations show strong coherence across different brain areas during associative learning and successful recollection. Gamma oscillations are also essential for memory maintenance.
Our study could also improve treatment for different neuropathologies related to gamma rhythm dysfunctions (cognitive inflexibility, schizophrenia, and autism).
Some of the remaining challenges are:
Can we combine nonlinear dynamics and bootstrap methods to infer a low-dimensional mathematical model?
Would multi-electrode recordings enhance the predictive power of the model?
Why does variance in memory tasks decrease under amphetamines?
In collaboration with the laboratory of Drs. Priyattam J. Shiromani and Carlos Blanco-Centurion at MUSC, we explored the architecture of neural networks involved in the wake-sleep cycle through calcium imaging.
Biomedical Imaging. Bio-signal processing and brain-computer interfaces have recently moved to the forefront of biomedical research as effective tools for interfacing prosthetic devices controlled by our brains.
My computational lab uses a series of biomedical devices for current research projects:
BIOPAC MP 150 Data Acquisition system for ECG/EEG
Emotiv EPOC wearable EEG
BIOPAC Functional Near Infrared Spectroscopy
Tobii Eye Tracking Device
We explore computational algorithms that use EEG signals to control, for example, the movement of a mouse cursor on a computer screen and to type text. I intend to use the insight gained from large-scale modeling of interval timing neural networks to investigate the feasibility of a hardware implementation of dedicated neural networks that could be interfaced with our brain.
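The feature-extraction step of such an algorithm can be sketched as follows. The band limits, the synthetic signal, and the alpha-versus-beta control rule are illustrative assumptions of mine, not our lab's actual pipeline.

```python
import numpy as np

def bandpower(signal, fs, f_lo, f_hi):
    """Mean periodogram power of `signal` in the band [f_lo, f_hi] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[band].mean()

fs = 256                              # Hz, a common EEG sampling rate
t = np.arange(fs) / fs                # one second of data
eeg = np.sin(2 * np.pi * 10 * t)      # synthetic 10 Hz "alpha" burst
alpha = bandpower(eeg, fs, 8.0, 12.0)
beta = bandpower(eeg, fs, 13.0, 30.0)
move_cursor = alpha > beta            # toy rule: move while alpha power dominates
```

In a real interface this boolean would be replaced by a calibrated classifier, but the band-power comparison captures the basic signal-to-command mapping.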
Professional Development
Review Editor, Frontiers in Cellular Neuroscience (2020–)
Editor-in-Chief, WSEAS Transactions on Systems and Control (two-year term beginning May 2014)
Editor, CUR Quarterly (the official journal of the Council on Undergraduate Research)
Reviewer: Journal of Computational Neuroscience, Mathematical Biology, Theoretical Biology, Biophysical Journal, Neurocomputing, Journal of Neurophysiology, and BMC Neuroscience
CUR Physics Program Review chair (2013)
NSF grant reviewer
Created teaching videos for intro/general physics labs, posted on Kaltura since 2015
I received a CAREER award from the National Science Foundation.
"CAREER: The Faculty Early Career Development (CAREER) Program is a Foundation-wide activity that offers the National Science Foundation's most prestigious awards in support of junior faculty who exemplify the role of teacher-scholars through outstanding research, excellent education and the integration of education and research within the context of the mission of their organizations."
Award #1054914 CAREER: Prediction of Synchrony and Phase-Locked Modes in Neural Networks based on Stimulus Time Resetting Curve
ABSTRACT
Nervous systems are composed of complex networks of neurons, which involve chemical and electrical signal lines. These neural networks send signals within the brain and muscles controlling bodily actions such as walking and breathing. Although each neuron is also complex, a given cell can be modeled as a pacemaker that fires at regular intervals to study firing patterns in large neural networks. When a neuron receives signals or inputs from other neurons, it maps them into measurable changes of its firing activity or signaling rate. It passes this output to the next neuron. This research aims to understand how the input-output, or resetting, curve of individual neurons and their coupling can generate complex firing patterns in large networks. New computer algorithms will be developed for classifying and storing resetting curves generated by the model and experimental neurons. The resetting curves will predict the spectrum of possible firing patterns of neural networks made of such neurons. The stability of numerically predicted firing patterns and the mechanisms leading to the firing mode switch will be experimentally tested.
The broader impacts of this project include possible new solutions for predicting the emergent coherent firing pattern and its stability, at the network level, based on the resetting curves of individual neurons. Studying how the nervous system processes and responds to external stimuli will aid our understanding of how neural networks function, including the human nervous system and its related disorders. Educationally, the project will attract undergraduate students into the interdisciplinary field of computational neuroscience. A seminar in Computational Biology for incoming first-year students coupled with an upper-level class exploring computational models of neurons and their networks will offer a mentored pathway to related careers. Undergraduate students will also benefit from in-depth research experience throughout the project.