Duncan NRI
Matthew J. McGinley, Ph.D.
Get to know Matthew J. McGinley, Ph.D.
Each moment, our senses are bombarded with information from many sources. How do networks of neurons in the brain rapidly process this information to make sense of the world and choose appropriate actions?
The McGinley lab approaches this question by studying the neural mechanisms of auditory perceptual decision-making behaviors in mice. We primarily use whole-cell recording and two-photon imaging in head-fixed mice while they perform auditory perceptual decision-making tasks. The lab also uses computational and engineering approaches, optogenetics, multi-channel extracellular recording, pupillometry, and histological methods. We are currently focused on three related projects on the cellular, synaptic, and neuromodulatory mechanisms of auditory perceptual decision making:
Improved perceptual learning with peripheral nerve stimulation
The vagus nerve exerts widespread parasympathetic control over the body. Activation of arousal-related brain centers through afferent ('backward') stimulation of the vagus nerve has long been used to control seizures in epilepsy patients, and has more recently begun to be used as a therapy for a wide array of brain disorders, including tinnitus and depression. Despite its widespread and growing use, the neural mechanisms by which vagus nerve stimulation improves brain activity patterns are largely unknown. As part of the Targeted Neuroplasticity Training (TNT) program funded by the Defense Advanced Research Projects Agency (DARPA), the McGinley lab is working to elucidate the neuromodulatory mechanisms in the auditory cortex by which vagus nerve stimulation may alter brain state in a manner that facilitates sensory learning. In addition, the McGinley lab is testing whether vagus nerve stimulation can fend off seizure activity – and ameliorate the associated regression in learning and cognition – in an animal model of infantile spasms, a severe form of childhood epilepsy.
Neuromodulatory mechanisms of attentional effort
The internal state of the brain – such as arousal and stress level – is not constant; it fluctuates continuously and sometimes rapidly. These fluctuations impact our ability to perform behavioral tasks, particularly when a task is challenging, and we expend great effort to overcome these internal changes and meet behavioral demands. Attending to sounds in noisy environments – the 'cocktail party problem' – is a notoriously difficult listening task, particularly for individuals with hearing loss. The increased 'listening effort' that accompanies hearing loss is known to cause stress and exhaustion. However, the neural underpinnings of listening effort – and of attentional effort in general – are not well understood. For example, which neuromodulators (such as norepinephrine or acetylcholine) are involved? And how do these neuromodulators impact cortical processing of sounds during heightened effort and optimal performance? To address these questions, we are developing a mouse model of attentional/listening effort, in which we manipulate the effort level of the mice by varying the reward structure of a challenging listening task. Some mice are subjected to noise-induced hearing loss, allowing us to reveal the neural mechanisms these mice use to overcome hearing loss and maintain performance.
Brain circuits for navigation in acoustic virtual reality
Spatial cues are prominent in sounds and are important for animals to segregate sound sources, such as when isolating a single voice in a crowd or a predator in a forest. Degradation of auditory spatial perception is thought to contribute to the stress and strain experienced in noisy environments by individuals with hearing loss. Processing of spatial acoustic cues by the brainstem of animals in static environments has been studied intensively for decades. However, it is not known how these acoustic spatial cues are integrated and used by an animal while it moves through its environment. Inspired by recent seminal work in the visual system, we are developing a purely acoustic virtual reality environment that head-fixed rodents can 'navigate.' This approach opens a wide range of fundamental questions about spatial hearing – and navigation generally – to experimental investigation. Can animals navigate to rewarded locations using only auditory place cues? Is there plasticity in spatial processing by neurons when acoustically defined locations are rewarded? Which specific cues are necessary and sufficient? Does acoustically cued virtual navigation generate place fields in the hippocampus and grid fields in the entorhinal cortex? The McGinley lab is working to address these questions.