
Humans have a sophisticated somatosensory system (a system of body sensors) that performs a variety of tasks we take for granted. Not only does it provide our motor system with the feedback that enables fine motor control, it also connects our brains to the outside world, giving us the capacity to explore our environment and to enjoy physical connection with others.
While there have been outstanding recent advances in motor prosthetics that can read brain signals and translate them into complex movements of a robotic arm, the loss of somatosensation limits the success of these motor prostheses. We have yet to work out how to close the loop with sensory feedback, which is needed to inform the brain of the positional and textural information that drives our motor system to react appropriately. However, even with motor function intact, those who lose somatosensation (for example, due to spinal cord injury or nerve damage) become disconnected from others, including loved ones. Imagine what it must be like to shake someone’s hand, or to pick up a small child, after your hand is replaced with a mechanical device; not only would you have poor grip, but you would miss out on the warmth and softness of human contact.
The focus of our research is to “close the loop” on sensorimotor control. We use a combination of electrophysiology, signal processing and machine learning in small animal models to discover how sensory information is coded in different parts of the central nervous system. We are investigating whether, in certain brain regions, lost sensory information could be replaced with electrical inputs from a prosthetic device (a neural prosthesis). More information can be found here: https://doi.org/10.3389/fnins.2020.00156
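To give a flavour of the signal-processing side of this kind of work, the short sketch below shows one common way of pulling spiking activity out of a raw extracellular voltage trace: band-pass filtering followed by threshold-crossing detection. Everything in it (the sampling rate, filter band, threshold rule and the synthetic trace itself) is an illustrative assumption rather than our actual analysis pipeline.

```python
# Minimal sketch: extract putative spikes from a synthetic extracellular trace.
# All parameters are illustrative assumptions, not the lab's real settings.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 30_000                               # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)             # 1 s of synthetic recording
rng = np.random.default_rng(0)
raw = rng.normal(0, 10e-6, t.size)        # noise floor (~10 uV SD, in volts)
for s in np.arange(100, t.size, 3000):    # add brief fake "spikes" for illustration
    raw[s:s + 10] -= 80e-6

# Band-pass 300-6000 Hz to separate spiking activity from slow field potentials.
b, a = butter(3, [300, 6000], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, raw)

# Detect negative threshold crossings at ~4.5x a robust estimate of the noise SD.
noise_sd = np.median(np.abs(filtered)) / 0.6745
threshold = -4.5 * noise_sd
crossings = np.flatnonzero((filtered[1:] < threshold) & (filtered[:-1] >= threshold))
print(f"{crossings.size} putative spikes detected")
```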
One of our brain regions of interest is the dorsal column nuclei (DCN), where somatosensory information from the body is “summarised” before it is sent to higher brain regions for further processing. By stimulating the body and recording multiple electrical signatures in the DCN, we have been able to use machine learning to extract i) the location and ii) the quality of a sensory event experienced on the body. By reading out from the DCN, we can already successfully discriminate between somatosensory modalities, such as different tactile stimuli, as well as between changes in limb position.
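As a rough illustration of how such decoding can be framed, the sketch below trains simple classifiers to predict a stimulus’s location and quality from multichannel activity patterns. The synthetic features, channel counts and label sets are hypothetical stand-ins for real DCN recordings, and the linear classifier is just one of many decoders that could be used.

```python
# Minimal sketch: decode stimulus location and quality from per-channel activity.
# Features and labels are synthetic stand-ins for real DCN recordings.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_trials, n_channels = 400, 32              # trials x recording channels (assumed)
locations = rng.integers(0, 4, n_trials)    # 4 hypothetical body sites
qualities = rng.integers(0, 2, n_trials)    # e.g. tactile stimulus vs. limb movement

# Synthetic "spike count" features: each condition shifts channel responses,
# standing in for condition-dependent DCN activity.
X = rng.poisson(5.0, size=(n_trials, n_channels)).astype(float)
X += locations[:, None] * rng.normal(1.0, 0.2, n_channels)
X += qualities[:, None] * rng.normal(0.5, 0.2, n_channels)

decoder = make_pipeline(StandardScaler(), LinearSVC(dual=False))

# Separate decoders for "where" (location) and "what" (quality).
for name, y in [("location", locations), ("quality", qualities)]:
    acc = cross_val_score(decoder, X, y, cv=5).mean()
    print(f"{name} decoding accuracy: {acc:.2f}")
```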
Another key area of interest is how somatosensory information is processed and distributed within the brain and spinal cord. Pain, tactile and proprioceptive information are segregated within the spinal cord, but following injury, problems with cross-talk between these systems can arise. Our goal is to improve our understanding of how these different somatosensory sub-modalities are handled by the spinal cord, so that pain can be turned down without affecting the other somatosensations we need to function normally.