What are the unmined frontiers of human knowledge?
As an adolescent, Wilsaan Joiner was asked this question by his father, but in classic parental fashion, his dad already had a couple of answers in mind. Space exploration was one.
“I hate to fly, so that was out,” said Joiner. “And the other area he thought was understanding how the brain works. He strongly suggested that neuroscience was an area where there was a potential amount of room to explore and grow and really contribute.”
Today, Joiner finds himself completing his first six months on the faculty at UC Davis. An assistant professor in the Department of Neurobiology, Physiology and Behavior, he also holds an appointment in the School of Medicine’s Department of Neurology. His research explores sensorimotor integration: how sensory inputs, like vision, influence our motor actions and vice versa.
“We usually think of our senses as guiding our motor actions, but there are some parts of our brain that are actually using our motor actions to sort of guide our senses,” said Joiner. “Or at least, to combine them with our senses to form a better model or a better representation of ourselves in our environment.”
Another source of sensory information about the body's position in space is proprioception. Through his interest in athletics, Joiner started to wonder how athletes develop such fine motor skills and how those skills relate to the way their brains process sensory information.
“Through multiple sources of information, ballet dancers have a profound representation of their body in space and they’re able to make very intricate, precise movements of their body,” said Joiner. “How do you take this information and basically create these extremely fine motor movements with such precision in a very, very short amount of time?”
Colleagues in the Joiner Lab seek to understand the neural mechanisms that underlie these fine motor movements. And their research isn’t just to satisfy curiosity. Figuring out how healthy brains integrate sensory information to inform movement and orientation in space could aid the development of new treatments for those with schizophrenia and help improve prosthetic designs for amputees.
Origins of a neuroscientist
As an undergraduate at Saint Louis University in Missouri, Joiner studied biomedical engineering, becoming one of the university’s first students in what was then a newly launched program. After graduating in 2001, he continued his studies in the field at The Johns Hopkins University School of Medicine in Baltimore, Md., earning a Ph.D. in Biomedical Engineering in 2007. Postdoctoral positions at Harvard University’s John A. Paulson School of Engineering and Applied Sciences and the National Institutes of Health followed.
At the NIH, Joiner studied visual perception, eye movement control and predictive planning in non-human primates. He was particularly interested in how the brain is capable of seamlessly integrating visual information and creating a crisp picture, despite our eyes constantly moving.
“If our visual system is kind of like a camera, then these movements that we make should disrupt the visual input we get all the time,” he said. “Anyone who has seen shaky camera video footage knows how hard it is to watch and make sense of what they’re seeing.”
But healthy brains and eyes don't work that way. Under normal circumstances, the brain and eyes work in tandem to anticipate and compensate for these movements, creating a clear visual scene. During his research, Joiner found that if he temporarily silenced a group of neurons inside a part of the brain called the thalamus, healthy research subjects lost the ability to form a clear picture of their environment. The thalamus can be thought of as a relay station for sensory information before it travels to other parts of the brain. The finding could have implications for those affected by schizophrenia.
“One of the thoughts for schizophrenia is that you have the reduced ability to make a distinction between internal and external sources of stimuli or information,” said Joiner. “Building off that is the thought that visual perception deficits may be because they can have a difficult time distinguishing changes in the environment that are caused by their own movements, so caused by their eye movements, as opposed to actual changes in the external environment.”
Currently, Joiner is working on NIH-funded research at the UC Davis School of Medicine’s Imaging Research Center to explore whether the pathways he disrupted in non-human primates are similar to the pathways disrupted in the brains of humans with schizophrenia. The research could help explain why visual perception is disturbed in those afflicted with the condition.
Robotics to help amputees
Another research avenue Joiner is interested in exploring at UC Davis is how our brains create within us an understanding of where our limbs are in space. Inside his lab in the Life Sciences Building, Joiner and his colleagues have installed a robotic device that will help them study how people compensate for unexpected disruptions to limb movement. Controlled by research subjects via joystick-like handles, the robot collects information about how a subject moves their limbs under specific direction. It then introduces disruptions to those movements and records how participants respond. So far, Joiner and his colleagues have studied this exclusively in participants with both arms.
“The question we want to ask now at Davis is what happens if you’re actually missing a limb,” he said. “If they’re using some prosthetic, do they have a sense of the movements that they’re making even with their residuum?”
Joiner hopes to perform studies that will help answer this question so he and his team can gain insights into how amputees control their residual muscles.
“Based on that control, can we then use some insight from that to build better prosthetics?” he wondered. “We can also ask similar questions about the normal motor system. For example, coupling our robotics with immersive virtual reality may provide a tractable way to manipulate the sensory information and examine changes in control. In the virtual reality environment we could change the length of the limb segments and quantify how subjects adjust their control.”