Young engineer Sam Parker is a 2020 John Monash Scholar who wants to transform the field of brain-computer interfaces (BCIs).
Early in his teens, Sam Parker started to think about conditions such as motor neurone disease, amyotrophic lateral sclerosis and serious spinal injuries.
He thought of rugby league player Alex McKinnon, physicist Stephen Hawking, and an acquaintance who became paralysed following a bicycle accident: each a reminder that life can be randomly unkind to a body.
“There’s such an injustice in that, I’m living a relatively healthy life and these people can have their lives [impaired] by these terrible diseases or accidents,” Parker told create.
“I thought, how can I try and fix that gap?”
Parker was raised by a lawyer-teacher and a civil and environmental engineer. He grew up fascinated with technical subjects like space travel, was gifted in mathematics and physics, and was focused on a future involving problem-solving.
One problem continued to stand out.
He considered training as a surgeon to implant neural devices that could restore mobility. He learned that the problem was a little trickier than first imagined — and also that no such devices even existed, at least outside of cutting-edge experimental work.
“It seemed to me like there’s this breakdown in the signalling pathway, and when the electrical signal has a breakdown in its pathway, we just patch it in a different route. Put a patch cord in and just bypass the break,” he said of his early naivety.
“‘Why can’t we do the same thing with people’s spinal cord?’ was my initial thought. Turns out it’s a little more complicated. But that’s why I’m doing the research.”
Decoding messages
Parker is an electrical and electronics engineering honours student at the University of Newcastle. His project involves designing a wireless electroencephalogram (EEG) receiver that pairs with an EEG headset to measure action potentials, the encoded electrical signals travelling down a nerve pathway. The technology decodes those signals and turns them into movement.
“I’m trying … to decode those signals and work out what I’m trying to do,” he said of using brain activity to move a robotic hand.
“That’s involving a lot of analysis, signal processing, machine learning, and embedded systems work, with the eventual outcome being controlling a 3D-printed prosthetic hand.”
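Parker’s own code isn’t published here, but the pipeline he describes, filtering raw EEG, extracting features and mapping them to an intended movement, can be sketched in a few lines of Python. The channel count, sample rate, frequency band and classifier below are illustrative assumptions rather than details of his project.

```python
# Illustrative sketch only: a minimal EEG "intent" decoder of the kind described
# above. Channel count, window length and classifier choice are assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt, welch
from sklearn.linear_model import LogisticRegression

FS = 250            # sample rate in Hz (typical for consumer EEG headsets)
N_CHANNELS = 8      # hypothetical headset with 8 electrodes
WINDOW_S = 2.0      # decode on 2-second windows of data

def bandpass(eeg, low=8.0, high=30.0, fs=FS):
    """Band-pass filter each channel to the mu/beta band used for motor imagery."""
    sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg, axis=-1)

def band_power_features(eeg, fs=FS):
    """Average spectral power per channel in the 8-30 Hz band (one feature per channel)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)
    band = (freqs >= 8.0) & (freqs <= 30.0)
    return psd[:, band].mean(axis=1)

def train_decoder(windows, labels):
    """windows: (n_trials, n_channels, n_samples); labels: intended movement per trial."""
    X = np.array([band_power_features(bandpass(w)) for w in windows])
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, labels)
    return clf

def decode(clf, window):
    """Map one window of raw EEG to a predicted command, e.g. 'open' or 'close' the hand."""
    x = band_power_features(bandpass(window)).reshape(1, -1)
    return clf.predict(x)[0]
```

A working system would add artifact rejection, per-user calibration and an embedded implementation to drive the 3D-printed hand, which is where the embedded systems work Parker mentions comes in.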
Parker’s work brought him to the University of Pittsburgh’s Rehabilitation and Neural Engineering Laboratories — a site of excellence in neuro-prosthetic research — on a Global E3 exchange program in 2017.
It also earned him a 2020 John Monash scholarship to further his BCI work.
He plans to return to the US after finishing his honours in the middle of next year, with the Massachusetts Institute of Technology, the University of Pittsburgh and Brown University under consideration. Brown holds special appeal for him because it is part of the BrainGate collaborative project.
Last year, Brown announced that a team using microelectrode arrays implanted in the motor cortex of three clinical trial patients had enabled the tetraplegic participants to operate a Google Nexus 9 tablet just by “thinking about the movement of their own arm or hand”.
Planes and brains
Parker also qualified for a five-month internship at NASA’s Armstrong Flight Research Center at Edwards Air Force Base in 2018. He worked as a data analyst and test engineer on the team developing the X-57 Maxwell electric propulsion aircraft.
The main goal of the X-57 program is to share information with regulators to help develop certification approaches for battery-powered planes.
The retrofitted Tecnam P2006T featured distributed electric propulsion, with 14 electric motors: 12 high-lift motors along the leading edge of the wing and two large wingtip cruise motors. Parker was involved with testing the cruise motors.
“My role was to say, ‘Okay, here’s the test data. How can we cross-check it against the standards to make sure that we’re doing what we say we’re doing?’ It’s a lot of looking at all this data we’re recording,” he recalled.
“It was an exercise not only in my technical ability, in writing signal processing software, but also in how I communicate those findings in a way that a lot of people are going to understand, because a lot of engineers are from different backgrounds.”
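The kind of cross-check Parker describes, comparing recorded test data against the limits a standard allows, is conceptually simple even if the real datasets are not. The parameter names and limit values in this Python sketch are invented for illustration and are not actual X-57 requirements.

```python
# Illustrative only: comparing recorded motor test data against allowable limits.
# The parameters and limit values are invented, not actual X-57 requirements.
from dataclasses import dataclass

@dataclass
class Limit:
    name: str
    minimum: float
    maximum: float
    units: str

# Hypothetical limits a test report might have to demonstrate compliance with.
LIMITS = [
    Limit("winding_temperature", minimum=-20.0, maximum=150.0, units="degC"),
    Limit("shaft_speed", minimum=0.0, maximum=2700.0, units="rpm"),
    Limit("dc_bus_voltage", minimum=400.0, maximum=500.0, units="V"),
]

def check_against_limits(samples):
    """samples: dict mapping parameter name to a list of recorded values.
    Returns a human-readable finding for every value that left its allowed band."""
    findings = []
    for limit in LIMITS:
        for value in samples.get(limit.name, []):
            if not (limit.minimum <= value <= limit.maximum):
                findings.append(
                    f"{limit.name} = {value} {limit.units} outside "
                    f"[{limit.minimum}, {limit.maximum}] {limit.units}"
                )
    return findings
```

Feeding a logged test run through a check like this flags every sample that strayed outside its allowed band, which is the essence of “making sure that we’re doing what we say we’re doing”.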
The project won Parker his university’s Intern of the Year award and was a reminder of how important it is to be able to share his work with people from different professional backgrounds.
For a future that looks likely to involve awesomely complex, highly multidisciplinary work at the very edge of what’s possible, this is not a trivial lesson.
Thinking through the challenges
Learning how to read minds is a daunting project, but the benefit to humanity could be vast. One of the innumerable challenges to overcome is the variation between people, and even within the same person at different times.
It’s a royal headache, but Parker says bring it on.
“Your brain is very plastic and there’s a lot of variation between my brain and your brain, for example, or your brain and the person standing next to you. It’s not very easy to create a decoder that is going to work person to person, but also [for] a person to the same person four hours later,” he explained.
“Your brain will change slightly, which means that we need to adjust the decoding algorithm in almost real time. I need to have a time-variant decoding algorithm, which will be able to decode the brainwaves, hopefully without significant calibration, every four hours or so.”
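One common way to handle that kind of drift, though only one of several, is to update the decoder incrementally as new labelled data arrives rather than retraining it from scratch each session. The sketch below is a simplified illustration of that idea using scikit-learn’s partial_fit; it is not Parker’s algorithm, and the class labels are invented.

```python
# Illustrative sketch of an adaptive decoder that keeps learning over time,
# so it can follow the slow drift in a user's brain signals described above.
import numpy as np
from sklearn.linear_model import SGDClassifier

class AdaptiveDecoder:
    """Linear classifier updated incrementally instead of being fully retrained."""

    def __init__(self, classes):
        self.classes = np.array(classes)   # e.g. ["rest", "open", "close"]
        self.model = SGDClassifier()       # linear model trained by stochastic gradient descent

    def initial_fit(self, features, labels):
        """Calibrate once on an initial labelled session."""
        self.model.partial_fit(features, labels, classes=self.classes)

    def update(self, features, labels):
        """Fold in a small batch of fresh labelled data to track signal drift."""
        self.model.partial_fit(features, labels)

    def predict(self, features):
        """Predict the intended command for new feature vectors."""
        return self.model.predict(features)
```

A real adaptive decoder would also need a source of fresh labels or an unsupervised adaptation scheme, since a user who cannot move cannot easily supply ground truth; that is part of what makes the calibration problem Parker describes so difficult.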