First Portable Brain-Computer Interface to Control Wheelchairs, TVs, Computers
September 25, 2019
Brain-computer interfaces have the potential to give severely disabled people the ability to easily control their wheelchairs, televisions, and other devices. Existing technologies suffer from a number of limitations, though, making them impractical for real-world applications.
One is that non-invasive brain wave monitoring currently requires large, uncomfortable electroencephalography (EEG) caps with wet electrodes, wires, and associated adhesives. Donning such a rig is slow and cumbersome, a far cry from simply putting on a hat and having things work right away.
Now, researchers at Georgia Institute of Technology, University of Kent in the UK, and Wichita State University in Kansas have worked together to develop the first truly portable, comfortable, and wireless brain-computer interface. Already tested in six healthy human volunteers, the technology has clear potential to give patients who need it direct brain control of wheelchairs and other devices.
The system brings together flexible electronics, nanomembrane electrodes, and a deep learning algorithm to sense relevant brain waves and accurately translate their meaning. As with similar systems, the new brain-computer interface relies on classifying signals generated from visually evoked potentials as users look at a flashing screen.
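In such visually-evoked-potential interfaces, each on-screen target typically flickers at its own rate, and the system identifies which target the user is watching from the dominant frequency in the EEG. The article does not give the authors' signal-processing details, so the following is only a minimal sketch of that general idea; the sampling rate, flicker frequencies, and synthetic signal are all assumptions for illustration.

```python
import numpy as np

def classify_ssvep(eeg, fs, target_freqs):
    """Pick the flicker frequency with the strongest spectral power
    in a single-channel EEG epoch (a common evoked-potential approach)."""
    windowed = eeg * np.hanning(len(eeg))          # taper to reduce leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Spectral magnitude at the bin nearest each candidate flicker rate
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in target_freqs]
    return target_freqs[int(np.argmax(powers))]

# Synthetic example: a 12 Hz evoked response buried in noise
np.random.seed(0)
fs = 250                                           # assumed sampling rate (Hz)
t = np.arange(0, 2, 1.0 / fs)                      # 2-second epoch
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * np.random.randn(len(t))
print(classify_ssvep(eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # → 12.0
```

A real system would use multiple channels and harmonics, but the core decision, mapping a spectral peak back to the target the user is attending to, has this shape.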
“This work reports fundamental strategies to design an ergonomic, portable EEG system for a broad range of assistive devices, smart home systems and neuro-gaming interfaces,” said Woon-Hong Yeo, an assistant professor at Georgia Tech. “The primary innovation is in the development of a fully integrated package of high-resolution EEG monitoring systems and circuits within a miniaturized skin-conformal system.”
The package consists of a headband with dry electrodes that contact the scalp even through hair; a nanomembrane electrode placed just under the skin; flexible electronics for power and control; and a deep-learning neural network running on those electronics to interpret the signals.
“Deep learning methods, commonly used to classify pictures of everyday things such as cats and dogs, are used to analyze the EEG signals,” said Chee Siang (Jim) Ang, senior lecturer in Multimedia/Digital Systems at the University of Kent. “Like pictures of a dog which can have a lot of variations, EEG signals have the same challenge of high variability. Deep learning methods have proven to work well with pictures, and we show that they work very well with EEG signals as well.”
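The article does not disclose the network architecture, so the following is only a minimal sketch of the general approach in PyTorch: a small 1-D convolutional network that maps a multi-channel EEG epoch to one of several flicker targets. The channel count, epoch length, layer sizes, and number of classes are all illustrative assumptions, not the authors' design.

```python
import torch
import torch.nn as nn

class TinyEEGNet(nn.Module):
    """Toy 1-D CNN mapping a multi-channel EEG epoch to a class score
    (e.g. which flicker target the user is attending to).
    All shapes are hypothetical, for illustration only."""
    def __init__(self, n_channels=8, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3),  # temporal filters
            nn.ReLU(),
            nn.MaxPool1d(4),                                      # downsample in time
            nn.Conv1d(16, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                              # pool over time
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):              # x: (batch, channels, samples)
        return self.classifier(self.features(x).squeeze(-1))

model = TinyEEGNet()
epoch = torch.randn(1, 8, 250)         # one assumed 1-second epoch at 250 Hz
logits = model(epoch)                  # logits.shape == (1, 4)
```

Trained on labeled epochs, such a network learns the discriminative temporal patterns directly from raw signals, which is what lets it absorb the high trial-to-trial variability Ang describes.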
A built-in Bluetooth chip provides wireless communication, allowing the system to connect easily to a wide variety of devices.
Written by: Medgadget Editors
Legacy MedSearch has more than 30 years of combined experience recruiting in the medical device industry. We pride ourselves on our professionalism and on communicating quickly and honestly with all parties in the hiring process. Our clients include both blue-chip companies and innovative startups within the MedTech space. Over the past 10 years, we have built one of the strongest networks of device professionals, spanning sales, marketing, research & development, quality & regulatory, project management, field service, and clinical affairs.
We offer a variety of different solutions for hiring managers depending on the scope and scale of each individual search. We craft a personalized solution for each client and position with a focus on attracting the best possible talent in the shortest possible time frame.
Are you hiring?
Contact us to discuss partnering with Legacy MedSearch on your position.