April 13, 2021

MIT Project Helps ALS Patients Communicate Using Sensors


Researchers at MIT (Cambridge, Mass.) have developed thin, wearable, piezoelectric sensors that can reliably decode facial strains and predict facial kinematics.

The technology is being tested on patients suffering from amyotrophic lateral sclerosis (ALS, also known as motor neurone disease, or MND), who, as the disease progresses, lose the ability to control their muscles and, with it, the ability to communicate.

The sensors can be easily attached to a patient's face, where they measure minuscule movements such as a smile or a twitch. This lets patients communicate more naturally, without the bulky, expensive, and often hard-to-operate equipment that is the norm today.

ALS communication. (Image source: MIT)

The device then interprets these movements. The sensors are soft and malleable, and can be made almost invisible. Perhaps even more importantly, the researchers say that because all of the components are easy to manufacture in volume, the device could cost as little as about $10.

Canan Dagdeviren, the LG Electronics Career Development Professor of Media Arts and Sciences — who is leading the project at MIT's Conformable Decoders group — said the technology promises significant advantages over existing methods, which rely on measuring the electrical activity of the nerves that control facial muscles.

“These devices are very hard, planar, and boxy, and reliability is a big issue. You may not get consistent results, even from the same patients within the same day,” said Dagdeviren.

Other groups involved in the project include scientists from the University at Buffalo and the A*STAR Institute of Microelectronics in Singapore.

The initial results of the work were reported in a recent edition of Nature Biomedical Engineering, where the researchers suggest the process is 75% accurate at distinguishing between three facial expressions: smile, pursed lips, and open mouth.

Early work focused on testing two ALS patients (one female and one male, for gender balance), and the reported results suggest the devices can accurately distinguish among these three facial expressions.

The device consists of four piezoelectric sensors embedded in a thin silicone film. The sensors, made from aluminium nitride, can detect mechanical deformation of the skin and convert it into an electrical voltage that can be easily measured.

(Image source: MIT)

The researchers used a process called digital image correlation on healthy volunteers to help them select the most useful locations to place the sensors.

They painted a random black-and-white speckle pattern on the face and then took numerous images of the area with multiple cameras as the subjects performed facial motions such as smiling, twitching the cheek, or mouthing the shape of certain letters. The images were then processed by software that analyzes exactly how the tiny dots move relative to one another, and thus determines the strain experienced in each area.
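The core calculation behind digital image correlation can be illustrated with a small sketch: track how the painted speckle dots move between a reference image and a deformed image, then compute the local engineering strain. The dot coordinates below are invented for illustration, not data from the study.

```python
import numpy as np

# Reference and deformed positions of three speckle dots along one line (mm).
# These values are made-up placeholders, not measurements from the MIT work.
reference = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 0.0]])
deformed = np.array([[0.0, 0.0], [10.2, 0.0], [20.5, 0.0]])  # after a smile

def engineering_strain(ref, cur):
    """Strain in each dot-to-dot segment: (new length - old length) / old length."""
    ref_len = np.linalg.norm(np.diff(ref, axis=0), axis=1)
    cur_len = np.linalg.norm(np.diff(cur, axis=0), axis=1)
    return (cur_len - ref_len) / ref_len

strains = engineering_strain(reference, deformed)
print(strains)  # per-segment strain, e.g. 2% and 3% stretch
```

Real DIC software does this over a dense grid of subsets in two dimensions, but the principle — relative dot displacement divided by original spacing — is the same.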

In this way, a library of phrases or words could be created that correspond to different combinations of movements, the researchers said.

“We can create customizable messages based on the movements that you can do,” said Dagdeviren.

“You can technically create thousands of messages that right now no other technology can provide. It all depends on your library configuration, which can be designed for a particular patient or group of patients.”
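The "library" idea described above can be sketched as a simple lookup from combinations of detected movements to customizable messages. The movement names and phrases here are illustrative assumptions, not the researchers' actual configuration.

```python
# Hypothetical message library: tuples of detected movements map to phrases.
# A real deployment would be configured per patient, as Dagdeviren describes.
message_library = {
    ("smile",): "Yes",
    ("open_mouth",): "No",
    ("pursed_lips",): "I need help",
    ("smile", "open_mouth"): "I'm hungry",
}

def decode(movements):
    """Look up the phrase for a sequence of detected facial movements."""
    return message_library.get(tuple(movements), "<unrecognized>")

print(decode(["smile"]))                # "Yes"
print(decode(["smile", "open_mouth"]))  # "I'm hungry"
```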

The data from the sensor is then sent to a hand-held processing unit, which analyzes it using the algorithm that the researchers trained to distinguish between facial movements.
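The article does not specify which algorithm the handheld unit runs, so the following is only a hedged sketch of one plausible approach: a nearest-centroid classifier that maps a four-sensor voltage reading to one of the three expressions reported in the paper. The centroid values are invented for illustration.

```python
import numpy as np

# Assumed per-expression "average" readings for the four sensors (volts).
# These numbers are placeholders, not values from the Nature BME paper.
centroids = {
    "smile":       np.array([0.8, 0.1, 0.6, 0.2]),
    "pursed_lips": np.array([0.2, 0.9, 0.1, 0.7]),
    "open_mouth":  np.array([0.5, 0.4, 0.9, 0.9]),
}

def classify(reading):
    """Return the expression whose centroid is closest to the 4-sensor reading."""
    return min(centroids, key=lambda k: np.linalg.norm(reading - centroids[k]))

print(classify(np.array([0.75, 0.15, 0.55, 0.25])))  # near the "smile" centroid
```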

The researchers say in the current prototype, the unit is wired to the sensor, but in the future the connection could also be made wireless for easier use.

One member of the MIT research team, Farita Tasnim, was inspired to develop the sensor after meeting Professor Stephen Hawking, who had lived with MND since his early 20s and, toward the end of his life, relied on an infrared sensor that detected movements of his cheek to direct a cursor across a computer screen. This, Tasnim noted, was a difficult, frustrating, and time-consuming process that required a lot of bulky equipment, and she was determined to devise something better.

“There are a lot of clinical trials that are testing whether or not a particular treatment is effective for reversing ALS,” said Tasnim. “Instead of just relying on the patients to report that they feel better or feel stronger, this device could give a quantitative measure to track the effectiveness.”

The MND Association is a charity focused on improving access to care, research, and campaigning for people affected by motor neurone disease.

The post MIT Project Helps ALS Patients Communicate Using Sensors appeared first on EETimes.

