Researchers believe that the coronavirus changes how humans produce sounds, even when they are asymptomatic.
Gulf Today Report
Artificial intelligence can now tell whether people have the coronavirus simply by listening to their coughs.
The difference between the cough of someone with coronavirus and that of a healthy individual is not discernible to the human ear, but it can be ‘heard’ by a machine learning algorithm.
Researchers from MIT collected thousands of samples of coughs and spoken words to train the artificial intelligence, which can now identify people with COVID-19 with 98.5 per cent accuracy.
Even among those who tested positive for the coronavirus but showed no symptoms, the artificial intelligence detected every case.
Before the pandemic, researchers were using similar technologies to detect signs of Alzheimer’s disease.
While Alzheimer’s is best known for harming memory, it also weakens the vocal cords.
The neural network ResNet50 — an algorithm designed to work in a way loosely modelled on the human brain — was trained to discriminate sounds with different degrees of vocal cord strength.
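As a rough illustration of that setup, the sketch below adapts an off-the-shelf ResNet50 so it can classify spectrogram “images” of audio by vocal cord strength; the number of strength classes and the input handling are assumptions for illustration, not details from the study.

```python
# Illustrative sketch only: adapt torchvision's ResNet50 backbone to classify
# audio spectrograms by vocal cord strength. The class count and dummy input
# are assumptions, not the researchers' actual configuration.
import torch
import torch.nn as nn
from torchvision.models import resnet50

num_strength_classes = 3          # assumed: e.g. weak / medium / strong vocal cords
model = resnet50(weights=None)    # same backbone family the article names

# Replace the final ImageNet layer with a head for the strength classes.
model.fc = nn.Linear(model.fc.in_features, num_strength_classes)

# ResNet50 expects 3-channel images, so a single-channel mel spectrogram is
# repeated across channels before being passed through the network.
spectrogram = torch.randn(1, 1, 224, 224)          # dummy spectrogram "image"
logits = model(spectrogram.repeat(1, 3, 1, 1))     # shape: (1, num_strength_classes)
print(logits.shape)
```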
Two more neural networks were trained to detect emotions in speech, such as frustration, happiness, and calmness, as well as to detect changes in lung and respiratory performance from coughs.
Combining all three of these models, together with an algorithm that detects muscular degradation, gave the researchers an artificial intelligence model that could identify Alzheimer’s samples — and one that could be adapted to diagnose COVID-19.
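A minimal sketch of how the outputs of those models might be combined into a single score is shown below; the four biomarker names come from the article, but the weighting scheme, score ranges, and function names are illustrative assumptions rather than the MIT team’s code.

```python
# Sketch only: fuse per-biomarker scores from hypothetical upstream models
# (vocal cord strength, sentiment, lung/respiratory performance, muscular
# degradation) into one COVID-19 likelihood. Weights are illustrative.
import numpy as np

def combine_biomarkers(vocal_cord_score: float,
                       sentiment_score: float,
                       respiratory_score: float,
                       muscular_degradation_score: float,
                       weights=(0.25, 0.25, 0.25, 0.25)) -> float:
    """Weighted average of the four biomarker scores, each assumed to lie in [0, 1]."""
    scores = np.array([vocal_cord_score, sentiment_score,
                       respiratory_score, muscular_degradation_score])
    return float(np.dot(scores, np.array(weights)))

# Example: scores produced by the (hypothetical) upstream models for one cough sample.
likelihood = combine_biomarkers(0.8, 0.6, 0.9, 0.7)
print(f"COVID-19 likelihood: {likelihood:.2f}")  # flagged as positive above a chosen threshold
```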
“The sounds of talking and coughing are both influenced by the vocal cords and surrounding organs. This means that when you talk, part of your talking is like coughing, and vice versa.
“It also means that things we easily derive from fluent speech, AI can pick up simply from coughs, including things like the person’s gender, mother tongue, or even emotional state,” said co-author Brian Subirana, a research scientist in MIT’s Auto-ID Laboratory.
“There’s in fact sentiment embedded in how you cough, so we thought, why don’t we try these Alzheimer’s biomarkers (to see if they’re relevant) for Covid.”
The researchers collected “the largest research cough dataset that we know of,” Subirana said.
Approximately 2,500 samples were from confirmed coronavirus patients.
These samples, along with 1,500 others, were used to train the model.
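To give a sense of what training on such a dataset involves, here is a hedged sketch of holding out part of the labelled coughs for evaluation; the sample counts follow the article, while the feature representation and split ratio are assumptions rather than the researchers’ pipeline.

```python
# Sketch only: split labelled cough samples into training and evaluation sets,
# keeping the positive/negative ratio the same in both (a stratified split).
import numpy as np
from sklearn.model_selection import train_test_split

n_positive, n_other = 2500, 1500                        # counts reported in the article
features = np.random.randn(n_positive + n_other, 128)   # stand-in for per-cough audio features
labels = np.array([1] * n_positive + [0] * n_other)     # 1 = confirmed COVID-19 cough

X_train, X_val, y_train, y_val = train_test_split(
    features, labels, test_size=0.2, stratify=labels, random_state=0)

print(f"train: {len(y_train)} samples, val: {len(y_val)} samples")
print(f"positive share in val: {y_val.mean():.2f}")     # roughly matches the full dataset
```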
The researchers found that four biomarkers — vocal cord strength, sentiment, lung and respiratory performance, and muscular degradation — are specific to COVID-19, which they believe means the coronavirus changes how humans produce sounds, even when they are asymptomatic.
The researchers are working on incorporating the findings into an app, which would be subject to approval by the United States Food and Drug Administration before release.
If successful, users could cough into the phone and instantly get information about their COVID-19 status.
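If such an app were built as a thin client talking to a server-side model, the flow could look roughly like the sketch below; the endpoint, the `predict_covid_likelihood` helper, and the response format are all hypothetical and not part of the researchers’ published work.

```python
# Hypothetical sketch of a server endpoint an app could call: the phone uploads
# a cough recording and receives a screening score back. Not an approved diagnostic.
from flask import Flask, request, jsonify

app = Flask(__name__)

def predict_covid_likelihood(audio_bytes: bytes) -> float:
    """Placeholder for the trained model; returns a score in [0, 1]."""
    return 0.0  # a real implementation would run the cough through the model

@app.route("/screen", methods=["POST"])
def screen():
    recording = request.files["cough"].read()   # audio file uploaded by the app
    score = predict_covid_likelihood(recording)
    return jsonify({"covid19_likelihood": score})

if __name__ == "__main__":
    app.run()
```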
“The effective implementation of this group diagnostic tool could diminish the spread of the pandemic if everyone uses it before going to a classroom, a factory, or a restaurant,” Subirana said.