New algorithm analyses tongue to predict diabetes, stroke with 98 pc accuracy
New Delhi, Aug 13 (IANS) Researchers have developed a novel computer algorithm that can predict diseases such as diabetes and stroke with 98 per cent accuracy simply by analysing the colour of the human tongue.
The imaging system developed by Middle Technical University (MTU) and the University of South Australia (UniSA) in Australia can diagnose conditions such as diabetes, stroke, anaemia, asthma, liver and gallbladder issues, Covid-19, and other vascular and gastrointestinal diseases.
“The colour, shape, and thickness of the tongue can reveal a litany of health conditions,” said Ali Al-Naji, adjunct Associate Professor at MTU and UniSA.
“Typically, people with diabetes have a yellow tongue; cancer patients a purple tongue with a thick greasy coating; and acute stroke patients present with an unusually shaped red tongue,” he added.
The breakthrough was achieved through a series of experiments using 5,260 images to train machine-learning algorithms to detect tongue colour.
Researchers received 60 tongue images from two teaching hospitals in the Middle East, representing patients with diverse health conditions. The AI model matched tongue colour with the correct disease in nearly all cases.
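The researchers have not published their training code, so the following is only a rough, hedged sketch of the general approach the article describes: training a machine-learning classifier on colour features and checking it on held-out examples. The synthetic data, colour prototypes, class labels and choice of model below are assumptions for illustration, not the MTU/UniSA pipeline.

```python
# Hedged sketch: NOT the MTU/UniSA system. It only illustrates the general idea of
# training a classifier that maps tongue-colour features to condition labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical colour prototypes (mean RGB of the tongue region) per condition,
# loosely following the colour descriptions quoted in the article.
prototypes = {
    "diabetes (yellowish)": (200, 180, 80),
    "cancer (purplish)":    (130, 60, 130),
    "acute stroke (red)":   (200, 50, 50),
    "anaemia (whitish)":    (220, 200, 195),
}

# Synthesise illustrative feature vectors by adding noise around each prototype.
X, y = [], []
for label, rgb in prototypes.items():
    X.append(rng.normal(loc=rgb, scale=15, size=(300, 3)))
    y.extend([label] * 300)
X = np.vstack(X)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In the real study the features would come from thousands of labelled tongue photographs rather than synthetic colour samples; the sketch only shows the train-then-validate pattern the article reports.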
The paper published in Technologies describes how the system analyses tongue colour to provide real-time diagnoses, demonstrating that AI can advance medical practices significantly.
Al-Naji explained that AI is replicating a 2,000-year-old technique from traditional Chinese medicine, where the tongue’s colour, shape, and thickness are used to diagnose health issues.
Beyond the examples above, a white tongue can indicate anaemia, a deep red tongue is associated with severe Covid-19, and an indigo or violet tongue suggests vascular or gastrointestinal problems or asthma.
The study used cameras placed 20 centimetres from a patient to capture tongue colour, and the imaging system predicted health conditions in real time.
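The acquisition setup is only described at a high level: a camera about 20 centimetres from the patient, with the tongue colour analysed in real time. Purely as an illustration of that idea, the hedged sketch below grabs a single webcam frame with OpenCV, averages the colour of a central crop assumed to contain the tongue, and matches it to the nearest of a few hypothetical colour prototypes. None of the prototypes, region choices or matching rules come from the paper.

```python
# Hedged sketch: NOT the published imaging system. It only shows how a camera frame
# could be reduced to a mean tongue colour and matched to the nearest colour prototype.
import cv2
import numpy as np

# Hypothetical colour prototypes (mean RGB), loosely following the article's descriptions.
PROTOTYPES = {
    "diabetes (yellowish)": np.array([200, 180, 80]),
    "acute stroke (red)":   np.array([200, 50, 50]),
    "anaemia (whitish)":    np.array([220, 200, 195]),
}

def nearest_condition(mean_rgb: np.ndarray) -> str:
    """Return the prototype label closest to the observed mean colour."""
    return min(PROTOTYPES, key=lambda k: np.linalg.norm(PROTOTYPES[k] - mean_rgb))

cap = cv2.VideoCapture(0)          # camera assumed to be roughly 20 cm from the subject
ok, frame = cap.read()
cap.release()

if ok:
    h, w = frame.shape[:2]
    # Central region assumed (for illustration) to contain the tongue.
    crop = frame[h // 3: 2 * h // 3, w // 3: 2 * w // 3]
    mean_rgb = cv2.cvtColor(crop, cv2.COLOR_BGR2RGB).reshape(-1, 3).mean(axis=0)
    print("mean tongue-region colour (RGB):", mean_rgb.round(1))
    print("nearest colour prototype:", nearest_condition(mean_rgb))
```

A production system would segment the tongue properly and use a trained classifier rather than nearest-prototype matching; the sketch is only meant to make the capture-then-classify loop concrete.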
Co-author UniSA Professor Javaan Chahl noted that this technology could eventually be adapted for use with smartphones, making disease screening more accessible.
–IANS
ts/rvt/