Investigating the links between facial recognition and Alzheimer’s disease
In recent years, Alzheimer’s disease has been on the rise throughout the world, yet it is rarely diagnosed at an early stage, when its progression can still be slowed. Using artificial intelligence, KTU researchers conducted a study to determine whether human-computer interfaces could be adapted to help people with memory impairments recognize a visible object in front of them.
Rytis Maskeliūnas, a researcher at the Department of Multimedia Engineering at Kaunas University of Technology (KTU), notes that classifying the information visible on a face is an everyday human task: “While communicating, the face ‘tells’ us the context of the conversation, especially from an emotional point of view, but can we identify visual stimuli based on brain signals?”
The visual processing of the human face is complex. When we analyze a face, we perceive information such as the person’s identity and emotional state. The aim of the study was to analyze a person’s ability to process contextual information from the face and to detect how the person responds to it.
The face can indicate the first symptoms of the disease
According to Maskeliūnas, many studies demonstrate that brain diseases can potentially be analyzed by examining facial muscle and eye movements, since degenerative brain disorders affect not only memory and cognitive functions but also the cranial nerves that control facial and, especially, eye movements.
Dovilė Komolovaitė, a graduate of the KTU Faculty of Mathematics and Natural Sciences who co-authored the study, explained that the research examined whether a patient with Alzheimer’s disease processes visible faces in the brain in the same way as individuals without the disease.
“The study uses data from an electroencephalograph, which measures the electrical impulses in the brain,” says Komolovaitė, who is currently studying in the Artificial Intelligence master’s degree program at the Faculty of Informatics.
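For readers curious about what such data looks like in practice, the sketch below is purely illustrative and not the study’s actual pipeline: it loads an EEG recording and band-pass filters it with the open-source MNE-Python library. The file name and filter band are assumptions, not details taken from the research.

```python
# Illustrative sketch only: load an EEG recording and remove slow drift and
# high-frequency noise. The file name and filter band are assumptions.
import mne

raw = mne.io.read_raw_fif("participant_01_raw.fif", preload=True)  # hypothetical file
raw.pick("eeg")                       # keep only the EEG channels
raw.filter(l_freq=1.0, h_freq=40.0)   # band-pass commonly used for ERP-style analyses
print(raw.info)                       # sampling rate, channel names, etc.
```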
In this study, the experiment was performed on two groups of individuals: one healthy and one affected by Alzheimer’s disease.
“The brain signals of a person with Alzheimer’s are typically significantly noisier than those of a healthy person,” says Komolovaitė, emphasizing that this is consistent with the difficulty people experiencing the symptoms of Alzheimer’s have in focusing and remaining attentive.
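One simple way to picture what “noisier” means here is to compare how much individual trials deviate from the average response in each group. The sketch below uses placeholder arrays rather than the study’s data, and the noise measure is only one of many possible choices.

```python
# Illustrative sketch only: quantify trial-to-trial variability around the
# average evoked response. The arrays are placeholders, not the study's data.
import numpy as np

def residual_noise(epochs):
    """epochs: array of shape (n_trials, n_channels, n_samples)."""
    evoked = epochs.mean(axis=0)   # average response across trials
    residual = epochs - evoked     # what remains is trial-to-trial "noise"
    return residual.std()          # one scalar summary of that noise

rng = np.random.default_rng(0)
healthy = rng.normal(scale=1.0, size=(60, 32, 256))     # placeholder group data
alzheimers = rng.normal(scale=1.8, size=(60, 32, 256))  # placeholder, noisier by construction

print("healthy noise:    ", residual_noise(healthy))
print("Alzheimer's noise:", residual_noise(alzheimers))
```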
Photos of people’s faces were shown during the study
The researchers selected a group of older participants, all women over 60 years of age: “Older age is one of the main risk factors for dementia, and since the effects of gender were noticed in brain waves, the study is more accurate when only one gender group is chosen.”
During the study, each participant took part in experiments lasting up to an hour, during which photos of human faces were shown. According to the researcher, these photos were selected according to several criteria: to analyze the influence of emotions, neutral and fearful faces were shown, while to analyze the familiarity factor, photos of people known to the participants and of randomly chosen strangers were presented.
In order to check whether a participant saw and understood a face correctly, they were asked to press a button after each stimulus to indicate whether the face shown was inverted or upright.
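Assuming the stimulus onsets were recorded as trigger events alongside the EEG (the article does not describe the exact setup), the recording could be cut into short, condition-labelled segments around each face presentation, roughly as sketched below. The event codes and time window are illustrative assumptions.

```python
# Hedged sketch: cut continuous EEG into epochs around each face presentation,
# labelled by condition. Event codes and timing are illustrative, not the study's.
import mne

raw = mne.io.read_raw_fif("participant_01_raw.fif", preload=True)  # hypothetical file
events = mne.find_events(raw)  # trigger channel marks each photo onset

# Hypothetical mapping from trigger codes to the conditions described above.
event_id = {"neutral/familiar": 1, "neutral/unknown": 2,
            "fearful/familiar": 3, "fearful/unknown": 4}

epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.8,   # 200 ms before to 800 ms after each face
                    baseline=(None, 0),    # baseline-correct on the pre-stimulus window
                    preload=True)
X = epochs.get_data()    # (n_trials, n_channels, n_samples), ready for a classifier
y = epochs.events[:, 2]  # condition label per trial
```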
“Even at this stage, an Alzheimer’s patient makes mistakes, so it is important to determine whether the impaired recognition of the object is due to memory or to visual processes,” says the researcher.
Inspired by real-life interactions with Alzheimer’s patients
Maskeliūnas reveals that his work on Alzheimer’s disease began with a collaboration with the Huntington’s Disease Association, which opened his eyes to what neurodegenerative diseases really look like.
The researcher also had direct contact with Alzheimer’s patients: “I saw that the diagnosis is usually confirmed too late, when the brain is already irreversibly damaged. Although there is no effective cure for this disease, its progression can be slowed down, gaining some healthy years of life.”
Today, we can see how human-computer interaction is being adapted to ease the lives of people with physical disabilities. Controlling a robotic hand by “thought” or a paralyzed person writing text by imagining letters is not a new concept. Still, trying to understand the human brain is probably one of the most challenging tasks remaining today.
In this study, the researchers worked with data from standard electroencephalograph equipment. However, Maskeliūnas emphasizes that, in order to create a practical tool, it would be better to use data gathered from invasive microelectrodes, which can measure the activity of neurons more accurately and would substantially increase the quality of the AI model.
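The article does not name the model architecture used, so as a purely illustrative baseline, decoding could start from something as simple as a scikit-learn pipeline applied to flattened epochs; placeholder data stands in for the real recordings below.

```python
# Hedged baseline sketch, not the study's actual model: test whether the
# stimulus condition can be decoded from epoched EEG at all.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32, 256))   # placeholder epochs: trials x channels x samples
y = rng.integers(0, 2, size=200)      # placeholder labels, e.g. familiar vs unknown

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X.reshape(len(X), -1), y, cv=5)
print("decoding accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```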
“Of course, in addition to the technical requirements, there should be a community environment focused on making life easier for people with Alzheimer’s disease. Still, in my personal opinion, in five years’ time I think we will still see technologies focused on improving physical function, and the focus on people affected by brain diseases in this field will only come later,” says Maskeliūnas.
According to the master’s student Komolovaitė, a clinical examination carried out with the help of colleagues in the field of medicine is necessary, and this stage of the process would take a lot of time: “If we want to use this test as a medical tool, a certification process is also needed.”