Even doctors need a second opinion sometimes, especially when diagnosing a patient with a condition that closely resembles other types of disorders. Jamileh Yousefi, a PhD student in the School of Computer Science, is developing an algorithm based on patient data that allows computers to help doctors distinguish between certain types of neuromuscular disorders.
“For some diseases, uncertainty causes incorrect diagnosis by doctors,” she says, “and in some domains, there is not enough knowledge available.” Diagnosing a patient with myopathy or neuropathy can be difficult because the symptoms are similar, she adds. Myopathies are a class of diseases that attack muscle tissue; neuropathies attack the nervous tissue controlling the muscles.
“We are using recently available quantitative measures for our diagnostic work, combined with artificial intelligence for this diagnosis,” says Yousefi.
Drawing on her background in artificial intelligence, she is developing a clinical decision support system, a program designed to help doctors diagnose patients.
The standard diagnostic tool for neuromuscular disorders is electromyography (EMG), which examines data produced by the electrical impulses in contracting muscle tissue. For Yousefi’s research, the signals are transformed into a quantitative format, such as a table, and the computer is programmed to analyze seven features extracted from the signals, including amplitude, number of phases and spike duration. Most doctors, she says, are not trained to interpret this type of quantitative EMG data.
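To picture what that quantitative format might look like, the short Python sketch below lays out one signal as a row of named features. The article mentions only amplitude, number of phases and spike duration; the other feature names here are hypothetical placeholders, not the ones actually used in Yousefi's work.

    # A minimal sketch of how quantitative EMG features might be organized for analysis.
    # Only amplitude, number of phases and spike duration come from the article; the
    # remaining feature names are hypothetical placeholders.

    from dataclasses import dataclass

    @dataclass
    class MotorUnitPotentialFeatures:
        amplitude_uv: float          # peak-to-peak amplitude, microvolts
        num_phases: int              # number of phases in the waveform
        spike_duration_ms: float     # spike duration, milliseconds
        duration_ms: float           # hypothetical: overall potential duration
        num_turns: int               # hypothetical: number of turns
        area_uv_ms: float            # hypothetical: area under the waveform
        area_amplitude_ratio: float  # hypothetical: area-to-amplitude ratio

    # One row of the "table" the computer analyzes, with invented values.
    example = MotorUnitPotentialFeatures(
        amplitude_uv=850.0, num_phases=3, spike_duration_ms=4.2,
        duration_ms=11.5, num_turns=4, area_uv_ms=1900.0, area_amplitude_ratio=2.2)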
Using signals from patients already diagnosed with myopathy or neuropathy as training examples, the system uses artificial intelligence to learn certain “rules,” which the clinical decision support system then applies when making a diagnosis.
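To make the idea of learning rules from examples concrete, here is a minimal Python sketch of one way a readable threshold rule could be induced from labelled signals. It is an illustrative decision-stump approach with invented numbers, not Yousefi's actual algorithm.

    # A minimal sketch of learning a human-readable threshold rule from labelled
    # training signals. Illustrative only: the data and the stump-style rule
    # induction are assumptions, not the system described in the article.

    def learn_threshold_rule(samples, feature):
        """Pick the threshold on one feature that best separates the two labels."""
        values = sorted(s[feature] for s in samples)
        best = None
        for i in range(len(values) - 1):
            threshold = (values[i] + values[i + 1]) / 2.0
            # Count how often "feature >= threshold" agrees with a neuropathy label.
            correct = sum(
                (s[feature] >= threshold) == (s["label"] == "neuropathy")
                for s in samples)
            accuracy = correct / len(samples)
            if best is None or accuracy > best[1]:
                best = (threshold, accuracy)
        threshold, accuracy = best
        rule = f"IF {feature} >= {threshold:.1f} THEN neuropathy ELSE myopathy"
        return rule, accuracy

    # Invented training data for illustration.
    training = [
        {"amplitude_uv": 300, "label": "myopathy"},
        {"amplitude_uv": 420, "label": "myopathy"},
        {"amplitude_uv": 1500, "label": "neuropathy"},
        {"amplitude_uv": 2100, "label": "neuropathy"},
    ]

    rule, accuracy = learn_threshold_rule(training, "amplitude_uv")
    print(rule, f"(training accuracy {accuracy:.0%})")

Because the output is a plain IF–THEN statement, a doctor can read exactly why the system leaned toward one diagnosis, which is the transparency Yousefi contrasts with black-box models.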
Although other diagnostic tools are available, there is room for improvement. Tools based on neural network models act more like “black boxes,” recommending a diagnosis without explaining to the doctor how they reached that conclusion, says Yousefi. Such a system’s accuracy can exceed 90 per cent, she adds, “but the doctor doesn’t know what’s the reasoning behind this decision.”
Other tools are in development, but are still quite complex. Yousefi says the system she’s working on is more transparent because it makes its diagnosis based on rules it has learned from training data and explains its diagnosis to the doctor.
She describes the algorithm as a combination of neural networks and fuzzy systems; the latter, she says, is ideal for dealing with uncertainty. “The algorithm we are writing can be generalized for other clinical decision support systems,” such as those for diagnosing other types of diseases, or for non-medical applications such as analyzing risk in the financial sector.
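A rough picture of why fuzzy systems suit uncertain measurements: a borderline amplitude does not have to fall cleanly on one side of a threshold, it can count as “high” to a degree between 0 and 1. The Python sketch below shows that idea with an illustrative membership shape and a hypothetical rule; the ranges, values and rule are assumptions, not Yousefi's system.

    # A minimal sketch of a fuzzy rule. Membership ranges, feature values and the
    # rule itself are illustrative assumptions, not the actual diagnostic system.

    def membership_high(value, low, high):
        """Degree (0..1) to which 'value' counts as 'high', ramping up from low to high."""
        if value <= low:
            return 0.0
        if value >= high:
            return 1.0
        return (value - low) / (high - low)

    def fuzzy_rule_neuropathy(amplitude_uv, spike_duration_ms):
        """IF amplitude is high AND spike duration is long THEN evidence for neuropathy."""
        high_amp = membership_high(amplitude_uv, low=500, high=2000)
        long_dur = membership_high(spike_duration_ms, low=3, high=8)
        return min(high_amp, long_dur)   # fuzzy AND = minimum of the two degrees

    # A borderline signal yields a graded answer rather than a hard yes/no,
    # and the doctor can see which rule fired and how strongly.
    print(fuzzy_rule_neuropathy(amplitude_uv=1200, spike_duration_ms=5.0))  # ~0.4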
Yousefi says computer-based decision support systems are not intended to replace human judgment but rather to enhance it. “We need human intuition” to interpret a computer’s results, she says. “A clinical decision support system can help you to make a decision, and at the end you decide if it’s good or not. It suggests or recommends decisions.”
Born in Iran, Yousefi has a master’s degree in artificial intelligence from Concordia University and studied software engineering at Carleton University. Her research is being supervised by Profs. Andrew Hamilton-Wright and Stefan Kremer in the School of Computer Science.