It probably surprises no one that artificial intelligence is now starting to play an important role in the diagnostic procedure of human diseases.
IBM recently showcased the ability of its supercomputer “Watson” to analyse medical images and suggest a diagnosis. Using what IBM calls “cognitive computing” along with natural language processing, Watson can look at medical images like x-rays, examine patient records and access scholarly articles as part of its diagnostic procedure. It has a digital library of 30 billion images that it can compare against a current patient’s x-rays as it makes its suggestions.
Artificial intelligence is already being used in other, more limited ways by technicians, but the new technology mimics the way that the human brain learns, so the computer becomes “trained” over time to make better and better recommendations.
What does this ultimately mean for patients?
On one hand, there’s a huge potential for improved patient care. Diagnosis can be faster and more precise, and the machine may pick up on subtle details that a human being misses. It may also help recommend the best course of treatment, statistically speaking.
On the other hand, the computer is a long way from making intellectual leaps as it pores over patient records and connects the right pieces of critical information that can be hidden in a patient’s history. And critics are quick to point out that artificial intelligence is only as good as the algorithms behind it.
There are possible legal complications as well. Right now, the failure to diagnose cancer or some other disease falls on the radiologist who misreads the MRI or x-ray films or the doctor who misreads the report and doesn’t examine the films. If a computer makes the diagnosis and recommendation for treatment and that diagnosis and treatment is wrong, who is ultimately responsible?
The odds are good that patients might be caught in a legal no-man’s land for a while, where neither the manufacturer nor the physician wants to take the blame for a misdiagnosis. The law may ultimately hold both responsible, dividing up liability in the traditional manner based on whether the error stemmed from a failure of the technology itself or a failure by those supervising the machines to double-check its results.
If you suffered an injury due to a radiologist’s error, an attorney can provide information on your legal options.
Source: digitalhealth.net, “How will artificial intelligence change radiology?,” Jon Hoeksma, March 01, 2017