To put it very simplistically, you show the network a large number of examples and then ask it to generate new content. A naive extension of this within radiology is that you show the network a large PACS database of images and associated reports, and then ask it to write new reports.
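To make the "learn from the archive" idea concrete, here is a deliberately crude sketch. It is not a real generative model: it stands in for the learning step with nearest-neighbour retrieval over hypothetical image feature vectors, and every vector and report below is invented for illustration.

```python
# Toy stand-in for "train on a PACS archive of images + reports":
# instead of a learned model, retrieve the report attached to the most
# similar archived image. All features and reports are made up.
import math

# Pretend PACS archive: (image feature vector, associated report)
archive = [
    ((0.9, 0.1, 0.0), "Clear lungs. No acute abnormality."),
    ((0.1, 0.8, 0.2), "Right lower lobe consolidation, consistent with pneumonia."),
    ((0.2, 0.1, 0.9), "6 mm calcified nodule, likely benign."),
]

def suggest_report(features):
    """Return the report attached to the nearest archived image."""
    _, report = min(archive, key=lambda pair: math.dist(pair[0], features))
    return report

print(suggest_report((0.15, 0.75, 0.25)))
```

A real system would replace both the hand-made feature vectors and the retrieval step with a trained deep network, but the input/output contract (image features in, report text out) is the same.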
At some point in the future, these systems will be better than humans and will not need supervision. But how far off in the future? It only takes a small mistake to cause significant risk in healthcare, so I believe it will be a long time before these systems are good enough to be autonomous. However, who would train for a profession that is definitely going to be automated? And here's the rub: a profession may end up having to be automated precisely because no one is training to do it.
I think one of the key tasks at the moment is to create a counter-argument that is more realistic about the short- to mid-term potential of AI systems: one that clearly states that the radiologist will still be in the driving seat for some time to come, and that AI systems will simply make their working lives easier and less stressful through blended workflows.
I should state that my research work involves the development of situated radiology workflow evaluation and modelling techniques to assess the value of blended workflows. I'm currently looking for funding, if there are any interested PACS vendors out there. (Sorry for the blatant plug.)
Simon, a very reasonable statement. There is an atmosphere approaching hysteria around AI in radiology at the moment and, like you, whilst I do not doubt its potential for a minute, I think some cooler, more pragmatic thinking is required. The NHS is desperate to squeeze AI companies for all it can get by selling our images to train their algorithms (I have my doubts; they'll just go elsewhere), and the College is equally keen to regulate its use, not unreasonably (https://www.rcr.ac.uk/clinical-radiology/service-delivery/artificial-intelligence), although it remains to be seen how effective this will be. Rather like outsourcing, I suspect cash-strapped Trusts will go for what they find useful and cost-effective rather than something with an RCR stamp on the side, especially if it already has a CE or FDA mark.

Regarding timelines, it's hard to predict when this will happen. AI has been around since the '50s and neural networks since the '80s, but it's only since GPUs came to the fore in 2012 - the ones that aren't being used to mine Bitcoin, that is - and TPUs etc. that the likes of DeepMind have been able to thrash grandmasters at Go. Interestingly (?), when I visited their HQ in St. Pancras before Christmas and spoke to their medical lead, he was at pains not to be drawn on when AI would significantly penetrate the UK health market (radiology included), saying more than once that there was still a long way to go.

In short, I agree that AI will definitely bring great benefits to the practice of radiology and will provide support for the routine 'heavy lifting' of imaging departments: appendicular plain-film work, probably the CXR, and a good first stab at CT, MR and Nuclear Medicine, not to mention scheduling, vetting etc. But will radiologists still be required? Absolutely! Will it solve the workforce crisis? Hopefully.
Will it take on (some of) the base load and free us up to do the stuff we enjoy most - interacting with our physician and surgical colleagues at MDTs etc. and reporting the tricky stuff (with decision support!)? I hope so too.
There is a whole other debate about AI accountability and what the public will come to expect over time but that's for another day...
posted on Thursday, February 01, 2018 - 12:15 pm
AI will come to radiology for 2 very good reasons:
1. We need it - everyone is struggling with reporting capacity.
2. There are no other good options for closing that gap.
How far will this technology go?
1. Assisting reporters with their work? Yes - we're already seeing that with lung nodule analysis, colons, ?breast etc.
2. Working completely autonomously (producing reports with no human intervention or oversight)? I think that will come, but I don't know anyone brave/wise enough to give a timescale. If the technical/political/ethical issues can be overcome, it feels like a natural progression of the technology.
Watching this space with great interest...
posted on Saturday, February 03, 2018 - 09:41 am
I agree with you all, Simon, Nick & John. I would love to see more AI adoption in the NHS, particularly for computer-vision-based detection and decision support. However, one needs to understand that, just as humans are imperfect, so too will AI-generated reports be.
Radiologists in the future will need to be taught to question the AI-detected abnormality or AI-generated report. They will need to treat AI-generated "reports" in the same way as an opinion from a colleague: they may choose to accept it, or have a valid reason to ignore it. By decision support I mean, for example, "lung nodule on CT likely to be benign" due to calcification within it. AI-generated reports, by contrast, will be detection with perhaps a tentative diagnosis. What they will lack is advice on the next step - e.g. referral to lung MDT or a respiratory clinic appointment if it is a "new suspicious" lung nodule.
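The gap described above - detection without next-step advice - could in principle be bridged by a rule layer on top of the AI output. A trivial, entirely hypothetical sketch (the findings and pathways below are invented for illustration, not clinical guidance):

```python
# Hypothetical rule layer that attaches the "next step" advice a bare
# AI detection output lacks. Findings and pathways are illustrative only.
def next_step(finding):
    if finding == "new suspicious lung nodule":
        return "Refer to lung MDT / respiratory clinic."
    if finding == "calcified lung nodule":
        return "Likely benign; no routine follow-up needed."
    return "No specific pathway; radiologist review."

print(next_step("new suspicious lung nodule"))
```

The point of the sketch is only that the detection output and the care-pathway advice are separable components; the radiologist remains the one who accepts or overrides both.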
Adoption of AI in the NHS requires managing the expectations of the public, politicians, NHS managers and also radiologists. Training of radiologists will require them to question an AI-generated first opinion rather than follow it blindly.
PACS vendors have huge amounts of image data and associated radiology reports. Computer engineers can readily use these datasets to build AI algorithms using computer vision and natural language processing.
We need to reduce the barrier to entry for computer-aided detection and decision support. One classic analogy: since 1995 I have seen computer-aided diagnosis appearing on ECG printouts. As a medic on the wards, I knew its huge limitations; however, it was a huge triage tool for the nurses. The same attitude needs to be applied to radiology CAD. If we expect 100% accuracy and someone/something to blame, we will prevent its adoption.
posted on Saturday, February 03, 2018 - 10:10 am
One of my major concerns is that vendors will pursue "sexy" implementations of AI - researching, purchasing and developing those aspects that market well - and that they will all chase the same niche market.
There are so many facets of what we do that could benefit - from the workflows as elegantly described above, to sophisticated voice recognition and reporting methodologies.
As an example: my Windows phone with Cortana, before I was lured to the Dark Side, could hear and transcribe my speech in a noisy moving car with high fidelity. Nuance should pay attention, as Dragon in a quiet office with the mic next to my mouth still glitches out.