AI in Healthcare – news picked by GLI /02

Introducing the second edition of “AI in Healthcare – curated news by GLI,” a collection of insightful posts exploring the most recent developments and emerging trends where artificial intelligence intersects with the healthcare industry. Through this series, we aim to present a thoughtfully selected blend of news and perspectives, carefully chosen by the Graylight Imaging (GLI) team.
AI in healthcare guidance? Lung cancer case
AI-derived CAD software shows potential in lung cancer screening. The NICE diagnostics guidance AI-derived computer-aided detection (CAD) software for detecting and measuring lung nodules in CT scan images [1], published on 5 July, highlights that such software can assist clinicians in finding and measuring lung nodules in CT scans as part of targeted lung cancer screening. AI-enhanced software can automatically identify and measure lung nodules on CT images, helping healthcare professionals decide on further investigation or surveillance. (If that sounds interesting, you may take a look at one of our case studies.)
Does this mean that AI in healthcare is starting a new chapter? Not necessarily. The guidance advises against using AI-derived CAD software for people having a chest CT scan because of symptoms suggesting they may have lung cancer, or for reasons unrelated to suspected lung cancer. The reason for this caution is the lack of sufficient evidence to recommend the software in such cases, as it may identify non-cancerous nodules and cause unnecessary anxiety in patients.
The insufficiency of definitive evidence is also reflected in public sentiment. According to a survey from February 2023 [2], a notable share of Americans express discomfort about the use of AI in healthcare. Six in ten U.S. adults say they would feel uneasy if their healthcare provider relied on artificial intelligence for tasks such as diagnosing diseases and suggesting treatments, while a smaller proportion (39%) say they would feel comfortable with such use of AI.

However, this anxiety may also stem from another aspect: legal regulation, which becomes especially visible in discussions of ChatGPT. Large Language Models represent a new frontier in the use of AI in healthcare, necessitating an entirely new regulatory framework to ensure their safety for medical use. The Medical Futurist discusses these unprecedented challenges and practical expectations in one of its recent posts [3].
Medical AI and cloud technology?
In the wake of these discussions of AI in healthcare, computing power should also be touched upon. A recent Signify report shows that the use of radiology IT on the public cloud is currently limited mostly to the academic and outpatient sectors [4]. Widespread adoption of the public cloud has not yet reached a tipping point, primarily because several significant barriers remain. The report outlines the main challenges facing the market: patient data policies, storage costs, a lack of real-world use cases of public cloud adoption, and more. It also explores the role of technology vendors in overcoming these obstacles and how collaboration across the vendor ecosystem can drive the adoption of cloud technology, not only in radiology but across medical imaging technology more broadly.
// Stay tuned for our upcoming posts and prepare to be inspired by the incredible possibilities that AI brings to the world of healthcare.
References:
[1] https://www.nice.org.uk/guidance/dg55
[3] https://medicalfuturist.com/why-and-how-to-regulate-chatgpt-like-large-language-models-in-healthcare