AI has been used in healthcare for a while now, often in ways that patients don’t see. For example, AI is used to optimize a hospital’s workforce and inventory, making sure the facility is properly staffed and supplied. However, McKinsey research reveals that the healthcare field is behind most other industries in adopting AI. It’s a highly fragmented industry, comprising thousands of hospitals. And the systems used in those hospitals have tended to be bespoke, which makes it difficult to copy an innovation from one care setting and paste it into another. Another issue has been data, which has been siloed within hospital walls rather than being transferable to other hospitals.
But generative AI (gen AI) changes several dynamics. Large language models are applicable to many, though not all, use cases. This is a paradigm shift because it allows developers to focus on making an existing algorithm work well rather than creating the algorithm in the first place. If most of your time is spent getting an off-the-shelf model to perform better, you can turn your attention to the quality of your data, to acquiring more data, and to making sure that users actually adopt gen AI. I think that’s where we’re seeing a lot of focus, and it’s exciting to watch that unlock value.
Here’s one example. In hospitals, data is contained in electronic medical records. This might be data from devices like implants or scans like MRIs and ultrasounds, or it might be data about prior encounters that patients have had in other hospital systems. There are third parties thinking about how we can anonymize, or “tokenize,” all this data so that it can be shared across hospital walls. One use would be to improve clinical trials by making it easier to find the patients who would be most likely to benefit from a particular trial. My prediction is that we’re going to go beyond improving operational tasks and start seeing gen AI used to support clinical decisions, which, up to now, we haven’t seen so much. That said, it will be essential to keep a human in the loop, since the technology can “hallucinate” and will need to be monitored.
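The “tokenization” idea mentioned above can be sketched in a few lines: a keyed hash turns a patient identifier into a stable but irreversible token, so two institutions can link records for the same patient without either one exposing the raw identifier. This is only an illustrative sketch, not any vendor’s actual scheme; the function name, the key, and the sample identifiers below are all hypothetical.

```python
import hmac
import hashlib

def tokenize_patient_id(patient_id: str, secret_key: bytes) -> str:
    """Derive a stable, irreversible token from a patient identifier.

    Institutions sharing the same secret key produce identical tokens
    for the same patient, so records can be linked across hospital
    walls without the raw identifier ever being exchanged.
    """
    return hmac.new(secret_key, patient_id.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical shared key, e.g. held by a trusted third party.
KEY = b"shared-secret-managed-by-a-trusted-third-party"

token_a = tokenize_patient_id("MRN-00123", KEY)  # hospital A's record
token_b = tokenize_patient_id("MRN-00123", KEY)  # hospital B's record
assert token_a == token_b                        # same patient links across sites
assert tokenize_patient_id("MRN-00999", KEY) != token_a  # distinct patients differ
```

In practice, real tokenization services layer on governance, key management, and re-identification safeguards, but the core matching mechanism is this kind of keyed, one-way transform.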