Have you heard all the rage in AI? ChatGPT has been making the rounds these last few weeks since it opened to the public to play around with. The website is pretty cool, but it took me a few days to get on and play around due to traffic. For last week’s blog post I decided to hop on the bandwagon and see if Artificial Intelligence could replace a content writer’s Human Intelligence (insert sarcastic comment). Don’t tell the boss I did this though; AI doesn’t have bills to pay. Seriously though, the other blog post last week was written by ChatGPT.
Whether it’s AI, ML, or some other letter combination for advanced computing, there is no doubt that language processing will find its way more and more into healthcare. How do I know this? Because AI told me so; go read the other post!
Integrating AI effectively into the clinical space can help nursing in a variety of ways. It can be used to aggregate information about a single patient’s current and past visits to help guide interventions. It can also be used to aggregate patient data from large populations to help study the effectiveness of new interventions versus past ones.
Maybe we use AI to write custom discharge instructions that are patient-specific and relevant to their care? Starting from a basic template, this could be a powerful way to drive better engagement and adherence to instructions. Current discharge instructions are often pages long, with information dispersed throughout a large packet; providing a succinct summary can make things much easier for patients.
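Just to make the idea concrete, here is a minimal sketch of what that could look like, assuming access to a large language model API (the OpenAI Python client is used purely as an example; the template, model name, and patient fields are all hypothetical, not anyone’s production workflow):

```python
# Hypothetical sketch: drafting patient-specific discharge instructions
# from a basic template plus visit details. All fields are illustrative.
from openai import OpenAI

client = OpenAI()  # assumes an API key is configured in the environment

TEMPLATE = """Write concise discharge instructions for the patient below.
Use plain language at about a 6th-grade reading level and keep it to one page.

Diagnosis: {diagnosis}
Medications: {medications}
Follow-up: {follow_up}
Warning signs to return to the ED: {warning_signs}
"""

visit = {
    "diagnosis": "community-acquired pneumonia",
    "medications": "amoxicillin 500 mg three times daily for 7 days",
    "follow_up": "primary care visit within 5 days",
    "warning_signs": "fever over 103F, worsening shortness of breath",
}

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": TEMPLATE.format(**visit)}],
)

draft = response.choices[0].message.content
print(draft)  # a clinician still reviews the draft before it reaches the patient
```

The model only does the assembly and plain-language rewriting; a clinician would still review and sign off on anything handed to a patient.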
ChatGPT was initially devised as a chatbot, and rudimentary chatbots can already be found handling patient intake for medical practices and urgent care, or guiding patients toward specialists. If we can better aggregate these sessions into a basic medical history, or offer guidance toward a potential diagnosis, there could be significant time savings on nursing triage alone.
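Again, just a sketch under the same assumptions (an LLM API, an invented intake transcript) of how a chat session might be condensed into something a triage nurse could scan quickly:

```python
# Hypothetical sketch: condensing a patient-intake chat transcript into a
# brief structured history for nursing triage. The transcript is made up.
from openai import OpenAI

client = OpenAI()

transcript = """Bot: What brings you in today?
Patient: I've had a sore throat and fever since Tuesday.
Bot: Any medication allergies?
Patient: Penicillin.
Bot: Are you taking anything for it now?
Patient: Just ibuprofen."""

prompt = (
    "Summarize this intake chat as a brief history for nursing triage. "
    "List: chief complaint, onset, current medications, allergies.\n\n" + transcript
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```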
Pretty cool stuff that has the potential to really improve healthcare by helping the clinician and the patient alike.
As with integrating any computer process automation, several concerns arise.
What is the reference source material? In the case of ChatGPT, it draws on large databases of published material, which can include unverified sources or opinion pieces. With any AI/ML system, knowing the source material it references will be important.
Is the output relevant? AI can get confused, just like me, but we often use slang terms that AI may not recognize, and this can lead to some weird output. The output definitely needs to be reviewed by professionals. In the other post there are a number of inaccuracies in the semantics, and those details do matter.
How will it be integrated? Is it clear to patients when they are interacting with AI versus a real human? If not, it can be upsetting and cause patients to lose faith in the care system.
How much do we rely on AI? FDA regulations are starting to address this in terms of decision support, but how far AI goes into making patient care and treatment decisions may fall into a gray area that is difficult to navigate and ultimately requires a trained professional to adjudicate.
Tido Inc can help your organization answer these questions and integrate or develop applications to effectively leverage AI and ML, improving clinical workflows and the care experience. Now let’s see how much holiday shopping has messed with my Amazon algorithms!