4 potential drawbacks of using ChatGPT in healthcare (and 5 ways it could be useful)


Since its launch in November last year, ChatGPT has attracted over 100 million users and generates 1.8 billion visits per month. Many are hailing the AI phenomenon as the future, but it hasn’t been well received in all industries.

Experts from Healthcare Transformers have warned of the potential dangers of ChatGPT, cautioning that many people may take their health into their own hands, and that the platform isn’t ready to handle complex topics such as healthcare.

Here, Simone Edelmann, an expert in future healthcare, breaks down 4 reasons ChatGPT shouldn’t be used in healthcare, and the instances where it may be useful. Read on to find out more…

1. ChatGPT could cause dangerous self-diagnosis

A recent study by Healthcare Transformers revealed that 97 per cent of people used the internet to search for their symptoms at least once before seeing a doctor or healthcare professional. With the introduction of ChatGPT, experts now warn that the public shouldn’t rely on the platform for this purpose, as doing so could compromise patient care.

“Patients want quick and accessible answers and a major issue will be patients going to ChatGPT to get their illnesses diagnosed and asking for treatment recommendations. The problem is that patient symptoms are complex and individual and putting this information into ChatGPT could lead to issues with patient safety and cause panic.

“As the platform is a large language model, the way information is pulled could contain biases and inappropriate responses.”

2. There’s confusion about liability and legal responsibility

An overarching problem with ChatGPT is liability: it is unclear whether the makers of the platform or the health system itself would be legally responsible for addressing problems with the information provided, or for privacy and security failures.

“With the limitations of ChatGPT come several ethical issues that need to be addressed before the AI-conversational tool becomes widely used within healthcare organisations.”

3. There are data privacy and security risks

Many businesses and even countries have started to ban the use of ChatGPT following data leaks and questions about its security.

“Another issue is data privacy and security. Technologists have not yet determined how ChatGPT could handle such sensitive data or usage of the data in the long term which could lead to serious issues in the future.”

4. The tool has too many limitations

Simone noted that “OpenAI, the developers of ChatGPT, have themselves listed several limitations of the tool that could hinder its utilisation in healthcare.”

The limitations listed by OpenAI include occasionally generating incorrect information, occasionally producing harmful or biased content, and having limited knowledge of events after the model’s training cut-off.

How could ChatGPT be safely used in healthcare?

Experts from Healthcare Transformers do, however, see ways ChatGPT could improve the healthcare industry, centred mainly on administration.

1. Improving scheduling and answering simple questions

“Now that patients speak with their healthcare providers from home, ChatGPT could help schedule appointments and provide automated responses to frequently asked questions. It could reduce admin for healthcare staff, who would not need to answer questions directly, while patients still receive the information they need in a timely manner.”

2. Improving clinical trial management and recruitment

Recruiting patients for clinical trials can be a significant challenge. Approximately 55% of trials are terminated due to a low accrual rate.

“A large reason is the need to screen patients in person. ChatGPT could address these issues by streamlining the recruitment process. More specifically, patients can interface with ChatGPT.”

3. Mental health education

Globally, depression is one of the leading causes of disability, according to the World Health Organisation. ChatGPT could be a useful resource for mental health education, providing self-coping strategies, offering information on mental health conditions, and helping people find mental health providers.

4. Medical writing and translation

“When recording patient visits and documentation, physicians use medical jargon in their clinical notes. ChatGPT can help translate this information into simple, conversational language so patients and caregivers can easily understand it, which would help reduce the workload for physicians.

ChatGPT could also help translate medical data from different languages, which would be a huge resource saver.”

5. Reducing patient waiting times

ChatGPT could potentially be used to carry out initial patient assessments that would typically be done by medical personnel, reducing human error while freeing physicians to complete other high-need tasks.

“ChatGPT has the potential to transform the healthcare system dramatically but ultimately, it is still in its infancy and is prone to several limitations and challenges.

As AI becomes more integrated, healthcare leaders need to become more aware of how these advanced technologies can support their organisations.”

By Matthew Neville, Senior Correspondent, Bdaily
