‘From taboo to tool’: almost three in 10 UK GPs use AI tools in patient consultations, study finds


Almost three in 10 GPs in the UK are using AI tools such as ChatGPT in consultations with patients, even though it could lead to them making mistakes and being sued, a study reveals.

The rapid adoption of AI to ease workloads is happening alongside a “wild west” lack of regulation of the technology, which is leaving GPs unaware of which tools are safe to use. That is the conclusion of research by the Nuffield Trust thinktank, based on a Royal College of GPs survey of 2,108 family doctors about AI and on focus groups with GPs.

Ministers hope that AI can help reduce the delays patients face in seeing a GP.

The study found that growing numbers of GPs were using AI to produce summaries of appointments with patients, to assist in diagnosing patients’ conditions and to handle routine administrative tasks.

In all, 598 (28%) of the 2,108 survey respondents said they were already using AI. More male (33%) than female (25%) GPs had used it, and far more used it in well-off areas than in poorer ones.

It is moving quickly into more widespread use. However, large majorities of GPs, whether they use it or not, worry that practices that adopt it could face “professional liability and medico-legal issues”, “risks of clinical errors” and problems with “patient privacy and data security” as a result, the Nuffield Trust’s report says.

“The government is pinning its hopes on the potential of AI to transform the NHS. But there is a huge chasm between policy ambitions and the current disorganised reality of how AI is being rolled out and used in general practice”, said Dr Becks Fisher, a GP who is the thinktank’s director of research and policy.

“It is very hard for GPs to feel confident about using AI when they’re faced with a wild west of tools which are unregulated at a national level in the NHS”, she added.

While some NHS regional integrated care boards back GPs using AI, others ban it.

In a blow to ministerial hopes, the survey also found that GPs use the time it saves them to recover from the stresses of their busy days rather than to see more patients. “While policymakers hope that this saved time will be used to offer more appointments, GPs reported using it primarily for self-care and rest, including reducing overtime working hours to prevent burnout”, the report adds.

A separate study of how family doctors in the UK are using AI, published last month in the journal Digital Health, reported similar findings. It found that the proportion using AI had risen from 20% to 25% over the previous year.

“In just 12 months, generative AI has gone from taboo to tool in British medicine”, said Dr Charlotte Blease of Uppsala University in Sweden, the lead author of the research.

Like the Nuffield Trust, she highlighted lack of regulation as a key concern, especially given the speed at which GPs are incorporating AI into their clinical practice. “The real risk isn’t that GPs are using AI. It’s that they’re doing it without training or oversight,” Blease said.

“AI is already being used in everyday medicine. The challenge now is to ensure it’s deployed safely, ethically and openly.”

Growing numbers of patients are also using AI to manage their own healthcare, including when they cannot get a GP appointment, Healthwatch England said.

“Our recent research shows that while patients continue to trust the NHS for health information, around one in 10 (9%) are using AI tools for information on staying healthy”, said Chris McCann, the patient watchdog’s deputy chief executive.

“There are various reasons people may turn to AI tools, including when they cannot access GP services. However, the quality of the advice from AI tools is inconsistent. For example, one person received advice from an AI tool that confused shingles with Lyme disease.”

A commission launched by the government in September to examine how to ensure AI is used in a safe, effective and properly regulated way will make recommendations when it reports.

The Department of Health and Social Care was approached for comment.
