GPs turn to artificial intelligence to help with patient workload

This is the fifth in a six-part series exploring how artificial intelligence is transforming medical research and treatment.
Difficulty getting an appointment with a GP is a common problem in the UK.
Even when an appointment is secured, doctors face rising workloads, meaning these sessions may be shorter than the doctor or patient would like.
But Dr Deepali Misra-Sharp, a GP partner in Birmingham, has found that AI has taken much of her administrative work off her shoulders, meaning she can focus more on her patients.
Dr Misra-Sharp started using Heidi Health, a free AI-assisted medical transcription tool that listens to and transcribes patient appointments, about four months ago and says it has already made a big difference.
“Often when I'm with a patient, I write things down, which can impact the effectiveness of the consultation,” she said. “This means I can now spend all my time looking at the patient and actively listening. This results in a higher quality consultation.”
She says the technology has reduced her workload, saving “two to three minutes or more per consultation.” She cites other benefits: “It reduces the risk of errors and omissions in my medical records.”
GPs are under intense pressure as their workforce shrinks and patient numbers continue to grow.
According to the British Medical Association (BMA), one full-time GP now looks after an average of 2,273 patients, an increase of 17% since September 2015.
Could artificial intelligence be the solution to help GPs reduce administrative tasks and alleviate burnout?
Some studies suggest it can. A 2019 report from Health Education England estimated that new technologies such as artificial intelligence could save at least one minute per patient, equivalent to 5.7 million hours of GP time.
Meanwhile, University of Oxford research in 2020 found that 44% of administrative tasks in general practice can now be mostly or entirely automated, freeing up time to spend with patients.

One company working on this is Denmark's Corti, which has developed artificial intelligence that can listen to medical consultations over the phone or in person and suggest follow-up questions, prompts and treatment options, as well as automatically take notes.
Corti said its technology handles around 150,000 patient interactions every day in hospitals, GP surgeries and medical facilities across Europe and the US, totaling around 100 million contacts per year.
“The idea is that doctors can spend more time treating patients,” said Lars Maaløe, co-founder and chief technology officer of Corti. He said the technology can ask questions based on conversations it has previously heard in other healthcare situations.
“The AI can access related conversations, and it might notice that in 10,000 similar conversations most doctors asked a particular question that hasn't been asked here,” Mr Maaløe said.
“I imagine GPs are doing consultation after consultation, so there's little time to consult with colleagues. It's about giving them that colleague's advice.”
He also said it can look at historical patient data. “For example, it could ask, did you remember to ask the patient if they still had pain in their right knee?”
But do patients want technology that can listen in and record their conversations?
Mr Maaløe said “the data does not leave the system”. However, he did say it is good practice to inform patients.
“If the patient objects, the doctor won't record it. We rarely see that happen, because patients can see the benefit of better documentation.”
Dr Misra-Sharp said she lets patients know she is using a listening tool to help her take notes. “I haven't met anyone who has a problem with it, but if they did, I wouldn't do it.”

Meanwhile, C the Signs is currently being used by 1,400 GP surgeries across England, with the platform using artificial intelligence to analyze patients' medical records and check for different signs, symptoms and risk factors of cancer, and recommend what actions should be taken.
“It can capture symptoms like coughs, colds, bloating and so on, and basically see within a minute if there's anything relevant in their medical history,” said Dr Bea Bakshi, CEO and co-founder of C the Signs and a practising GP.
The AI is trained on published medical research papers.
“For example, it might say the patient is at risk of pancreatic cancer and would benefit from a pancreatic scan, and the doctor would then decide whether to make that referral,” Dr Bakshi said. “It doesn't diagnose, but it facilitates.”
She said they have conducted more than 400,000 cancer risk assessments in real-world settings, detecting more than 30,000 cancer patients across more than 50 different cancer types.
A BMA report on artificial intelligence published this year found that “artificial intelligence should transform, not replace, healthcare work by automating routine tasks and increasing efficiency”.
Dr Katie Bramall-Stainer, chair of the BMA's UK general practice committee, said in a statement: “We recognize that artificial intelligence has the potential to revolutionize NHS care – but if it is not implemented safely, it could also cause considerable harm. AI is subject to biases and errors, can potentially compromise patient privacy, and is still very much a work in progress.
“While AI can be used to enhance and complement what GPs can offer as another tool in their arsenal, it is not a magic bullet. We cannot wait for the promise of AI tomorrow to deliver the much-needed improvements in productivity, consistency and safety that are needed today.”

Alison Dennis, partner and co-head of the international life sciences team at law firm Taylor Wessing, warned that GPs need to exercise caution when using artificial intelligence.
Ms Dennis said: “The risk that generative AI tools will not provide a full, complete or correct diagnosis or treatment pathway, or will even give a wrong diagnosis or treatment pathway, that is, hallucinate or base their output on clinically incorrect training data, is very high.”
“AI tools that have been trained on reliable data sets and then fully validated for clinical use – which will almost certainly be for a specific clinical use – are better suited to clinical practice.”
Professional medical products must be regulated and receive some form of official certification, she said.
“The NHS would also want to ensure that all data entered into the tool remains securely within the NHS systems infrastructure, and is not absorbed by the tool's provider for use as training data without appropriate GDPR (General Data Protection Regulation) safeguards in place.”
For now, for GPs like Dr Misra-Sharp, it has transformed their work. “It has allowed me to enjoy my consultations again, rather than feeling pressured by time.”