AI in healthcare: Cranky, creepy or to be welcomed?

What is the future for artificial intelligence (AI) in health and social care? A new report by Nesta, debated at this week’s ‘People Powered Health’ event at the Brewery in London, seeks to come up with some answers.

This feels like a tipping point for AI. It is already in use in the health sector, but the much larger role predicted for it could go one of two ways: it could improve dialogue with the patient, or it could become a barrier. As with all technology, whether it proves a force for good or ill is likely to come down to implementation and governance.

Around a patient there is a whole group of support people. “If we put a computer in there, what is its role?” asked Barbara Barry, design strategist at the Mayo Clinic Centre for Innovation.

Where AI is proving useful at present is in narrow applications, such as analysing data or images and delivering more accurate diagnoses than clinicians can achieve. However, the wider questions about its use will come as it increasingly sits between patients and services. It could be more of a barrier than an open door, the report warns, offering opaque advice and dehumanising healthcare.

“It could be a really inhumane barrier between a person and the care they need,” said report co-author and head of strategy at Nesta’s Health Lab, John Loder. There is no expectation that AI will take the place of doctors, who need a wide range of skills, backed up by trust, to make holistic judgments. “It is far easier for AI to be adopted where there are no or few good alternatives… than in areas where humans are effective and trusted,” the report concludes.

UCL and Nesta’s Lydia Nicholas. Photo credit: Nesta

Possible application areas

One likely role is to provide advice and triage before a patient sees a doctor. Between one-fifth and one-quarter of visits to A&E and to doctors are unnecessary, something the patient doesn’t necessarily realise in advance, said Loder. As such, AI could constitute a “front door”. Indeed, it is already being used (as are search engines, for that matter) to offer healthcare advice and diagnoses, via diagnostic apps or chatbots that let people share symptoms and get advice on whether or not to seek further medical help.

One risk is that it could worsen access for people who lack technical skills or access to the technology. There is also the question of accountability, pointed out co-author Lydia Nicholas, a UCL and Nesta researcher, in cases where “you can’t get hold of the data or the reasoning behind it”. AI could also generate additional demand through false positives or risk-averse advice, and could introduce errors.

Proactive care could see AI identifying signals from data to provide early warnings, such as monitoring the breathing sounds of people with congestive heart failure to spot signs of deterioration. “Or it could generate a great deal of unnecessary concern, replace individualised conversations with standardised analytics, and generate an oppressive degree of monitoring,” the report warns.

The most tentative of the three suggested application areas, Loder admitted, is an automated second opinion, either purely for the doctor’s use or for patients. This could add value by enabling patients to challenge and advocate for their care, but it is high-risk if applied to areas beyond the technology’s reach or built on weak datasets.

One of the stars of the Nesta event: the Brighton-based New Note orchestra for people in recovery from alcohol and substance addiction

Good and bad uses

In the discussion, Geoff Huggins, the Scottish Government’s director of health and social care integration and eHealth, suggested that producing good clinical summaries and communications felt like an appropriate area for lots of processors, big data and iterative learning. However, “we don’t want Uber for the NHS”, and we don’t want it privatised, he said.

AI could bring heightened anxiety but, having googled their conditions, patients already arrive with expectations before they visit their doctor, he said. Babylon Health, which uses AI, is “quite sophisticated”, he pointed out, and in some areas, such as substance misuse, sexual health and mental health, people are proving more willing to engage via chatbot than with a human.

Nicholas cited the AI-based mental health advice app Wysa as an extremely well-designed service that feels human. In fact, it proactively pushes the benefits of AI over human interaction, stating: “Nothing can match the privacy of an anonymous conversation with an AI bot. Think of it as an interactive journal meets life coach. Wysa is good at asking the right probing questions, and helping you untangle and unwind after a hard day.”

Wysa also touts the ability to move at your own pace and to offer help at any time: “Wysa is… your 4 am friend for when you have no one to talk to…” It claims more than 300,000 users in 30 countries and a user rating above 4.5 out of 5, so it appears to be doing something right with AI.

Part of the debate is certain to be the ever more thorny issue of data, with the need for patients to remain in control and not cede it to the mega-corporations. As discussed at a recent event, bad uses of data will hinder those who want to put it to good use, particularly in healthcare: https://smartercommunities.media/hindering-data-for-good-the-erosion-of-trust/

‘Cranky’ and ‘creepy’ were two of the words used during the discussion to describe AI. AI will become ever more prevalent, so it is vital that it is used properly. “Decision-makers have to be equipped to understand the technology,” concluded Nicholas. The health service cannot sit back and wait; it needs to be proactive, and there must be rigorous and transparent public and clinical scrutiny.

The full report can be downloaded here: http://www.nesta.org.uk/publications/confronting-dr-robot
