Labor market models suggest that the impact of new technologies will be felt differently across industries and occupations. A 2021 PwC report predicted the risk of jobs being displaced in health and social care due to “artificial intelligence and related technologies” would be lower than that in many other sectors. In fact, when placed against the backdrop of escalating patient demand, the report predicted that health and social care would see the largest net increases in employment of any sector over the next 20 years, with technology largely proving “complementary”.
There are several potential factors behind this more positive outlook, including some rooted in the nature of healthcare work itself.
Technology struggles to replicate attributes or efficiencies in healthcare
First, many tasks in healthcare are difficult to automate because they require traits or competencies that AI and other technologies currently struggle to imitate. For example, recent research by OpenAI, OpenResearch, and the University of Pennsylvania (2023) found that jobs requiring critical thinking skills are less likely to be affected by current large language models. Critical thinking is key in much of healthcare, as staff must weigh the benefits and risks of different possibilities, approaches and solutions. For example, important nuances may be involved in translating a patient's symptoms into diagnoses and treatments. While AI such as large language models can aid critical thinking – for example by supporting doctors' education, training and professional development, or by mining large amounts of research to generate health advice – this is different from actually doing critical thinking. Other key competencies needed in healthcare, such as creativity and negotiation skills, are also difficult to automate.
Social and emotional intelligence are also essential components of high-quality care, enabling staff to empathize, communicate effectively, and meet patients' needs. Analysis by the Office for National Statistics (ONS) in 2019 – which found that medical practitioners were one of the three professions least at risk of automation – indicated that health-related words such as “patient” and “treatment” appeared more frequently in the task descriptions of jobs with a low risk of automation. The ONS suggested that this reflects the dimension of “working with people” and “the added value that humans add in these roles, which is difficult to computerise”. Again, emerging research suggests that AI can support empathic communication – for example, by generating draft responses to patients' questions – but this is different from empathy, which requires the ability to read and understand the feelings of others, and to express and reason about one's own feelings.
Healthcare is viewed as inherently “humanitarian”
The second factor is that in the UK, as in many cultures, healthcare is seen as intrinsically “humanitarian”. Given the great value attached to the personal dimension of care, some activities – such as communicating a diagnosis of a serious illness or comforting a patient – cannot be delegated to machines without undermining the quality and spirit of care. To take another example, while some patients may be happy with AI making clinical decisions in areas such as triage, others may feel that a human listening to and considering their condition is an important component of treating them with respect and compassion. Healthcare is not a product, but a service co-designed by professionals and patients and built on trust. Human relationships are therefore particularly important in areas such as care planning, where the need for true partnership may impose important constraints on the use of automation.
A 2021 Oxford University study examined not only which healthcare activities could be automated, but also what healthcare practitioners thought about the desirability of automating them. Interestingly, many activities were ranked high in terms of potential for automation but low in terms of desirability of automation – typically including those activities that involve a “high level of physical contact” with patients (such as administering anesthesia or examining the mouth and teeth). Many healthcare tasks lie at the intersection of meeting a patient's physical, mental, and social needs, and this is likely to influence attitudes toward automation.
Even when a task can be automated, it does not necessarily follow that it should be. In the study by OpenAI, OpenResearch, and the University of Pennsylvania, the researchers note the difficulty of making predictions about the impact of large language models on activities where “there is currently some organization or rule that requires or suggests human oversight, judgment, or empathy” – a description that characterizes much of healthcare.
Few healthcare roles consist entirely of tasks that can be fully automated
A third reason for the low risk of widespread job displacement in healthcare is that automation applies to tasks rather than roles per se, and few healthcare professions consist entirely of automatable tasks. A Health Foundation-funded study into the potential for automation in primary care found that although a small number of roles (such as prescription clerks) are likely to be severely affected by automation, no single profession can be completely automated.
When only specific tasks can be automated, employees can adapt by focusing on other tasks or expanding their roles. Research by Goldman Sachs (2023) on the exposure of different industries to automation and generative AI predicts that occupational categories such as “healthcare support” and “healthcare practitioners and technicians” will largely be supplemented rather than replaced by AI – precisely because of the mix of tasks involved. Likewise, research by Accenture (2023) indicated that, compared to many other industries, a smaller share of health work has high potential for automation, but a larger share has high potential to be “augmented” by technology.