AI, simulation and innovation: Navigating the future of healthcare education

AI and digital simulation are reshaping healthcare education, offering new opportunities for efficiency, training, and patient care. At the Council of Deans of Health’s Digital Summit 2025, experts explored the promise and challenges of AI integration, from regulatory concerns to the ethical implications of its use in clinical decision-making.
On 28th January 2025, the Council of Deans of Health’s Digital Summit 2025 welcomed 130 attendees from across leadership and academic roles within the healthcare sector to discuss and debate the current and future direction of digital health.
Following a virtual welcome from the Chair of the Science, Innovation and Technology Committee, Chi Onwurah MP, JISC Director of AI, Michael Webb, discussed the rapidly evolving state of AI adoption within education and healthcare, and the need for regulatory and legislative frameworks to keep pace. Webb argued that we are now in the ‘early reality’ stage of AI adoption, and that, despite numerous teething problems, AI tools are now so embedded within many digital services that people often do not realise they are using them.
A key aspect of the government’s focus on AI has been efficiency, with the Department for Education a major early investor. The Department is currently exploring the possibility of coding the entire national curriculum, estimating that this could increase the accuracy of automated marking from 30 per cent to 90 per cent, saving teachers a substantial amount of time that could be better used elsewhere.
However, as AI moves into mature operational use and its influence on human lives and decision-making processes grows, it will be increasingly critical to reach a consensus on its ethical and responsible use, and to ensure that those tasked with using it can do so safely and ethically. Webb called on leaders and regulators to set clear boundaries to enable safe exploration of AI, and to create cultures that value curiosity, critical thinking, and progressive human development.
Embedding digital transformation in the future health workforce
This panel examined the need to embed digital literacy into healthcare education to create a healthcare workforce equipped to use technology effectively and meet the future needs of the NHS. Professor Natasha Phillips, Founder of Future Nurse, argued that the pace of technological innovation has outstripped that of pedagogical practice, often placing digitally native students ahead of educators in terms of digital capability. Professor Phillips called for action from regulators to address this disparity, ensuring that the future workforce is prepared to deliver digitally led healthcare.
“We need to weave digital transformation into everything we do and pay attention to people and processes; technological transformation can’t happen without people.”
Professor Natasha Phillips, Founder, Future Nurse
Stating that we stand “on the cusp of the fourth industrial revolution”, Professor Sultan Mahmud, Director of Healthcare at BT Group, made the case for a cultural shift at leadership levels to truly embed digital tools and methods. He observed that a key driver of innovation within NHS trusts is often the personal attitude and culture of those in leadership positions, which can vary substantially from one person to another, arguing that “board members not knowing anything about health technology can’t be acceptable”.
“The only way is ethics”
Much time was devoted to discussions concerning AI – including the ethical implications of using AI to facilitate and deliver healthcare, alongside its use as an educational tool. Sundeep Watkins, an Education Advisor to the Chartered Society of Physiotherapy, said that AI must be there to supplement and inform, not replace, humans’ clinical and critical judgement. With AI promising to play a critical role in diagnostics, treatment, communication and education, ethical considerations must be at the core of AI’s use and embedded in the way that technology users are taught to ensure that data biases or deficits do not translate to unequal or inequitable care delivery.
“In AI datasets, critical information is often missing – and if you don’t know what’s missing, you don’t know what’s missing.”
David Game, SVP Global Product for Medical Education, Elsevier
Regulatory organisations have started to consider how they might apply the right levels of oversight to this rapidly changing environment, confirmed Jamie Hunt, Head of Education at the Health and Care Professions Council. Paul Stern, a Senior Researcher and Policy Officer at the General Osteopathic Council, reiterated the importance of regulatory oversight of AI to ensure equitable access in education. He added that regulators are now working together with a view to developing a cross-sector regulatory framework for AI’s use in education to reduce regulatory overlap.
AI and associated technologies have the potential to be ubiquitous within simulated medical education and training within the next decade, underscoring the need for effective regulation to render their use safe, effective and equitable. Professor Paula Holt MBE, a Senior Adviser for Nursing at the Nursing and Midwifery Council, explained that for nurses-in-training, 600 of the 2,300 training hours required to register can be completed through simulated training, “allowing students to practice and reflect in a safe, and psychologically safe, environment.” Students like simulated training, added Professor Holt, as they feel it offers an equitable practice environment and can help them learn to deal with difficult, real-world situations, such as facing abuse or racism, or handling a medical emergency.
Professor Sharon Weldon, Professor of Healthcare Simulation and Workforce Development at the University of Greenwich, argued that simulation could be a key tool for attracting a newer generation of healthcare professionals, saying that “fewer and fewer, especially young people, want to go into healthcare. Simulation and AI are their worlds, and we have to embrace it to attract these people.”
“AI is now being incorporated into simulated practice learning – this will change quickly, but the driving fundamentals need to be embedded.”
Professor Sharon Weldon, Professor of Healthcare Simulation and Workforce Development, University of Greenwich
Professor Weldon confirmed that in the US, simulated training has reduced the length of training programmes for private nursing students by up to one-third in some cases – something that could be key for accelerating the workforce pipeline globally. Simulated training is now being mandated across all nursing training in India, but Professor Weldon stressed the need to work collaboratively with industry partners to ensure that these tools truly add value to a medical education.
The final session of the day saw NHS England’s National Chief Nursing Information Officer, Helen Balsdon, join National Chief AHP Information Officer, Prabha Vijayakumar, for an audience Q&A. While both were optimistic that innovation will lead to great strides in predictive analytics, prevention and reducing health inequalities, both cautioned that major progress remains difficult without the fundamental basics of data infrastructure and education in place.
“Good technology is one thing, but too much of implementation focuses on the technology and not on people, and then we wonder why implementation is so poor.”
Helen Balsdon, National Chief Nursing Information Officer, NHS England
“Nurses and midwives collect the most data,” said Balsdon, “but we don’t really harness it. We know we’ve got a shortage of nurses, and we need to work differently to address this – digital can help.”
Critical to this is bringing education and practice closer together – in simple terms, ensuring that new entrants into the workforce are equipped with the confidence and the minimum foundational understanding needed to use technology effectively.
The overriding note from the Digital Health Summit was optimism that AI and associated technologies offer an unprecedented opportunity to transform healthcare delivery and education for all. However, there was evident caution that the pace of technological change has outstripped the ethical, regulatory and legal frameworks that govern their use, and there is a clear need to address this lag. To truly harness the potential of AI in healthcare, and of digital transformation more broadly, collaboration between educators, regulators, and industry leaders must remain a priority – ensuring that technology enhances, rather than hinders, the delivery of safe, ethical, and equitable care.
The Council of Deans of Health have released a Performance Report following the conclusion of the 2025 Digital Summit, which can be viewed here.