Dynamic and deformable models for surgical navigation

This research explores the potential of Extended Reality (XR) to transform surgical guidance, planning and training through the development of dynamic and deformable models. By providing real-time, patient-specific visualisations that adapt to physiological and anatomical changes, these models aim to overcome current limitations in surgical imaging and navigation. Because they respond to changing conditions within the human body, they offer an adaptive, immersive experience that enhances spatial awareness and supports complex surgical tasks.
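
The text does not specify how such deformation is computed, so the following is only a hedged illustration: a minimal Python/NumPy sketch of a mass-spring-style update, in which the vertices of a patient-specific surface mesh are nudged each frame toward tracked anatomical landmark positions. The function name, parameters and constraint scheme are assumptions for illustration, not the project's actual method.

```python
# Hypothetical sketch: per-frame update of a deformable surface mesh.
# Assumes vertices (N x 3), springs as vertex-index pairs with rest lengths,
# and a small set of tracked landmark constraints; not the project's actual pipeline.
import numpy as np

def step_deformable_mesh(vertices, springs, rest_lengths,
                         landmark_idx, landmark_pos,
                         stiffness=0.5, constraint_weight=0.8):
    """Advance the mesh one frame.

    vertices:      (N, 3) current vertex positions
    springs:       (M, 2) index pairs connecting vertices
    rest_lengths:  (M,)   rest length of each spring
    landmark_idx:  indices of vertices tied to tracked anatomical landmarks
    landmark_pos:  (K, 3) measured landmark positions for this frame
    """
    v = vertices.copy()

    # Spring corrections pull connected vertices back toward their rest lengths,
    # loosely approximating tissue resistance to stretching and compression.
    i, j = springs[:, 0], springs[:, 1]
    d = v[j] - v[i]
    length = np.linalg.norm(d, axis=1, keepdims=True) + 1e-9
    correction = stiffness * (length - rest_lengths[:, None]) * (d / length)
    np.add.at(v, i, 0.5 * correction)
    np.add.at(v, j, -0.5 * correction)

    # Soft constraints drag landmark vertices toward their measured positions,
    # so the model follows intra-operative physiological change in real time.
    v[landmark_idx] += constraint_weight * (landmark_pos - v[landmark_idx])
    return v
```
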
This exploration of XR technology extends its applications across multiple stages of surgery. In surgical planning, the models support a detailed understanding of patient-specific anatomy, enabling more precise and tailored preparation. During surgery, augmented reality overlays provide interactive, real-time visualisation directly in the surgical field, enhancing intra-operative navigation and decision-making.
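
Drawing such an overlay in the surgical field presupposes that the pre-operative model is registered to the live scene. The details are not given here; as an assumed illustration only, the Python/NumPy sketch below estimates a rigid transform from paired fiducial points using the standard Kabsch (orthogonal Procrustes) solution and applies it to the model vertices before rendering. The function names and the fiducial-based approach are assumptions, not the project's described workflow.

```python
# Hypothetical sketch: rigid registration of a pre-operative model to
# intra-operative fiducials, so an AR overlay can be drawn in the surgical field.
# Standard Kabsch / Procrustes solution; assumed, not taken from this project.
import numpy as np

def estimate_rigid_transform(model_pts, world_pts):
    """Least-squares rotation R and translation t such that world ~= R @ model + t."""
    mc, wc = model_pts.mean(axis=0), world_pts.mean(axis=0)
    H = (model_pts - mc).T @ (world_pts - wc)                       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])     # avoid reflections
    R = Vt.T @ D @ U.T
    t = wc - R @ mc
    return R, t

def overlay_vertices(model_vertices, R, t):
    """Map model vertices into the tracked intra-operative coordinate frame."""
    return model_vertices @ R.T + t
```
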

In addition, the realistic nature of these dynamic models positions them as valuable tools for surgical training. By simulating live anatomical responses, they offer an immersive, interactive environment in which surgeons can practise techniques and gain familiarity with complex procedures. This research highlights the potential of XR to promote safer, more effective and more accessible surgical practice by providing an innovative platform for planning, guidance and hands-on training.