Seeing Beyond the Scalpel: How Extended Reality Is Transforming Surgery
In surgery, understanding anatomy in three dimensions is everything. For years, I’ve worked at the intersection of medicine and technology to improve how surgeons perceive the human body — not just as images on a screen, but as spatial, living structures we can intuitively explore.
Through extended reality (XR) — encompassing virtual, augmented, and mixed reality — we are now entering an era where surgeons can truly see inside the body before making an incision. This is not science fiction; it’s the next logical evolution in surgical visualization.

The Problem: Two Dimensions in a Three-Dimensional World
Traditional surgical workflows rely on CT or MRI data displayed on 2D screens. Surgeons mentally reconstruct these slices into 3D shapes — a process that demands experience and imagination. But even the best-trained mind can struggle to grasp complex spatial relationships during a live procedure.
Flat displays separate the surgeon’s visual perception from the patient’s real anatomy, forcing constant mental translation. This disconnect can make navigation more difficult, especially during minimally invasive procedures or in complex anatomical regions such as the liver, pancreas, or pelvis.
The Solution: Building the “Digital Twin” of the Patient
To solve this, I began developing workflows on OsiriX that convert CT and MRI data into accurate, polygon-based 3D models. These models — the patient’s digital twin — can then be visualized in XR headsets or shared through Holoeyes MD, the platform we built at Holoeyes Inc.
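To give a concrete sense of what that conversion involves, here is a minimal sketch of the general pipeline, approximated in Python with SimpleITK, scikit-image, and trimesh rather than OsiriX itself: load a CT series, isolate a structure, extract a polygon surface, and export it for an XR viewer. The file path, threshold, and library choices are illustrative assumptions, not the actual OsiriX or Holoeyes pipeline.

```python
# Minimal sketch: CT series -> threshold segmentation -> polygon surface -> STL.
# Assumes SimpleITK, scikit-image, and trimesh are installed; the directory
# path and Hounsfield threshold are illustrative.
import SimpleITK as sitk
from skimage import measure
import trimesh

# Load a DICOM series into a single 3D volume.
reader = sitk.ImageSeriesReader()
reader.SetFileNames(reader.GetGDCMSeriesFileNames("path/to/ct_series"))
image = reader.Execute()

volume = sitk.GetArrayFromImage(image)   # voxel array in (z, y, x) order
spacing = image.GetSpacing()[::-1]       # reorder spacing to match (z, y, x)

# Extract a polygon surface at a fixed Hounsfield-unit threshold; real cases
# usually need proper segmentation rather than a single global threshold.
verts, faces, _, _ = measure.marching_cubes(volume, level=200, spacing=spacing)

# Export the mesh so it can be loaded into an XR viewer.
mesh = trimesh.Trimesh(vertices=verts, faces=faces)
mesh.export("patient_model.stl")
```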
This technology enables surgeons to:
- View holographic 3D anatomy overlaid on the real patient during surgery.
- Collaborate remotely through a shared virtual operating space.
- Rotate, scale, and “walk through” patient anatomy before entering the OR.
- Train residents and students with immersive, patient-specific simulations.
Instead of imagining structures in three dimensions, we now experience them directly — from every angle and depth.
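For readers curious what rotating, scaling, and walking through a model means in geometric terms, the sketch below loads the hypothetical mesh exported above and applies, with trimesh, the same kind of rigid transforms an XR viewer applies interactively. The file name, rotation angle, and scale factor are illustrative.

```python
# Minimal sketch: load the exported model and apply the kind of rigid
# transforms an XR viewer applies while the user inspects the anatomy.
# File name, angle, and scale factor are illustrative.
import numpy as np
import trimesh

mesh = trimesh.load("patient_model.stl")

# Rotate 45 degrees about the vertical axis, around the model's centroid.
rotation = trimesh.transformations.rotation_matrix(
    angle=np.radians(45.0), direction=[0, 0, 1], point=mesh.centroid
)
mesh.apply_transform(rotation)

# Scale up 2x, as when "walking into" a structure for close inspection.
mesh.apply_scale(2.0)

# Preview on a flat screen; a headset renders the same geometry in the room.
mesh.show()
```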
In Practice: Seeing Through the Skin
During a liver resection, for example, XR visualization allows me to see where vessels and tumors are in real space. I can align the holographic liver model with the patient’s body, confirming surgical margins and vessel branches before cutting.
This method helps reduce uncertainty, improves precision, and strengthens teamwork. Everyone in the operating room — assistants, nurses, anesthesiologists — shares the same 3D understanding of the anatomy.
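Commercial systems handle this alignment internally, but the underlying idea of matching the virtual model to the patient can be illustrated with classic rigid, landmark-based registration (the Kabsch method). The sketch below is a generic illustration, not the Holoeyes MD implementation, and the landmark coordinates are invented for the example.

```python
# Minimal sketch of rigid, landmark-based registration (Kabsch method):
# given corresponding landmarks on the 3D model and on the patient, find the
# rotation R and translation t that map model coordinates onto the patient.
# Generic illustration only; the landmark values below are invented.
import numpy as np

def rigid_align(model_pts, patient_pts):
    """Return (R, t) such that R @ p + t best matches the patient landmarks."""
    P = np.asarray(model_pts, dtype=float)
    Q = np.asarray(patient_pts, dtype=float)

    # Center both landmark sets on their centroids.
    p_center, q_center = P.mean(axis=0), Q.mean(axis=0)
    A, B = P - p_center, Q - q_center

    # Optimal rotation from the SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(A.T @ B)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against an accidental reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T

    t = q_center - R @ p_center
    return R, t

# Landmarks in millimetres, e.g. points picked on the virtual liver surface
# and the same points touched on the patient with a tracked pointer.
model_landmarks = [[0, 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30]]
patient_landmarks = [[10, 5, 2], [60, 5, 2], [10, 45, 2], [10, 5, 32]]

R, t = rigid_align(model_landmarks, patient_landmarks)
print("rotation:\n", R)
print("translation:", t)   # ~[10, 5, 2] for this pure-translation example
```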
Moreover, our Holoeyes MD system allows multi-user sessions, meaning surgeons across different hospitals can meet “inside” the same patient’s model, planning and rehearsing complex surgeries collaboratively.
The Results: Clearer Vision, Safer Hands
We’ve consistently observed improvements when using XR-assisted planning and navigation:
- Improved spatial understanding of anatomy before surgery
- Faster intraoperative decision-making and better orientation
- Enhanced communication within the surgical team
- More intuitive training for residents and medical students
In a field where every millimeter counts, these enhancements translate directly into better patient outcomes and safer procedures.
The Future: The Spatial Era of Surgery
I believe we are entering the spatial computing era of surgery — where data is no longer confined to screens, but lives around us, in the same space as the patient. The surgeon of the near future will operate with holographic overlays, 3D patient models, and remote collaborative environments as standard tools of the trade.
The goal is not to replace the surgeon’s skill, but to augment human perception — to extend our eyes, our minds, and our reach.
Conclusion
As both a surgeon and technologist, I’ve witnessed firsthand how immersive visualization transforms not only operations but also understanding. My journey into surgical extended reality began with OsiriX — the software that made advanced medical image visualization accessible, interactive, and precise.
By using OsiriX to convert CT and MRI data into detailed 3D anatomical reconstructions, I was able to prototype and validate workflows that ultimately evolved into the XR systems we use today at Holoeyes. OsiriX provided the flexible, powerful foundation that allowed me to experiment, learn, and push the boundaries of how surgeons interact with imaging data.
For that, I owe deep gratitude — thank you, OsiriX.
The boundary between the real and the virtual is now dissolving, and the result is a clearer, safer, and more collaborative surgical world. What once existed only in imagination can now be seen — in three dimensions, in real time, inside the human body itself.