Real-Time 3D Organ Tracking with Depth-Based Augmented Reality for Minimally Invasive Surgery

Authors

  • David R. Campbell, Department of Earth and Atmospheric Sciences, University of Alberta, Edmonton, AB T6G 2E3, Canada
  • Catherine M. Foster, Department of Earth and Atmospheric Sciences, University of Alberta, Edmonton, AB T6G 2E3, Canada
  • M. Foster, Department of Earth and Atmospheric Sciences, University of Alberta, Edmonton, AB T6G 2E3, Canada
  • Laura J. Bennett, Department of Earth and Atmospheric Sciences, University of Alberta, Edmonton, AB T6G 2E3, Canada
  • Ahmed Khan, Department of Geography, McGill University, Montreal, QC H3A 0B9, Canada
  • Jing Li, Department of Geography, McGill University, Montreal, QC H3A 0B9, Canada

DOI:

https://doi.org/10.71465/mrcis148

Keywords:

organ tracking, Kalman prediction, GNN surface modeling, laparoscopic AR, depth sensing

Abstract

Tracking deformable organs during minimally invasive surgery is challenging due to dynamic tissue motion and occlusion. We propose a depth-based AR tracking system that integrates point cloud alignment with Kalman motion prediction and graph neural network (GNN) surface modeling. The method continuously updates 3D organ meshes, correcting for non-rigid deformations. Tested on 12 laparoscopic liver datasets, our system achieved 0.9 mm RMS tracking error while maintaining 28 fps on RTX 3080 hardware. Compared with optical tracking, accuracy improved by 22% and latency was reduced by 35 ms. Surgeon evaluations confirmed more stable guidance during simulated resections.
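The abstract couples per-frame depth measurements with motion prediction. As a rough illustration of the Kalman prediction stage, the sketch below tracks a single surface-point coordinate with a constant-velocity model in NumPy; the time step matches the reported 28 fps frame interval, and all names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def make_cv_kalman(dt=1 / 28, q=1e-3, r=1e-2):
    """Constant-velocity Kalman model for one coordinate of a tracked
    surface point. State x = [position, velocity]; we observe position."""
    F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition
    H = np.array([[1.0, 0.0]])             # measurement model (position only)
    Q = q * np.eye(2)                      # process noise covariance
    R = np.array([[r]])                    # measurement noise covariance
    return F, H, Q, R

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle given the new depth measurement z."""
    # Predict state and covariance forward one frame.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the measurement.
    y = z - H @ x_pred                     # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Example: track a point moving at a constant 0.5 units/s.
F, H, Q, R = make_cv_kalman()
x, P = np.array([0.0, 0.0]), np.eye(2)
for k in range(1, 201):
    z = np.array([0.5 * k * (1 / 28)])     # synthetic depth measurement
    x, P = kalman_step(x, P, z, F, H, Q, R)
```

In a full pipeline of the kind described, such per-point predictions would seed the point cloud alignment for the next frame, with the GNN correcting residual non-rigid deformation.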


Published

2025-12-01