
Marker-less AR in the Hybrid Operating Room

Fernando Barrera
ICube, University of Strasbourg, CNRS, IHU Strasbourg, France

Research Group CAMMA
Computational Analysis and Modeling of Medical Activities
AR in the OR

Medical practitioners are confronted with more and more data and information during a procedure:
- Imaging data, patient information, surgical instrument tracking

AR permits in-situ visualization of key information related to the performance of the procedure.

Jolesz et al., Image-guided procedures and the OR of the future, RSNA 1996
Linte et al., On mixed reality environments for minimally invasive therapy guidance: Systems architecture, successes and challenges in their implementation from laboratory to clinic, Comput Med Imaging Graph 2013
AR in the OR

Towards context-aware radiation protection for the hybrid operating room

Loy Rodas and Padoy, 3D Global Estimation and Augmented Reality Visualization of Intra-operative X-ray Dose, MICCAI 2014
Mobile AR in the OR

Clinical application: display 3D information related to radiation safety directly in the clinician's view.

In-situ feedback about an upcoming X-ray image:
- X-ray device parameters
- Patient radiation exposure
- 3D distribution of radiation, pre-computed as in [1]

Challenges:
- Robustness to motion in the scene and occlusions
- Impossibility of having multiple markers visible from all interesting viewpoints
- Modeling of the scene beyond the patient is required

[1] Loy Rodas and Padoy, 3D Global Estimation and Augmented Reality Visualization of Intra-operative X-ray Dose, MICCAI 2014
Related work

AR relies on an accurate registration of the user's point of view and tracking of the viewing device, currently achieved by:
- Tracking markers [1]: markers can be intrusive and interfere with the procedure at hand
- Registering a mesh to a model [2]: changes in the positions of equipment or clinicians can invalidate any a priori models

[1] Navab et al., First Deployments of Augmented Reality in Operating Rooms, Computer 2012
[2] Kilgus et al., Mobile markerless augmented reality and its applications in forensic medicine, IJCARS 2015
Method

Multi-view RGBD camera system:
- 2 RGBD cameras rigidly mounted to the ceiling and registered to a room-global coordinate system
- 1 RGBD camera fixed to a hand-held screen
Method

Equipment detection for tracking initialization / camera relocalization:
- Simultaneous equipment detection in at least 2 views
- Natural markers for obtaining an initial registration of the mobile camera to the room

[Figures: detection on a static camera; detection on the moving camera]
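The detected equipment acts as a "natural marker": when the same device is seen both by the calibrated static cameras and by the hand-held camera, chaining the two 6-DoF poses expresses the moving camera in the room's global coordinate system. A minimal sketch with NumPy; the function names are illustrative, not the authors' API:

```python
import numpy as np

def inverse_pose(T):
    """Invert a rigid 4x4 transform [R t; 0 1] using R^T, without a general inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def register_mobile_to_room(T_room_equipment, T_mobile_equipment):
    """Pose of the mobile camera in the room frame, from a shared equipment detection.

    T_room_equipment:   equipment pose seen by the room-registered static cameras
    T_mobile_equipment: equipment pose seen by the hand-held RGBD camera
    """
    return T_room_equipment @ inverse_pose(T_mobile_equipment)
```

The same composition serves both for the initial registration and for relocalization after tracking loss.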


Method

Frame-to-frame moving camera tracking with a KinectFusion-like* approach.

New equipment detections allow fast relocalization when tracking is lost.

[Figures: TSDF volume; moving device trajectory]

* Newcombe et al., KinectFusion: Real-time dense surface mapping and tracking, ISMAR 2011
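KinectFusion-style tracking maintains a truncated signed distance field (TSDF) fused from the incoming depth frames. A simplified sketch of the per-frame volume update (nearest-pixel association, no projective ICP; all names illustrative and not the authors' implementation):

```python
import numpy as np

def update_tsdf(tsdf, weights, voxel_centers, depth, K, T_cam, trunc=0.05):
    """Fuse one depth frame into a truncated signed distance field.

    tsdf, weights : flat arrays, one value per voxel
    voxel_centers : (N, 3) voxel centers in the room frame
    depth         : (H, W) depth map in meters (0 = invalid)
    K, T_cam      : camera intrinsics and room-to-camera extrinsics
    """
    H, W = depth.shape
    # Transform voxel centers into the camera frame and project with K.
    pts = (T_cam[:3, :3] @ voxel_centers.T + T_cam[:3, 3:4]).T
    z = pts[:, 2]
    uv = (K @ pts.T).T
    zsafe = np.where(z > 0, z, 1.0)  # avoid dividing by non-positive depth
    u = np.round(uv[:, 0] / zsafe).astype(int)
    v = np.round(uv[:, 1] / zsafe).astype(int)
    ok = (z > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    d = np.zeros_like(z)
    d[ok] = depth[v[ok], u[ok]]
    ok &= d > 0
    # Signed distance along the viewing ray, truncated to [-trunc, trunc].
    sdf = np.clip(d - z, -trunc, trunc)
    upd = ok & (sdf > -trunc)  # skip voxels far behind the observed surface
    # Weighted running average, as in KinectFusion.
    w = weights[upd]
    tsdf[upd] = (tsdf[upd] * w + sdf[upd]) / (w + 1)
    weights[upd] = w + 1
```

Camera tracking then aligns each new frame against the surface implied by this volume; relocalization resets the pose from a fresh equipment detection.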
Equipment detection

Multimodal template matching approach based on LINEMOD*.

Challenges:
1. Identify the ongoing X-ray projection via equipment detection
2. Equipment and moving camera in constant motion
3. Equipment not necessarily placed over a planar surface
4. Large equipment size: not fully visible and subject to occlusions

* Hinterstoisser et al., Model Based Training, Detection and Pose Estimation of Texture-Less 3D Objects in Heavily Cluttered Scenes, ACCV 2012
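As a rough illustration of the LINEMOD-style score: each template feature contributes the best agreement between its orientation (of a color gradient or surface normal) and the image orientations in a small neighborhood, which gives robustness to slight misalignment. This is a heavily simplified sketch; the real method quantizes orientations into bins and precomputes response maps for speed:

```python
import numpy as np

def linemod_similarity(template, image_ori, offset, radius=2):
    """Simplified LINEMOD-style score at a given image offset.

    template : list of ((row, col), orientation) features, orientations in radians
    image_ori: (H, W) array of gradient/normal orientations, NaN = no feature
    """
    H, W = image_ori.shape
    score = 0.0
    for (r, c), ori in template:
        r0, c0 = r + offset[0], c + offset[1]
        best = 0.0
        # Take the best cosine match over a small spatial neighborhood.
        for dr in range(-radius, radius + 1):
            for dc in range(-radius, radius + 1):
                rr, cc = r0 + dr, c0 + dc
                if 0 <= rr < H and 0 <= cc < W and not np.isnan(image_ori[rr, cc]):
                    best = max(best, abs(np.cos(ori - image_ori[rr, cc])))
        score += best
    return score / len(template)
```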
Equipment detection

Template generation:
- Generation of virtual RGBD images to cover each possible camera viewpoint
- Camera viewpoints recursively sampled around the model
- Extraction of color gradients and surface normals from each RGB-depth pair
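Recursive viewpoint sampling around a model is commonly done by icosahedron subdivision: each triangle is split by its edge midpoints and the new vertices are projected back onto the unit sphere, giving an approximately uniform set of virtual-camera directions. A sketch under that assumption (the authors' exact sampling scheme may differ):

```python
import numpy as np

def sample_viewpoints(levels=2):
    """Unit directions toward a model at the origin, by recursive icosphere subdivision."""
    p = (1 + 5 ** 0.5) / 2  # golden ratio
    verts = [(-1, p, 0), (1, p, 0), (-1, -p, 0), (1, -p, 0),
             (0, -1, p), (0, 1, p), (0, -1, -p), (0, 1, -p),
             (p, 0, -1), (p, 0, 1), (-p, 0, -1), (-p, 0, 1)]
    faces = [(0, 11, 5), (0, 5, 1), (0, 1, 7), (0, 7, 10), (0, 10, 11),
             (1, 5, 9), (5, 11, 4), (11, 10, 2), (10, 7, 6), (7, 1, 8),
             (3, 9, 4), (3, 4, 2), (3, 2, 6), (3, 6, 8), (3, 8, 9),
             (4, 9, 5), (2, 4, 11), (6, 2, 10), (8, 6, 7), (9, 8, 1)]
    verts = [np.array(v) / np.linalg.norm(v) for v in verts]
    for _ in range(levels):
        midpoint = {}
        def mid(i, j):
            # Create (or reuse) the midpoint vertex of edge (i, j), on the sphere.
            key = (min(i, j), max(i, j))
            if key not in midpoint:
                m = verts[i] + verts[j]
                verts.append(m / np.linalg.norm(m))
                midpoint[key] = len(verts) - 1
            return midpoint[key]
        new_faces = []
        for a, b, c in faces:
            ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
            new_faces += [(a, ab, ca), (b, bc, ab), (c, ca, bc), (ab, bc, ca)]
        faces = new_faces
    return np.array(verts)
```

One virtual RGBD rendering of the equipment model per direction (times the in-plane rotations and scales) then yields the template database.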
Equipment detection

Needed for the clinical application:
- Detection of the current X-ray device configuration
- 3D pose estimation of the moving device

Not possible using a single camera: ambiguities arise when both the equipment and the camera move. Equipment detection is therefore also needed in the static cameras.

Equipment detection

Needed for the clinical application:
- Detection of the current X-ray device configuration
- 3D pose estimation of the moving device

The template generation procedure is repeated for a set of 3D orientations, yielding a larger database of templates.


Equipment detection

A priori information used to filter improbable views.

Template database reduced by 70% by keeping only the relevant templates.

[Figures: possible moving camera viewpoints; moving devices' potential areas of motion for our clinical application]
Equipment detection

Part-based detection: equipment detected even when it is not fully visible.


Equipment detection

Dynamic template subset selection:
- Detection of the device's configuration in the static cameras
- Search space reduced for the moving camera: faster tracking initialization

[Figure: static camera view]
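Conceptually, the dynamic subset selection is a filter over the template database: the device configuration detected in the static cameras, optionally intersected with the a-priori set of plausible moving-camera viewpoints, restricts which templates the moving camera must test. A minimal sketch; the field names below are illustrative, not the authors' data structures:

```python
def select_templates(database, detected_config, allowed_viewpoints=None):
    """Keep only templates matching the configuration detected in the static
    cameras, optionally intersected with the a-priori plausible viewpoints.

    database: list of dicts with (illustrative) keys 'config', 'viewpoint', 'template'
    """
    subset = [t for t in database if t['config'] == detected_config]
    if allowed_viewpoints is not None:
        subset = [t for t in subset if t['viewpoint'] in allowed_viewpoints]
    return subset
```

Shrinking the tested subset this way is what makes tracking initialization fast enough for the mobile view.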
Qualitative results
Conclusions

Mobile markerless AR system for radiation awareness:
- Multi-RGBD camera system for equipment detection in the OR
- Non-intrusive approach: natural markers for camera localization
- Dynamic selection of template subsets + detection of parts reduces the number of templates tested in the moving view
- New application for mobile AR in the OR

Limitations:
- Radiation maps not computed in real-time & patient not tracked
- Tracking accuracy affected by errors from:
  - Multi-camera system calibration
  - Quality of the depth map from the RGBD cameras (Asus Xtion Pro)
  - Quality of the equipment detection
Perspectives

Fully context-aware radiation awareness system with full 3D OR understanding.

Body-part dose estimation*.

* Kadkhodamohammadi et al., Pictorial Structures on RGB-D Images for Human Pose Estimation in the Operating Room, MICCAI 2015
Thank you!

Acknowledgments
Research Group CAMMA, ICube, University of Strasbourg
http://camma.u-strasbg.fr

Preliminary quantitative evaluation

Ground-truth moving camera pose obtained with an optical tracking system (errors reported as mean ± std):

X-ray device orientation      0°             90°            180°           270°          Mean ± std
ICP error (mm)            210.9 ± 81.0   222.8 ± 79.8   162.7 ± 51.8   234.9 ± 91.7   207.8 ± 76.0
Rx error (°)                5.1 ± 2.5      3.6 ± 2.7      3.9 ± 1.4      5.3 ± 3.3      4.4 ± 2.4
Ry error (°)                5.1 ± 2.2     10.1 ± 2.0      4.4 ± 1.2      6.7 ± 2.8      6.5 ± 2.0
Rz error (°)                3.4 ± 1.4      9.2 ± 0.5      5.3 ± 1.3      4.2 ± 1.0      5.6 ± 2.5
Mean R error (°)            4.5 ± 1.8      7.6 ± 1.6      4.5 ± 1.0      5.4 ± 2.2      5.5 ± 1.6
T error (mm)               88.2 ± 51.5   103.8 ± 59.2   119.2 ± 25.1   97.8 ± 42.8   100.2 ± 44.6
Moving camera tracking error

Radiation propagation simulation framework

1. C-arm configuration and room layout modeling
2. Irradiation parameters obtained from the X-ray device: tube tension, filtration, air kerma, field of view
3. Simulation of the X-ray energy spectra
4. Simulation of radiation propagation (Geant4):
   - X-ray tube: source of photons with energies sampled from the simulated spectrum
   - Modeling of Compton scattering, Rayleigh scattering and the photoelectric effect
5. Calibration with test dosimeters: a correction factor yields intra-operative X-ray dose values

Loy Rodas and Padoy, 3D Global Estimation and Augmented Reality Visualization of Intra-operative X-ray Dose, MICCAI 2014
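Sampling photon energies from the simulated spectrum amounts to inverse-transform sampling over the discretized spectrum bins. A minimal sketch of that one step (the actual particle transport and sampling are handled inside Geant4):

```python
import numpy as np

def sample_photon_energies(energies, spectrum, n, rng=None):
    """Draw n photon energies from a discretized X-ray spectrum.

    energies : bin energies (e.g. in keV)
    spectrum : relative fluence per bin (any non-negative weights)
    """
    rng = np.random.default_rng() if rng is None else rng
    energies = np.asarray(energies, dtype=float)
    # Normalize the cumulative distribution, then invert it with searchsorted.
    cdf = np.cumsum(spectrum, dtype=float)
    cdf /= cdf[-1]
    idx = np.searchsorted(cdf, rng.random(n), side='right')
    return energies[idx]
```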
Radiation propagation simulation framework

Simulation of the X-ray dose in the whole room enables estimating the dose deposited over the patient's surface.

Using a scanned model of a phantom:
- We identify the voxel each surface point intersects
- We color each point according to the dose in the intersected voxel

AR visualization of the patient's surface dose for different device angulations:

[Figure: X-ray device projections at 0°, 90°, 180° and 225°]
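The per-point voxel lookup described above can be sketched as follows (illustrative names; the dose grid would come from the propagation simulation, and the returned values would then be mapped through a colormap for the AR overlay):

```python
import numpy as np

def surface_dose(points, dose_grid, origin, voxel_size):
    """Dose value for each surface point: find the voxel each point falls in
    and read the simulated dose stored there.

    points     : (N, 3) surface points in the room frame
    dose_grid  : (X, Y, Z) per-voxel dose from the propagation simulation
    origin     : corner of voxel [0, 0, 0] in the room frame
    voxel_size : edge length of a voxel
    """
    idx = np.floor((points - origin) / voxel_size).astype(int)
    idx = np.clip(idx, 0, np.array(dose_grid.shape) - 1)  # stay inside the grid
    return dose_grid[idx[:, 0], idx[:, 1], idx[:, 2]]
```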