Computerized medical systems play a vital role in the operating room; however, sterility requirements and the interventional workflow often make interaction with these devices challenging for surgeons. Typical solutions, such as delegating physical control of keyboard and mouse to assistants, add an undesirable level of indirection. We present a touch-less, gesture-based interaction system for the operating room that lets surgeons define a personalized set of gestures for controlling arbitrary computerized medical systems. Instead of using cameras to capture gestures, we rely on a few wireless inertial sensors placed on the surgeon's arms, eliminating any dependence on illumination and line of sight. A discriminative gesture recognition approach based on kernel regression allows us to simultaneously classify performed gestures and track the relative spatial pose within each gesture, giving surgeons fine-grained control over continuous parameters. An extensible software architecture enables a dynamic association of learned gestures with arbitrary intra-operative computerized systems. Our experiments illustrate the performance of our approach and support its practical applicability.
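To illustrate the classify-and-track idea behind the recognition approach, the following is a minimal sketch of Nadaraya-Watson kernel regression in Python. The Gaussian kernel, the 2-D toy feature space, the bandwidth, and the per-sample phase labels are illustrative assumptions, not the formulation used in the paper; the sketch merely shows how a single kernel estimator can return both a discrete gesture class and a continuous within-gesture phase.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_weights(x, X, bandwidth):
    """Gaussian kernel weights of query x against all training samples X."""
    d2 = np.sum((X - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def classify_and_track(x, X_train, gesture_labels, phase_labels, bandwidth=0.5):
    """Nadaraya-Watson estimate: pick the gesture class by a weighted vote,
    then regress the within-gesture phase from neighbors of that class."""
    w = gaussian_weights(x, X_train, bandwidth)
    w /= w.sum() + 1e-12

    # Weighted vote over the discrete gesture labels.
    n_classes = gesture_labels.max() + 1
    votes = np.zeros(n_classes)
    np.add.at(votes, gesture_labels, w)
    gesture = int(votes.argmax())

    # Weighted average of phase labels, restricted to the winning class.
    mask = gesture_labels == gesture
    phase = float(w[mask] @ phase_labels[mask] / (w[mask].sum() + 1e-12))
    return gesture, phase

# Toy demo: two synthetic "gestures" traced in a 2-D feature space,
# each annotated with a phase value in [0, 1].
phases = np.linspace(0.0, 1.0, 50)
g0 = np.stack([phases, np.zeros_like(phases)], axis=1)   # gesture 0 trajectory
g1 = np.stack([np.zeros_like(phases), phases], axis=1)   # gesture 1 trajectory
X_train = np.concatenate([g0, g1]) + 0.02 * rng.standard_normal((100, 2))
gesture_labels = np.concatenate([np.zeros(50, int), np.ones(50, int)])
phase_labels = np.concatenate([phases, phases])

print(classify_and_track(np.array([0.7, 0.02]), X_train,
                         gesture_labels, phase_labels, bandwidth=0.1))
# -> (0, ~0.7): gesture 0, roughly 70% of the way through its motion
```

Restricting the phase regression to the winning class keeps the two outputs consistent: the continuous estimate is interpolated only from training samples of the gesture that was actually recognized.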