Sterility requirements and the interventional workflow in the Operating Room (OR) often make it challenging for surgeons to interact with computerised systems. As a solution, we propose a customizable gesture-based interface for the OR. Our solution consists of a gesture recognition technique and an integration platform. Our recognition technique simultaneously detects the type of a gesture (categorical information) and the state a user holds within the current gesture (spatio-temporal information). By introducing several feature extraction methods, this technique performs independently of human body proportions and Kinect placement. To exploit this gesturing technique in the OR, we additionally introduce an extensible software platform that allows context-aware gesturing interfaces to be defined for several intra-operative devices. The behavior of the gesturing interface can be customized using a visual editor.