Information in hybrid user interfaces can be spread over a variety of different, but complementary, displays, with which users interact through a potentially equally varied range of interaction devices. Since the exact configuration of these displays and devices may not be known in advance, it is desirable for users to be able to reconfigure at runtime the data flow between interaction devices and objects on the displays. To make this possible, we present the design and implementation of a prototype mixed reality system that allows users to immersively reconfigure a running hybrid user interface.