In this paper, we present a mobile augmented reality application built around user-generated content acquired by 3D snapshotting. To take a 3D snapshot of an arbitrary object, a point cloud is reconstructed from multiple photographs taken with a mobile phone; from this, a textured polygon model is computed automatically. Other users can view the 3D object in an environment of their choosing by superimposing it on the live video from the phone's camera. Optical square markers anchor the virtual objects in the scene. To extend the viewable range and improve overall tracking performance, a novel approach based on pixel flow recovers the orientation of the phone. This dual tracking approach also enables a new single-button user interface metaphor for moving virtual objects in the scene. The development of the AR viewer was accompanied by user studies, and a further summative study evaluates the result, confirming our chosen approach.
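The abstract does not detail how pixel flow is turned into an orientation estimate. As a rough illustration only, the sketch below shows one common way such a tracker can be built: track sparse features between consecutive camera frames, take the median pixel displacement, and convert it to a small rotation angle using the camera's focal length. The function name estimate_rotation, the use of OpenCV's Lucas-Kanade tracker, and the focal-length value are all illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch of pixel-flow-based orientation recovery.
# Assumes OpenCV and a known focal length in pixels; the system
# described in the paper may work differently.
import math
import cv2
import numpy as np

FOCAL_PX = 800.0  # assumed focal length in pixels (illustrative value)

def estimate_rotation(prev_gray, curr_gray, focal_px=FOCAL_PX):
    """Estimate the yaw/pitch change (radians) between two grayscale frames."""
    # Detect strong corners in the previous frame.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=8)
    if pts is None:
        return 0.0, 0.0
    # Track those corners into the current frame (Lucas-Kanade pixel flow).
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good = status.ravel() == 1
    if not good.any():
        return 0.0, 0.0
    # Median flow is robust to outliers such as independently moving objects.
    flow = (nxt[good] - pts[good]).reshape(-1, 2)
    dx, dy = np.median(flow, axis=0)
    # Small-angle model: a pure rotation shifts pixels by roughly
    # focal_length * angle, so invert that relation.
    yaw = math.atan2(dx, focal_px)
    pitch = math.atan2(dy, focal_px)
    return yaw, pitch
```

Such an estimate could complement marker-based pose tracking when the marker leaves the camera's field of view, which matches the abstract's stated goal of extending the viewable range.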