In the evolution of virtual reality environments and computer games we observe an ever-growing demand for realistic effects such as splashing water, rising smoke, dancing fire, explosions, or caustics. To be usable in such environments, interactive simulation and rendering techniques have to run on consumer-class PCs.
This thesis aims at exploiting graphics processing units (GPUs) for this purpose. We target the GPU as a numeric coprocessor for two reasons. First, aside from their current utility, GPUs are the first commercially successful examples of a class of future computing architectures that is key to high-performance, cost-effective supercomputing. Driven mainly by computer games, GPUs have evolved into fully programmable, highly parallel vector processors. Second, because the simulation is carried out on the GPU, the results are already where they are needed for display: in local GPU memory. This eliminates CPU-GPU data transfer, which would otherwise be likely to become the bottleneck.
Implementing general numerical computing techniques on the GPU requires an abstraction of the hardware suited to this kind of application. We have therefore developed a linear algebra framework that lets the programmer abstract from the underlying GPU data structures and algorithms and focus on the application itself rather than on the GPU implementation. Based on this framework, we have implemented efficient algorithms for solving large systems of linear equations, and we have used these algorithms to solve partial differential equations such as the wave equation or the Navier-Stokes equations on GPUs.
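To illustrate the kind of operation the framework maps onto the GPU, the following is a minimal CPU reference sketch, not the framework's actual interface: one Jacobi sweep for the two-dimensional Poisson problem that arises, for example, in the pressure projection step of the Navier-Stokes equations. The grid size, array layout, and function name are assumptions for illustration; on the GPU the same data-parallel update is expressed as a fragment program operating on textures.

\begin{verbatim}
#include <cstddef>
#include <vector>

// One Jacobi sweep for the 5-point discretization of the Poisson
// equation (laplace x = rhs) on an n x n grid with cell spacing h.
// Interior cells only; boundary values are left untouched.
void jacobi_sweep(const std::vector<float>& rhs,
                  const std::vector<float>& x_old,
                  std::vector<float>& x_new,
                  std::size_t n, float h)
{
    for (std::size_t j = 1; j + 1 < n; ++j)
        for (std::size_t i = 1; i + 1 < n; ++i) {
            std::size_t idx = j * n + i;
            // Solve the 5-point stencil for the center value.
            x_new[idx] = 0.25f * (x_old[idx - 1] + x_old[idx + 1] +
                                  x_old[idx - n] + x_old[idx + n] -
                                  h * h * rhs[idx]);
        }
}
\end{verbatim}

Because every cell update reads only neighboring values of the previous iterate, the sweep is embarrassingly parallel, which is exactly the property the framework exploits when mapping such solvers onto the GPU.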
The proposed framework facilitates the use of numerical simulation techniques to drive real-time visual effects. By using simulation results to advect geometric primitives on the GPU, saving the displaced positions in graphics memory, and then sending these positions through the GPU again to obtain images in the frame buffer, a variety of effects can be generated. In combination with particle primitives and 3D textures, it is now possible for the first time to simulate and render dynamic 3D effects in real time on consumer-class PCs.
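The following sketch outlines the advection step of this pipeline in plain C++; the names and the nearest-neighbour velocity lookup are assumptions for illustration and do not reproduce the thesis's GPU code. On the GPU, the particle positions live in textures or vertex buffers, the loop body becomes a fragment or vertex program, and the displaced positions remain in graphics memory to be re-used directly as render primitives.

\begin{verbatim}
#include <algorithm>
#include <vector>

struct Vec2 { float x, y; };

// Nearest-neighbour lookup of the grid velocity at normalized position p
// (a real implementation would use bilinear filtering).
Vec2 sample_velocity(const std::vector<Vec2>& vel, int n, Vec2 p)
{
    int i = std::clamp(static_cast<int>(p.x * (n - 1) + 0.5f), 0, n - 1);
    int j = std::clamp(static_cast<int>(p.y * (n - 1) + 0.5f), 0, n - 1);
    return vel[j * n + i];
}

// Forward-Euler advection: move each particle along the locally
// sampled velocity over one time step dt.
void advect_particles(std::vector<Vec2>& particles,
                      const std::vector<Vec2>& vel, int n, float dt)
{
    for (Vec2& p : particles) {
        Vec2 v = sample_velocity(vel, n, p);
        p.x += dt * v.x;  // displaced positions stay in memory and are
        p.y += dt * v.y;  // fed back to the renderer as particle primitives
    }
}
\end{verbatim}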
In addition to physics-based simulation of fluid phenomena, this thesis presents new techniques to simulate optical effects caused by such phenomena, namely caustics, which can appear whenever light impinges upon reflecting or transmitting material. The proposed techniques require neither pre-processing nor an intermediate radiance representation, and they can thus deal efficiently with dynamic scenery as well as scenery that is modified, or even created, on the GPU.