Publications are listed here.

Postdoc work at the Living Environment Lab at the Wisconsin Institute for Discovery in Madison

Fire hazards in the operating room

Fire and extinguisher

Fire response training in the operating room.

3D registration of SPIM data

Drosophila embryo

Human-in-the-loop registration of SPIM microscopy volumes.

Simulated Mosquito Exposure

Point cloud of desk

Virtual exposure simulation of mosquito bites using the Oculus Rift and the Leap Motion. A custom controller was built and placed inside a real mosquito-repellent can as a novel input device.

Watch on YouTube »

Home3D -- Virtualized Homes

Point cloud of an apartment

Integrating point cloud representation and rendering into electronic health records.

Demo »

VR Motorcycle Maintenance

YouTube video

A quick prototype to explore training, teaching, and Vive/Unity integration.

Watch Video »

Pointclouds from images

Point cloud of desk

Photogrammetry for point cloud reconstruction from image data sets.

HTML5 pointcloud rendering

Point cloud of Taliesin in a browser

Point cloud rendering of large data sets in a browser.

Details »

The Advanced Visualization Space

A mockup of the tracking space

A large tracking space for immersive VR visualizations and training of future nurses.

Point Cloud Compression

Motion groups in a snow ball

Compressing time-variant point cloud data by finding and describing similar motion groups over consecutive frames.
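The grouping step can be sketched in a few lines (a toy illustration with a hypothetical `group_by_motion` helper, not the actual compression pipeline): points whose frame-to-frame displacement quantizes to the same value form a motion group, which can then be stored as one shared motion vector plus member indices instead of per-point deltas.

```python
def group_by_motion(prev, curr, tol):
    """Group point indices whose frame-to-frame displacement quantizes to
    the same 3D cell of size `tol`. prev/curr: matched lists of (x, y, z)."""
    groups = {}
    for i, (p, c) in enumerate(zip(prev, curr)):
        # Quantize the displacement vector; equal keys => similar motion.
        key = tuple(round((c[k] - p[k]) / tol) for k in range(3))
        groups.setdefault(key, []).append(i)
    return groups
```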

Radial Book Images

A photo of the back of a book showing Faulkner

Printing on the sides of books.

Details »

Visualization of Cluster Logs

A view into the university's cluster

A small project to visualize usage patterns of the university's high-throughput computer systems in an immersive VR environment -- a 6-sided CAVE.

Details »

Immersive Virtual Skiing Experience

A participant skiing in a six-sided CAVE

A participant at the 2014 Wisconsin Science Festival virtually skiing inside our six-sided CAVE.

Point-based Rendering

Point cloud of desk

Point cloud rendering of large data sets.

Details »

PhD studies at the Wearable Computer Lab in Adelaide

Using the Oculus to investigate chronic neck pain

Neck pain study

This was a collaborative effort: Ross Smith and I provided the hardware and programming, together with Dan Harvie and Lorimer Moseley from the Body in Mind lab at UniSA. The idea was to use the Oculus Rift, with its very immersive wide field of view, to investigate where and how pain is learned and memorised (in the muscles or in the brain). To do so, a method similar to redirected walking techniques is employed: the participant's head rotation is multiplied by a constant gain factor and the onset of pain is measured.
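The gain manipulation itself is simple to sketch (hypothetical function names; the actual study code ran against the Oculus SDK): the yaw shown in the HMD is the tracked physical yaw scaled by a constant, so pain onset can be logged against the physical rotation actually performed.

```python
def apply_rotation_gain(physical_yaw_deg, gain):
    """Virtual yaw shown in the HMD: tracked head yaw times a constant gain."""
    return physical_yaw_deg * gain

def physical_yaw_for(virtual_yaw_deg, gain):
    """Physical rotation a participant must perform to reach a virtual yaw."""
    return virtual_yaw_deg / gain
```

With a gain below 1, the participant must physically rotate further than the scene suggests; with a gain above 1, less.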

Rendering Techniques for SAR

Image 13

My PhD thesis dealt with different rendering techniques for Spatial AR (SAR) and what has to change for them to be implemented successfully in SAR.

Adapting Ray Tracing to SAR

Image 14

Another project involved adapting ray tracing to Spatial AR. I used Nvidia's OptiX framework to implement a real-time-capable ray tracer. Again, the user is tracked with an OptiTrack IR tracking system, using a rigid-body marker attached to the frame of a pair of cheap cinema glasses with the lenses removed. Implemented effects included reflections and refractions.
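For reference, the reflection effect boils down to the standard mirror formula r = d - 2(d·n)n, evaluated at every specular hit (a generic sketch, not the OptiX implementation):

```python
def reflect(d, n):
    """Mirror direction r = d - 2 (d . n) n for incoming direction d
    and unit surface normal n (both given as 3-tuples)."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))
```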

View-dependent Rendering in SAR

Image 15

Using head tracking, we can calculate a second, perspective-correct 'virtual' view of the scene and paint it onto the surface of the prop via projective texturing. The end effect is similar to having cut holes into the solid surface. One application of this technique is adding purely virtual details to a physical prop. In the best (and most extreme) case, the physical prop is purely the bounding volume of the virtual object, while everything is displayed using view-dependent rendering techniques.
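The perspective-correct 'virtual' view amounts to projecting each surface point of the prop into a pinhole camera placed at the tracked head position (a simplified sketch with the camera axis-aligned; the real system applies the full head pose and does the projective texturing on the GPU):

```python
def project_to_virtual_view(point, eye, focal):
    """Project a world-space point on the prop's surface into the image
    plane of a pinhole camera at `eye` looking down the -z axis."""
    x, y, z = (p - e for p, e in zip(point, eye))
    # Perspective divide: where the tracked viewer would see this point.
    return (focal * x / -z, focal * y / -z)
```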

Irradiance Volumes in SAR

Radiance Volume in SAR

I tried using irradiance volumes to simulate the lighting conditions and the influence of different projectors on one object in SAR. Had the end result been close enough to reality, it could have been used to compensate for projector brightness and for blending. Unfortunately, it did not work out.

Adaptive Substrates for SAR

Image 16

Projected images lack contrast and sharpness/local resolution. We tried to overcome these problems by creating a hybrid display: projecting onto an eInk display.

Undergrad studies in Koblenz

Photorealistic Rendering in SAR

Relief mapped SAR box

My Diplomarbeit (diploma thesis). I implemented a ton of (at the time) current-gen game-engine features in the custom graphics engine of our Spatial AR framework. Implemented features included HDR rendering, deferred shading, and relief mapping.

Demoscene as an art form

FR-25 The Popular Demo

Demos are great, and this study project aimed to characterise them as more than 'just' music videos built with game-engine tech. A focus of the project was to document and classify the different types and techniques found in demos, and thereby enable a common stylistic language (independent of purely technical terms) for describing and comparing demos.

Using the Nintendo DS as a VR input device


This was way before easy-to-program Android tablets were common (Android had not even been officially announced). The DS had a wide range of interesting input options, including a homebrew accelerometer. During this project the DS controlled an OpenSceneGraph application (with physics) over the Wi-Fi network. All the software was written using the homebrew devkitPro toolchain.

Realtime Hybrid NURBS Ray Tracing


This was a one-semester study project which resulted in an accepted paper about the method we developed. We were working on Augenblick, a realtime raytracer that matured into a commercial application (Numenus' RenderGin). The idea of this paper was that we could rasterize 32-bit pointers to the NURBS surfaces and use the rasterized result as a lookup for further (more expensive) NURBS intersection calculations.
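The lookup idea can be sketched as a two-pass loop (an illustrative toy, not the Augenblick code): a cheap rasterization pass writes a per-pixel surface ID, and the expensive exact NURBS intersection then runs only against the surface each covered pixel points at.

```python
def rasterize_ids(width, height, coarse_cover):
    """First pass: coarse_cover(x, y) returns the ID of the surface whose
    cheap (e.g. bounding-hull) rasterization covers the pixel, or None."""
    return [[coarse_cover(x, y) for x in range(width)] for y in range(height)]

def refine(id_buffer, exact_intersect):
    """Second pass: run the expensive exact intersection only against the
    surface each covered pixel points at."""
    return [[None if sid is None else exact_intersect(sid) for sid in row]
            for row in id_buffer]
```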