Real-time raytracing

bit-tech.net | Parallel Worlds | by Brett Thomas

As some of you already know, the idea of real-time raytracing has long been one of my pet peeves with the industry. The concept is simple: rather than trying to approximate every single pixel’s light value through myriad pipelines and shaders, you trace rays of light from the eye back to the light source using one physics calculation. That calculation takes a lot into account depending on what the light hits, but it is still just one calculation, repeated millions upon millions of times per frame.
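
To make that concrete, here is a minimal C++ sketch under my own assumptions (every name is hypothetical, and trace() is an empty stub): the entire renderer reduces to one function called once per pixel, which at 1080p already means roughly two million calls per frame.

```cpp
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

// Placeholder for the single physics calculation: a real version
// would fire a ray from the eye through pixel (px, py), intersect
// it with the scene, and shade whatever it hits.
static Vec3 trace(int /*px*/, int /*py*/) {
    return Vec3{}; // stub: black until there is a scene behind it
}

int main() {
    const int width = 1920, height = 1080;
    std::vector<Vec3> framebuffer(width * height);

    // The "one calculation repeated millions of times per frame":
    // ~2 million independent trace() calls at 1080p, which is also
    // why the workload parallelises so naturally.
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            framebuffer[y * width + x] = trace(x, y);
}
```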

Rather than using single-sided meshes for models, where only the outside counts as visible surface (this is where clipping errors come from), raytracing deals in volumes. Each time a ray of light hits a new volume, a new segment (dubbed a rayseg) is created to describe how light reacts within (or on) that particular material. Since light is now allowed to pass through transparent objects and is properly reflected off solid surfaces, all the light in a room traces back to its sources. It sounds easy enough! If only the implementation were as simple…
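
To show the recursion the rayseg idea implies, here is a loose C++ sketch, with Hit, Material, intersect(), reflect() and refract() as hypothetical stand-ins (the stubs below always miss, so the file compiles on its own): every hit on a new volume can spawn a reflected child segment, a refracted one, or both, and real tracers cap the recursion with a depth budget.

```cpp
struct Vec3 { float x = 0, y = 0, z = 0; };

// Linear blend used to fold a child segment's colour into the result.
static Vec3 mix(const Vec3& a, const Vec3& b, float t) {
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

struct Material {
    float reflectivity = 0.f;  // 0..1: how mirror-like the surface is
    float transparency = 0.f;  // 0..1: how much light passes through
    Vec3  surface_colour;      // colour from direct shading
};

struct Hit {
    bool     found = false;
    Vec3     point, normal;
    Material material;
};

// Stub scene query: a real one would return the nearest volume
// boundary along the ray. This one always misses.
static Hit intersect(const Vec3&, const Vec3&) { return {}; }

// Stub direction helpers; real versions compute mirror reflection
// and Snell's-law refraction from the surface normal.
static Vec3 reflect(const Vec3& dir, const Vec3&) { return dir; }
static Vec3 refract(const Vec3& dir, const Vec3&) { return dir; }

static Vec3 trace(const Vec3& origin, const Vec3& dir, int depth) {
    if (depth <= 0) return {};            // segment budget exhausted
    Hit hit = intersect(origin, dir);
    if (!hit.found) return {};            // ray escaped the scene

    Vec3 colour = hit.material.surface_colour;

    // A new segment reflected off a solid surface...
    if (hit.material.reflectivity > 0.f)
        colour = mix(colour,
                     trace(hit.point, reflect(dir, hit.normal), depth - 1),
                     hit.material.reflectivity);

    // ...and another passing through a transparent one.
    if (hit.material.transparency > 0.f)
        colour = mix(colour,
                     trace(hit.point, refract(dir, hit.normal), depth - 1),
                     hit.material.transparency);

    return colour;
}
```

Even this toy version hints at the problem: segment counts grow with every bounce, which is part of why the implementation is nowhere near as simple as the concept.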