One of the big things that’s been letting me down is my background. Until now, I’d been using a low-res, low dynamic range cube that I rendered out of Terragen a few years ago. To make it not look like a painted backdrop, I was running it through some fairly suspect code to push the brightness up, which was doing things like making any true white parts of the image infinitely bright, not to mention really drawing attention to the fact that I’d been fool enough to save it as a jpeg.
Vizpeople mainly sells cutout people for architectural visualisations, but also offers some really good panoramic skies without too many near objects, so one quick panorama-support shader and some tweaking of the sun colour make things a lot nicer.
But that’s not all. Since I now have a scene that has all its colours in a fairly acceptable range, I can start actually using the HDR values for a wider variety of things (for those who don’t know: HDR images contain colours that run outside the normal black->white range – they’re mapped down to a displayable range for viewing, but their high values mean you can, for instance, distinguish between the brightness of the sun and the brightness of a piece of paper in sunlight).
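To make the sun-versus-paper distinction concrete, here’s a minimal sketch of mapping HDR values down to a displayable range. I’ve used the simple Reinhard curve x/(1+x) as a stand-in tone mapper; the curve choice and the sample intensities are my illustration, not anything from the engine:

```python
def reinhard(x):
    """Map an HDR intensity (0..infinity) into the displayable 0..1 range."""
    return x / (1.0 + x)

paper_in_sun = 1.0     # hypothetical HDR value for sunlit paper
sun_disc     = 5000.0  # hypothetical HDR value for the sun itself

# An 8-bit image clamps both to "white"; in HDR they stay distinguishable,
# and only collapse towards 1.0 once the tone mapper runs for display.
assert reinhard(sun_disc) > reinhard(paper_in_sun)
```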
The first thing I’m doing with them is fixing up the ambient lighting. In the standard Phong equation, you calculate the diffuse and specular contributions from all the lights in the scene (in this case just the sun) and then add a constant “ambient” term to stop the shadows being pure black. This is supposed to represent all the other light reflecting and scattering through the scene, but the problem with it is obvious: most of the bounced light hitting upward-facing surfaces should be blue from the sky, and most of the light hitting downward-facing surfaces should be green.
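For reference, the standard Phong equation with its constant ambient term looks something like this. This is a minimal sketch with a single light, scalar colours instead of RGB, and names of my own choosing, not the engine’s actual shader:

```python
def phong(normal, light_dir, view_dir, ambient, diffuse_col, spec_col, shininess):
    """Standard Phong: constant ambient + (N.L) diffuse + (R.V)^n specular.
    All direction arguments are unit-length 3-tuples; colours are scalars
    for brevity. `ambient` is the flat term the post is complaining about:
    it's the same regardless of which way the surface faces."""
    n_dot_l = sum(n * l for n, l in zip(normal, light_dir))
    if n_dot_l <= 0.0:
        return ambient  # surface faces away from the light: ambient only
    # Reflect the light direction about the normal: R = 2(N.L)N - L
    r = tuple(2.0 * n_dot_l * n - l for n, l in zip(normal, light_dir))
    r_dot_v = max(0.0, sum(rc * vc for rc, vc in zip(r, view_dir)))
    return ambient + diffuse_col * n_dot_l + spec_col * (r_dot_v ** shininess)
```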
There are a lot of competing techniques for handling global illumination effects, traditionally by pre-calculating light flow around a static scene, or more recently by doing things like adding virtual point lights into the scene each frame, as popularised by AMD’s Leo demo (YouTube video). Since I’m stuck at the “basic functionality” stage though, I’m doing something a bit simpler.
First, I’ve added a button to make the engine, when the next frame is rendered, also grab the six faces of an HDR cube map and spit them out in a cross layout.
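Packing the six faces into a cross is just tile placement. Here’s a minimal sketch; the face names and their positions follow one common horizontal-cross convention, which is my assumption rather than necessarily what the engine writes:

```python
def cross_layout(faces, size):
    """Place six size x size cube faces into a 4-wide, 3-tall cross image.
    `faces` maps axis names ('+x','-x','+y','-y','+z','-z') to 2-D lists
    of pixels (scalars here for brevity). Layout:
            [+y]
      [-x]  [+z]  [+x]  [-z]
            [-y]
    """
    # (column, row) of each face's top-left tile within the cross
    slots = {'+y': (1, 0), '-x': (0, 1), '+z': (1, 1),
             '+x': (2, 1), '-z': (3, 1), '-y': (1, 2)}
    out = [[0 for _ in range(4 * size)] for _ in range(3 * size)]
    for name, (cx, cy) in slots.items():
        for y in range(size):
            for x in range(size):
                out[cy * size + y][cx * size + x] = faces[name][y][x]
    return out
```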
This cube could be used to render reflections if I had some glossy materials that needed it, but for lighting what I need is the sum of all incoming light for a given surface normal. Unsurprisingly, someone’s made a tool for that – ATi (now AMD) CubeMapGen can perform a 180° cosine filter on an input cube, and can also conveniently save out to a .dds cubemap. I’ve used a 32×32 cube but could probably go smaller.
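What that cosine filter precomputes per output texel can be sketched directly: a cosine-weighted sum of all incoming light over the hemisphere around a given normal. This version uses a hand-rolled list of direction samples and scalar colours standing in for the cube faces, so it’s an illustration of the integral rather than CubeMapGen’s actual implementation:

```python
def diffuse_irradiance(normal, samples):
    """Cosine-weighted average of incoming light for one surface normal.
    `samples` is a list of (direction, colour) pairs covering the sphere;
    directions are unit 3-tuples, colours scalars for brevity. Light from
    behind the surface (cosine <= 0) contributes nothing."""
    total = 0.0
    weight = 0.0
    for direction, colour in samples:
        cos_term = sum(n * d for n, d in zip(normal, direction))
        if cos_term > 0.0:  # only the front hemisphere counts
            total += colour * cos_term
            weight += cos_term
    return total / weight if weight else 0.0
```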
There are a couple of problems with this technique though. Firstly, it adds an extra texture sample instruction to the shader and eats a few kB of memory which, while unlikely to break the bank, only gets worse if you later want to start storing lots of samples over an environment and blending between them as you move around. Luckily, since this image is so low-frequency, you can store it as a handful of weight values instead and see very little difference. One popular approach is to use a set of Spherical Harmonic (pdf) weights, though I decided to go for the cheaper option of using Ambient Cube (pdf) lighting.

Ambient Cube essentially treats all environmental light as a set of six directional lights oriented along the cardinal world axes. The upside is that it’s very cheap to calculate: for each of the X, Y and Z components of the world-space surface normal, you use its sign to choose between two colours and multiply by the square of its value. The downside is that it loses some detail around the corners and edges of the cube, and it produces subtly different values if the scene is rotated to a different angle before capture (luckily that doesn’t happen).
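That per-axis evaluation is tiny in code. A sketch in Python rather than shader code, with scalar colours standing in for RGB:

```python
def ambient_cube(normal, cube):
    """Evaluate Ambient Cube lighting for a world-space unit normal.
    `cube` holds six face colours keyed '+x','-x','+y','-y','+z','-z'.
    Per axis: the sign of the normal component picks one of the two face
    colours, and the squared component is the blend weight. For a unit
    normal the three squares sum to 1, so this is a proper weighted average."""
    result = 0.0
    for component, pos, neg in zip(normal, ('+x', '+y', '+z'),
                                           ('-x', '-y', '-z')):
        face = cube[pos] if component >= 0.0 else cube[neg]
        result += face * component * component
    return result
```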
In this scene, it’s actually quite hard to distinguish between the three effects, so here they are in sequence – note the colour change on the underside of the sphere and the bluish cast on the mountains, but also how the six-directional ambient makes the sphere look subtly wrong.
Here they are again with the original cube capture replaced with a bunch of bright, saturated colours:
The second problem with this kind of technique is a little harder to handle. As mentioned above, when I blur the cube map I’m multiplying in the assumed surface response for light at each incoming angle, in this case the cosine of the angle. This only holds up as long as I keep using materials that respond in that way. To handle materials that use a different lighting function entirely (e.g. rough surfaces) I would need a parallel version of any of these methods, calculated using that function instead.
For anything that’s even slightly shiny you actually want a value that changes depending on your view angle, which ruins basically all of this. Unfortunately, everything is shiny (even things like cloth that are usually used as an example of what’s not shiny). The solutions to this are all over the place:
- Try not to worry about it: An old favourite which also applies to the problem with different lighting functions, and almost anything else in rendering.
- Some deeply complex stuff from Tri-Ace (pptx, but here’s a YouTube video of them doing that and a pile of other pretty things) lets all materials get specular reflections from all the light in the room. The price they pay seems to be that they can’t then support lots of samples around the environment, but I’m certainly intending to have a go at implementing it once my D3D 11 port starts to work.
- Calculate a rim-lighting brightness factor (because that’s where it’s most obvious) and multiply up the brightness of the ambient that you have. It’s technically very wrong since you’re getting more of the light from the side of the object, rather than extra light grazing off it from behind, but it’s often enough to fool the eye.
- That thing with the virtual point lights, potentially, if you can afford to do specular calculations for all of them.
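The rim-lighting cheat (option 3 above) can be sketched like this; the power and strength values are eyeballed tuning numbers of my own, not anything from the engine:

```python
def rim_boosted_ambient(ambient, normal, view_dir, power=3.0, strength=1.5):
    """Boost the ambient term near silhouette edges. The rim factor
    (1 - N.V)^power is largest where the surface grazes the view direction,
    which is where the missing view-dependent response is most obvious.
    `normal` and `view_dir` are unit 3-tuples; `ambient` a scalar colour."""
    n_dot_v = max(0.0, sum(n * v for n, v in zip(normal, view_dir)))
    rim = (1.0 - n_dot_v) ** power
    return ambient * (1.0 + strength * rim)
```

As the post says, this brightens light coming from the side of the object rather than adding grazing light from behind it, so it’s a fake, but a cheap and often convincing one.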
For now, unfortunately, it looks like I’m stuck with number 1.