In working with the water shader for my game, I was wondering how I’d handle reflections. I’m already pretty much at the per-frame time budget limit, so I’m not sure I can afford to render even a simplified, low-resolution reflection map; not to mention the limitations that approach imposes on having multiple water planes (for a flowing stream, for instance).

A few weeks ago I was glancing at a river while I was driving by, looking at the reflections of the hill behind the other bank. The water was wavy enough that the objects in the reflection were not identifiable. It was just a dark blob.

I had a sudden thought. What if I were able to record the angle of the terrain above the water around each point of water? Then I could use this in my water shader to know the point at which a reflected ray should transition from “sky” to “terrain”.

Spherical harmonics are a well-known technique often used for global illumination. To summarize very briefly: a small set of pre-calculated coefficients is stored per vertex, allowing us to reconstruct the ambient light hitting the object. These coefficients essentially store a map of the amount of light arriving at the point from every direction. Reflected/ambient light is generally very low frequency, which is why such a small set of coefficients can capture so “much” information.

I decided to try a similar technique with my water. Each vertex holds a small set of coefficients that describe the angle the terrain rises above the water in every direction around that point. This can be described with the coefficients of a Fourier series – basically the 2D equivalent of spherical harmonics. These Fourier coefficients interpolate well between vertices, which is good for us.

We calculate the coefficients for each water vertex offline. This involves the following operations for every vertex:

- Choose *k* evenly-spaced sample directions around the point. The value of *k* only affects offline computation, so you can make it as high as necessary to achieve good fidelity. I currently use 13.
- For each sample direction, perform a ray march. Advance one height-map texel at a time, measuring the terrain angle above the water plane. How far you march depends on how far away from the shore you want accurate reflections. In my sample app I currently use 5 texels. If your game involves a lot of views of water from a low level, you’ll need to use more (more on that later).
- Now we have a function (periodic over 2π) representing the terrain angle around the point.
- To obtain the Fourier coefficients representing this function, we integrate the standard expression for each coefficient. The exact formulas can be found online. I used numerical integration with a resolution of 400 (i.e. 400 samples of each function). This number only affects the offline calculation.
- I calculate the first 8 coefficients. This number directly affects the quality and performance of the effect. 8 is certainly good enough for my purposes, and I’ll try to go lower.
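The offline pass might look something like the following Python sketch. The heightmap layout, texel-unit distances, and all names here are my own assumptions for illustration, not actual tool code:

```python
import math

def terrain_coefficients(heightmap, vx, vy, water_height,
                         k=13, march_texels=5, n_coeffs=8, n_samples=400):
    """Precompute the Fourier coefficients describing the terrain angle
    above the water plane around one water vertex (offline step)."""

    def max_angle(direction):
        # Ray march away from the vertex one texel at a time, keeping
        # the steepest angle the terrain makes above the water plane.
        best = 0.0
        dx, dy = math.cos(direction), math.sin(direction)
        for step in range(1, march_texels + 1):
            tx = int(round(vx + dx * step))
            ty = int(round(vy + dy * step))
            if not (0 <= ty < len(heightmap) and 0 <= tx < len(heightmap[0])):
                break
            rise = heightmap[ty][tx] - water_height
            if rise > 0.0:
                best = max(best, math.atan2(rise, step))
        return best

    # Sample the angle function in k evenly spaced directions, then treat
    # it as piecewise constant for the numerical integration below.
    samples = [max_angle(2.0 * math.pi * i / k) for i in range(k)]

    def f(t):
        return samples[int(t / (2.0 * math.pi) * k) % k]

    # Standard Fourier-series formulas, integrated numerically:
    #   a0 = 1/(2π) ∫ f dt,  an = 1/π ∫ f·cos(nt) dt,  bn = 1/π ∫ f·sin(nt) dt
    dt = 2.0 * math.pi / n_samples
    ts = [i * dt for i in range(n_samples)]
    coeffs = [sum(f(t) for t in ts) * dt / (2.0 * math.pi)]  # a0
    n = 1
    while len(coeffs) < n_coeffs:
        coeffs.append(sum(f(t) * math.cos(n * t) for t in ts) * dt / math.pi)
        if len(coeffs) < n_coeffs:
            coeffs.append(sum(f(t) * math.sin(n * t) for t in ts) * dt / math.pi)
        n += 1
    return coeffs  # ordered [a0, a1, b1, a2, b2, a3, b3, a4]
```

Each vertex’s eight values then go straight into the vertex buffer, in the [a0, a1, b1, a2, b2, a3, b3, a4] order the shader reads them.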

I stash my coefficients as 16-bit floats in my vertex structure (thus taking up 16 bytes per vertex).
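Just to make the storage cost concrete, here is a quick illustration using Python’s `struct` module (the actual vertex layout is engine-specific, and the values are made up):

```python
import struct

# Eight Fourier coefficients stored as IEEE 754 half floats ('e' format),
# matching the 16 bytes per vertex quoted above. Example values only.
coefficients = [0.5, 0.12, -0.05, 0.02, 0.0, 0.01, -0.01, 0.0]
packed = struct.pack('<8e', *coefficients)
print(len(packed))  # 16
```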

In the water shader, I use the reflection vector to determine the angle I’m interested in:

```hlsl
float3 reflectionRay = reflect(worldPosition - CameraPosition, normal);
float angle = atan2(-reflectionRay.z, -reflectionRay.x) + PI;
// This gives us an angle between 0 and 2π, which we can then use to
// look up the terrain angle.
```
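The angle convention is easy to sanity-check on the CPU. This small Python sketch (names are mine) confirms that a direction at angle θ in the xz-plane maps back to θ in the 0–2π range:

```python
import math

def reflection_angle(rx, rz):
    # Mirrors the shader expression: atan2 returns (-pi, pi], so
    # adding pi shifts every direction into (0, 2*pi].
    return math.atan2(-rz, -rx) + math.pi

for theta in (0.7, 2.5, 4.0, 5.9):
    assert abs(reflection_angle(math.cos(theta), math.sin(theta)) - theta) < 1e-9
```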

My Fourier evaluation function looks like this (t is the angle):

```hlsl
float EvaluateFourier(float t, float4 coefs1, float4 coefs2)
{
    float4 sins;
    float4 coses;
    sincos(float4(t, 2 * t, 3 * t, 4 * t), sins, coses);
    float value = coefs1.r;       // a0
    value += coefs1.g * coses.r;  // a1
    value += coefs1.b * sins.r;   // b1
    value += coefs1.a * coses.g;  // a2
    value += coefs2.r * sins.g;   // b2
    value += coefs2.g * coses.b;  // a3
    value += coefs2.b * sins.b;   // b3
    value += coefs2.a * coses.a;  // a4
    return value;
}
```
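For debugging, a CPU-side reference of the same evaluation is handy. This is a hypothetical Python port, with the coefficient packing following the comments in the shader:

```python
import math

def evaluate_fourier(t, coeffs):
    # coeffs = [a0, a1, b1, a2, b2, a3, b3, a4], i.e. the shader's
    # coefs1 followed by coefs2.
    a0, a1, b1, a2, b2, a3, b3, a4 = coeffs
    value = a0
    value += a1 * math.cos(t)     + b1 * math.sin(t)
    value += a2 * math.cos(2 * t) + b2 * math.sin(2 * t)
    value += a3 * math.cos(3 * t) + b3 * math.sin(3 * t)
    value += a4 * math.cos(4 * t)  # the series is truncated at a4
    return value
```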

This gives me the terrain angle, which I can then compare to the reflection ray’s angle above the water plane to decide whether to draw sky or reflected terrain. Currently I just use black for reflected terrain, and the effect seems sufficient. If we wanted, we could also store the color of the terrain in addition to the angle; however, this would quadruple the amount of data needed.

## How does it look?

You can look at the photo at the top of this article for an example. Here’s a version of that with the vertex grid drawn in. Each vertex stores 16 bytes of data in my current implementation.

The normal maps used on the water surface help to hide the fact that this reflection is extremely fake. What does it look like without those?

## Performance

This started as an exercise to avoid rendering an expensive reflection map, so it needs to be cheap. Unfortunately, my current implementation takes a lot of shader instructions to evaluate. The atan2 is about 20 instructions, and the HLSL compiler generates 4 scalar sincos instructions which actually take up 8 instruction slots each. In total, the effect adds about 64 instruction slots to the pixel shader.

I haven’t taken performance measurements yet, but this is probably not acceptable. So my next order of business is to find a way to reduce the instruction count. Since I am computing an atan and then taking sines and cosines of the result, I may be able to eliminate some of the work with trigonometric substitutions. Or I might look into evaluating the series as a sum of complex exponentials (powers of e raised to imaginary arguments).
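As a sketch of that last idea (my own illustration, not something measured in the shader): the whole series can be built from a single e^(it), with each higher harmonic obtained by one complex multiply instead of another sin/cos pair.

```python
import cmath
import math

def evaluate_fourier_complex(t, coeffs):
    # Uses the identity a_n*cos(nt) + b_n*sin(nt) = Re((a_n - i*b_n) * e^(i*n*t)).
    # One exponential; the harmonics come from repeated multiplication.
    a0, a1, b1, a2, b2, a3, b3, a4 = coeffs
    z = cmath.exp(1j * t)
    zn = z
    value = a0
    for a_n, b_n in ((a1, b1), (a2, b2), (a3, b3), (a4, 0.0)):
        value += (complex(a_n, -b_n) * zn).real
        zn *= z
    return value
```

Whether this actually saves instruction slots on the GPU is an open question; the point is only that the four sincos evaluations are not strictly necessary.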

Another thing I’ll look into is trying to reduce the number of coefficients.

Additionally, I’ll see if I can store each coefficient in a single byte instead of a 16-bit float.
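One way that could work is a fixed scale/bias per coefficient set, along these lines. The [-π/2, π/2] range is my assumption (terrain angles lie in [0, π/2], so the coefficients should stay near that interval):

```python
import math

# Map each coefficient to a single byte with a fixed scale/bias; the
# shader would apply the inverse after reading the vertex attribute.
LO, HI = -math.pi / 2, math.pi / 2

def quantize(coeffs):
    return bytes(max(0, min(255, round((c - LO) / (HI - LO) * 255)))
                 for c in coeffs)

def dequantize(data):
    return [b / 255.0 * (HI - LO) + LO for b in data]
```

This halves the per-vertex cost to 8 bytes, at the price of a worst-case error of half a quantization step (about 0.006 radians here).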

## Conclusions so far

I think this is promising. The visual effects seem suitable for my purpose, which is generally an overhead view. For games with more varied views this may not be as good a choice (or they will need a better ray marching algorithm for reflections on water that is a long distance from the terrain it is reflecting).

It took a while to implement, since when trying something like this you are never sure whether visual defects are fundamental flaws in the algorithm or just bugs in your code. I spent much of a day thinking it wasn’t going to work out, until I finally fixed all the bugs and got something reasonable.

One problem with this technique, of course, is that it only reflects static objects: terrain, plus whatever else you decide to include in your ray-marching algorithm.

In addition, the reflections are not colored in my implementation, but this should be easy to add without much performance impact in the shader (though it bloats the vertices). A bigger problem is perhaps that the reflections will not be properly lit. But then – how accurate do they need to be?

If and when I get the shader to a more efficient place, I’ll post a follow-up to this article, and maybe a video.
