My “work in progress” game takes place mostly outdoors and features a significant number of rivers and streams. One effect I definitely wanted was to show the water flowing realistically.
Several months ago I was working on this, but didn’t get anywhere useful. My implementation basically consisted of a triangle strip used to render the stream water, with the water textures scrolling along the strip. It worked fine for streams that were mostly straight, but the coarse triangulation I was using caused significant distortion, and it was much worse in motion. A more granular triangulation would have helped somewhat, but there were additional complexities, such as handling merging and splitting streams.
I tried to research how other games implemented this, but I must have been looking in the wrong places, since it wasn’t until a few days ago that I found relevant articles while investigating something else.
The fundamental problem in modelling stream flow is that it isn’t possible to scroll the texture coordinates by different amounts (accounting for differing flow speeds) and still retain texture integrity. In my old system, I was planning to address this by having some sort of blended overlapping tiles where each tile scrolled at a different speed.
Then I came across a couple of papers by Valve and Naughty Dog. The results looked really nice. They were using flow maps and fading in and out between different textures, which allowed for a reasonable amount of texture stretching/shrinking. This was all done per pixel too, which is very nice – this lets you have unique flow direction and flow speed for every pixel (interpolated from a coarser flow map of course).
Water Flow in Portal 2 (this also talks about rendering “debris”).
How it’s done
The basic technique is to scroll your texture in a different direction and at a different speed for each pixel. This of course distorts the texture over time, so you fade it out as the distortion starts becoming noticeable, and fade in another texture that is just beginning to be distorted. You ping pong between them, and the end result is fairly striking given that you’re basically just distorting two normal maps. You can even make swirling whirlpools and such.
Since this is done per pixel, we don’t really need to worry about triangulating anything, and we can just read from a world flow map that specifies the direction and strength at each point.
The above diagram shows the blend amounts for the two textures (shown in red and blue) as we progress over time. As the phase of each texture goes from 0 to 1, we offset the coordinates from which we sample by increasing amounts in the direction of the flow vector. Some HLSL:
// Too much stretching looks bad.
flowVector *= MaxNormalDistortion;

// We base our texture coordinate on where we are in the world.
float2 texCoord = worldPosition.xz * NormalTextureScale;

float cycleOffset = noiseSample;
float phase0 = (NormalPhase01.x + cycleOffset * PULSE_REDUCTION_MAX_EFFECT) % 1;
float phase1 = (NormalPhase01.y + cycleOffset * PULSE_REDUCTION_MAX_EFFECT) % 1;

// Sample normal maps. Normals are already in (-1,1) format.
float3 normal0 = tex2D(NormalSampler0, texCoord + NormalOffsets0011.xy + flowVector * phase0);
float3 normal1 = tex2D(NormalSampler1, texCoord + NormalOffsets0011.zw + flowVector * phase1);

// Blend the two. This essentially creates a triangle wave that determines the blending.
float normal0BlendAmount = 2 * abs(phase0 - 0.5);
float3 final = normalize(lerp(normal0, normal1, normal0BlendAmount));
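To make the cycle concrete, here is a small CPU-side sketch of the phase and blend math, written in Python purely for illustration. The names `flow_vector` and `max_distortion` are stand-ins for the shader’s flowVector and MaxNormalDistortion; none of this is the actual demo code.

```python
def phases(time):
    """Two phases half a cycle apart, each wrapping in [0, 1)."""
    phase0 = time % 1.0
    phase1 = (time + 0.5) % 1.0
    return phase0, phase1

def blend_weight(phase0):
    """Triangle wave: 0 when phase0 = 0.5 (texture 0 fully visible),
    1 when phase0 is near 0 or 1 (texture 1 fully visible)."""
    return 2.0 * abs(phase0 - 0.5)

def sample_offsets(time, flow_vector, max_distortion=0.1):
    """UV offsets for the two texture fetches at a given time."""
    fx, fy = (c * max_distortion for c in flow_vector)
    p0, p1 = phases(time)
    offset0 = (fx * p0, fy * p0)
    offset1 = (fx * p1, fy * p1)
    return offset0, offset1, blend_weight(p0)
```

Each texture is always fully opaque at the midpoint of its own phase (when its distortion is “average”) and fully faded out at the moment its distortion resets, which is what hides the snap back to undistorted coordinates.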
I always find it much more productive to work in a small test app rather than my actual game. Because of quick build and start-up times, it’s much quicker to iterate and experiment with various parameters. Once I have something I think is good, I’ll take the time to incorporate it into my actual game (which is going to be a bit of a challenge given I use deferred rendering).
So I have provided this test app here for you to try out. The water rendering isn’t that great (no refraction, no geometry-based wave action, and only a hacky sky reflection). The focus is on experimenting with stream flow and adjusting the parameters.
The flow map
The flow map is simply a texture whose R and G components are filled with values that represent the flow vector at a point.
In my test app, I let you paint the 3d world with “flow”. The flow can also be visualized with flow lines:
You can see it is a fairly straightforward thing to make the edges of a river flow more slowly than the center (as long as you have sufficient flow map resolution).
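Since the flow map is just a texture, the flow vector has to be packed into unsigned channel values somehow. Here is a sketch of the common [-1, 1] → [0, 255] mapping in Python; this particular mapping is an assumption on my part, as the demo may scale flow strength differently.

```python
def encode_flow(vx, vy):
    """Pack a signed 2D flow vector into 8-bit R and G channel values,
    mapping each component from [-1, 1] to [0, 255]."""
    to_byte = lambda v: int(round((v * 0.5 + 0.5) * 255))
    return to_byte(vx), to_byte(vy)

def decode_flow(r, g):
    """Inverse mapping, as the pixel shader would apply after sampling
    the flow map (zero flow decodes from the mid-gray value 128)."""
    to_float = lambda b: (b / 255.0) * 2.0 - 1.0
    return to_float(r), to_float(g)
```

With this encoding, a neutral gray flow map means “no flow”, and painting brighter or darker values in R/G pushes the water along +X/-X and +Y/-Y respectively.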
Random offset each cycle
To give some visual improvement and avoid the same waves appearing in the same spot each cycle, we can shift the textures by a random amount before the next cycle starts – at the instant when the texture is invisible, so it isn’t noticeable. The offsets are NormalOffsets0011 below:
// Sample normal maps. Normals are already in (-1,1) format.
float3 normal0 = tex2D(NormalSampler0, texCoord + NormalOffsets0011.xy + flowVector * phase0);
float3 normal1 = tex2D(NormalSampler1, texCoord + NormalOffsets0011.zw + flowVector * phase1);
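The CPU-side bookkeeping for this amounts to re-rolling an offset whenever a texture’s phase wraps. A minimal Python sketch, with illustrative names (this is not the demo’s actual update loop):

```python
import random

class LayerOffsets:
    """Tracks the random UV offset for one of the two flowing textures."""

    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.offset = (0.0, 0.0)
        self.prev_phase = 0.0

    def update(self, phase):
        """Re-randomize the offset at the instant phase wraps from ~1
        back to ~0, i.e. while the texture is fully faded out."""
        if phase < self.prev_phase:  # wrapped this frame
            self.offset = (self.rng.random(), self.rng.random())
        self.prev_phase = phase
        return self.offset
```

Because the offset only changes on the frame where the blend weight makes that texture invisible, the jump in texture coordinates is never seen.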
The pulsing problem
The other articles on this technique refer to a “pulsing” problem and how to fix it. They don’t really describe why this happens, but it is pretty straightforward. When you’re blending two detailed textures, the end result tends to be kind of muddy. This makes sense since everything is averaged. You can see this clearly in the normal maps below:
Since the water flow technique involves a time period when all of one texture is visible, then a blend between the two, then all of the other, you’ll see alternating high detail and lower detail. This can be pretty distracting, and it can be mitigated somewhat by blending based on a noise texture.
The HLSL looks like:
float cycleOffset = noiseSample;
float phase0 = (NormalPhase01.x + cycleOffset * PULSE_REDUCTION_MAX_EFFECT) % 1;
float phase1 = (NormalPhase01.y + cycleOffset * PULSE_REDUCTION_MAX_EFFECT) % 1;
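A quick way to see why this helps is to model the per-pixel blend weights in Python. Each pixel’s phase is offset by its noise sample, so at the moment the un-noised blend would be a muddy 50/50 everywhere, noised pixels are instead spread across different blend states. The noise values here are stand-ins for samples from a noise texture.

```python
PULSE_REDUCTION_MAX_EFFECT = 0.5  # default mentioned in the text

def blend_weight(phase):
    """Triangle wave over the phase, as in the shader."""
    return 2.0 * abs(phase - 0.5)

def per_pixel_weights(base_phase, noise_samples):
    """Blend weight for each pixel, given its noise sample in [0, 1)."""
    return [
        blend_weight((base_phase + n * PULSE_REDUCTION_MAX_EFFECT) % 1.0)
        for n in noise_samples
    ]

# base_phase = 0.5 is the fully-muddy moment without noise:
weights = per_pixel_weights(0.5, [0.0, 0.25, 0.5, 0.75])
```

Without noise every pixel would have weight 0.0 at that instant; with noise the weights fan out across the cycle, so the whole-screen detail loss never happens all at once.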
There are a few gotchas that come with this.
One problem is that we can no longer randomly offset a texture at the end of its cycle (while it’s invisible), because now a texture won’t be completely invisible at the end of its cycle (so you would see a noticeable jump). The noise blending causes parts of the texture to “leak” beyond its cycle. In my opinion, mitigating the pulsing effect improves visual quality much more than the random offset, so I find this an easy trade-off.
Another problem is that you can’t use the entire spectrum (from black to white) of the noise map to blend (e.g. having some parts be all one texture and some all the other). Doing so would subject some parts of the texture to much greater distortion than others, and it would look like different pieces of the water were moving at different speeds. So, we need to moderate this a bit. Of course, that also means we don’t eliminate the pulsing as much, so there is a value to tweak here. In my shader, I default it to 0.5 (so we use half the dynamic range of the noise map, essentially). This is the PULSE_REDUCTION_MAX_EFFECT you see in the shader snippet above. The pulse can be seen in the video below:
There are probably still some improvements that could be made here, but I think it looks acceptable. I may still play around with ways to hide the texture distortion and/or pulsing a little more, who knows. Perhaps, for normal maps, we can strengthen them when they are at the halfway point of the blend to make them seem less muddy.
Another thing to note is that this technique isn’t very compatible with directional waves; that is, normals that have a distinct orientation. Perhaps there is some way we could work in a texture rotation based on the flow direction that would enable this. Something more to think about.
Another note is that we can’t move the water at arbitrarily fast speeds with this technique. We are limited by the size of the normal map texture we use, and how much distortion we are willing to live with.
I think this technique could let us render some pretty realistic lava flows too, though I haven’t tried substituting in the right textures/lighting to try this out.
The demo app (a Visual Studio 2010 solution using XNA) is here: