
Cleaning up the HDR pipeline

In this post I’ll go through the steps I’ve taken to clean up the HDR pipeline in my engine.

 

sRGB vs linear

I had never really given this much thought until now. I simply sampled my textures, performed lighting calculations on the samples (giving me an HDR value), and then tone-mapped (mapping the HDR value to something representable on the screen).

The problem is:

  • All the lighting calculations assume everything is in linear space (e.g. 0.5 is twice as bright as 0.25).
  • The source textures are almost always in sRGB.
  • The monitor assumes sRGB.

If we’re just displaying the textures directly on the screen, everything is fine – both the monitor and the source texture are in the same color space. But if we’re performing lighting calculations in between, we’ll be doing that math on non-linear values, and the results will be wrong.

Consider the following base texture:

[Image: Example_1_OrigTexture]

The RGB values for the left and right sides are 168 and 231, respectively. The right-hand side is significantly brighter than the left.

Now let’s say we light the texture with a light of intensity 0.3. Multiplying the RGB values by 0.3, we get:

[Image: Example_2_Lit03_inSRGB]

Uh-oh. The right side is still brighter, but not by much (the values are 50 and 69, respectively).

But now, let’s convert things into linear space first, then light them (by multiplying by 0.3), and then convert back into sRGB:

[Image: Example_3_Lit03_inLinear]

The relative brightness of the two sides of the texture is much better preserved.

Here’s an example of a scene without gamma correction, and the same scene where gamma curves were taken into account and lighting performed in linear color space:

 

[Image: GammaExample1]

 

Overall, performing the lighting in a linear color space leads to a more realistic result.

sRGB corresponds (approximately) to a gamma value of 2.2: a value sampled from an sRGB texture is raised to the power of 2.2 to convert it to linear color space, and the result in our light accumulation buffer is raised to the power of (1 / 2.2) to convert back to sRGB for the screen. (Strictly speaking, sRGB is a piecewise curve with a small linear segment near black, but the 2.2 power curve is a close approximation.)
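To make the earlier example concrete (my arithmetic, using the pure 2.2 power approximation): 168/255 = 0.66 and 0.66^2.2 ≈ 0.40 in linear, while 231/255 = 0.91 and 0.91^2.2 ≈ 0.80 – so in linear terms, the right side really is about twice as bright as the left. Lighting in linear and converting back gives 0.40 × 0.3 = 0.12, and 0.12^(1/2.2) × 255 ≈ 97; likewise 0.80 × 0.3 = 0.24, and 0.24^(1/2.2) × 255 ≈ 133. The 2:1 brightness ratio survives the lighting. It also shows why the sRGB-space version came out so dark: multiplying an sRGB value by 0.3 scales the actual light output by 0.3^2.2 ≈ 0.07, so our 0.3-intensity light behaved more like a 0.07-intensity light.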

DirectX (and presumably OpenGL) has sampler states that let you mark a texture as sRGB (so it’s converted to linear when you sample from it), and a render state that converts linear values back to sRGB as they’re written (in Direct3D 9 these are D3DSAMP_SRGBTEXTURE and D3DRS_SRGBWRITEENABLE). I’m using XNA (built on DirectX 9) however, and this isn’t exposed there (perhaps MonoGame may support it in the future?). So for now, I have to put this functionality in the pixel shaders themselves.
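For reference, here’s a minimal sketch of what the shader-side helpers can look like (using the pure 2.2 power approximation rather than the exact piecewise sRGB curve):

    // Convert an sRGB sample to linear space (2.2 power approximation).
    float3 SRGBToLinear(float3 srgb)
    {
        return pow(srgb, 2.2);
    }

    // Convert a linear value back to sRGB for display.
    float3 LinearToSRGB(float3 linearColor)
    {
        return pow(linearColor, 1.0 / 2.2);
    }

Every texture sample that feeds the lighting math goes through SRGBToLinear, and LinearToSRGB is the very last step before a value hits the back buffer.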

There is an additional consideration to make. I’m using a deferred renderer. When should I do the sRGB to linear conversion? When I sample from the original source texture and write to the albedo buffer in my G-buffer? Or should the albedo buffer also be in sRGB, and then the conversion made by all lighting shaders that sample from the albedo buffer?

It turns out that if you’re using an 8-bit-per-channel albedo buffer like I am, 8 bits isn’t really enough to store accurate blacks and darks in linear color space.

Here’s the albedo buffer for a scene, where values are basically just direct samples from the source texture (and thus sRGB):

[Image: AlbedoInsRGB]

 

If we converted it to linear, it would look like this:

 

[Image: AlbedoInLinear]

 

These are the linear RGB values we want to use for the lighting equations, but 8 bits leaves us with very little information in the dark areas. Let’s zoom in on a portion of the cliff and brighten it up to compare with the sRGB albedo buffer:

 

[Image: Banding]

 

There’s a significant amount of banding visible. This wouldn’t be an issue with a 16-bit-per-channel albedo buffer, but I don’t want to allow myself that luxury. It’s simplest to just store sRGB in the albedo buffer.

So the final pipeline looks something like this:

[Image: Pipeline]

 

We don’t have the banding problem in the light accumulation buffer, as it is 16 bits per channel (D3DFMT_A16B16G16R16F) or 10 bits per color channel (D3DFMT_A2B10G10R10). I have a toggle to switch between the two (admittedly, for darker scenes 10 bits is often not enough).
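In shader terms, the pipeline amounts to something like the following sketch (the sampler and variable names are illustrative; SRGBToLinear is the helper from earlier):

    sampler diffuseSampler : register(s0); // source texture (sRGB)
    sampler albedoSampler : register(s1);  // G-buffer albedo (also sRGB)
    float3 lightColor;                     // light color * intensity, in linear space

    // G-buffer pass: write the diffuse sample through unchanged (stays sRGB).
    float4 GBufferPS(float2 uv : TEXCOORD0) : COLOR0
    {
        return tex2D(diffuseSampler, uv);
    }

    // Lighting pass: convert the albedo to linear, do the lighting math there,
    // and write linear HDR values to the light accumulation buffer.
    float4 LightPS(float2 uv : TEXCOORD0) : COLOR0
    {
        float3 albedo = SRGBToLinear(tex2D(albedoSampler, uv).rgb);
        return float4(albedo * lightColor, 1);
    }

Tone-mapping then reads the linear HDR values from the light accumulation buffer, and LinearToSRGB runs at the very end.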

Note that with linear filtering, sampling a texture blends multiple texel values together. If that blending happens in sRGB space, the result will be incorrect; the blend should really happen in linear color space. The effect is fairly minor in most cases though, so I’ve chosen to ignore it. You might not be affected by this at all if you have hardware support for sRGB sampling (as mentioned above), since the hardware converts each texel to linear before filtering.

Source texture albedo

All this talk of albedo got me more serious about using physically accurate albedo values. The vegetation was often much brighter than the surrounding terrain, for instance. One particularly egregious example was the brightness of the diffuse texture on my character models. It was especially noticeable under cloudy skies (little directional light):

 

[Image: BadAlbedos]

 

They were completely out of whack with the surroundings. With the diffuse texture brought more into line with the surrounding terrain, things now look much better:

 

[Image: GoodAlbedos]

 

To help with this, I added functionality to my content pipeline to adjust the albedo of a model’s textures when processing the model. I can specify a target albedo value (say, 0.18 for forest floor), and the texture will be scaled to match it. This was a bit easier than going in and modifying all my textures in Photoshop.
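The adjustment itself happens in the content pipeline at build time, but the per-texel math is simple enough to sketch in shader-style code (AdjustAlbedo and its parameters are hypothetical names):

    // Scale a texel so that the texture's average linear luminance lands on
    // targetAlbedo (say, 0.18). avgLinearLuminance is the texture's average
    // luminance, measured in linear space ahead of time.
    float3 AdjustAlbedo(float3 srgbTexel, float avgLinearLuminance, float targetAlbedo)
    {
        float3 linearTexel = pow(srgbTexel, 2.2);
        float3 adjusted = linearTexel * (targetAlbedo / avgLinearLuminance);
        return pow(saturate(adjusted), 1.0 / 2.2); // re-encode as sRGB
    }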

Tone-mapping, auto-exposure

The purpose of tone-mapping is to bring your HDR light accumulation buffer values into the range the display can actually show (and to look nice doing it)!

One thing I always had problems with was auto-exposure. I need to handle scenes ranging from bright snowy sunshine to cloudy nights in the jungle. I have code that measures the average, median, min, and max luminance of the scene (by repeatedly downsampling the light accumulation buffer). I was feeding all of these into a Reinhard tone-mapping function, but it was fairly unstable (things would often be washed out or too dark).
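As a sketch of how the downsampling might work (names here are illustrative): a first pass converts the light accumulation buffer to luminance, then a pass like the one below repeats until the result is 1×1. This covers the average, min, and max (the median takes a bit more machinery).

    sampler lumSampler : register(s0);
    float2 inputTexelSize; // size of one texel in this pass's input

    // Each output texel reduces a 2x2 block of input texels:
    // R = average luminance, G = min, B = max.
    float4 ReduceLuminancePS(float2 uv : TEXCOORD0) : COLOR0
    {
        float3 s0 = tex2D(lumSampler, uv + inputTexelSize * float2(-0.5, -0.5)).rgb;
        float3 s1 = tex2D(lumSampler, uv + inputTexelSize * float2( 0.5, -0.5)).rgb;
        float3 s2 = tex2D(lumSampler, uv + inputTexelSize * float2(-0.5,  0.5)).rgb;
        float3 s3 = tex2D(lumSampler, uv + inputTexelSize * float2( 0.5,  0.5)).rgb;

        float avg = (s0.r + s1.r + s2.r + s3.r) * 0.25;
        float mn  = min(min(s0.g, s1.g), min(s2.g, s3.g));
        float mx  = max(max(s0.b, s1.b), max(s2.b, s3.b));
        return float4(avg, mn, mx, 1);
    }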

I ended up getting rid of the “middle grey” and “average luminance” inputs used in the form of the Reinhard equation I had. I’m now using a simpler form of Reinhard whose only real input is the max scene luminance. This is proving to be a lot more stable.
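A sketch, assuming the “extended” Reinhard form with the white point set to the max scene luminance (so the brightest pixel in the scene maps to 1.0):

    // Extended Reinhard: Ld = L * (1 + L / Lwhite^2) / (1 + L),
    // with Lwhite = max scene luminance.
    float3 ReinhardToneMap(float3 hdrColor, float maxLuminance)
    {
        float L = dot(hdrColor, float3(0.2126, 0.7152, 0.0722)); // pixel luminance
        float Ld = L * (1.0 + L / (maxLuminance * maxLuminance)) / (1.0 + L);
        return hdrColor * (Ld / max(L, 0.0001)); // rescale color to the new luminance
    }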

I also implemented the filmic tone-mapping described here. I think it does look a little better than Reinhard (more “punchy”). There is a lot more information here too.
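As a sketch, assuming it’s the filmic curve popularized by John Hable for Uncharted 2 (the constants below are Hable’s published defaults, not values tuned for my scenes):

    // Hable's filmic curve: A = shoulder strength, B = linear strength,
    // C = linear angle, D = toe strength, E/F = toe shape.
    float3 FilmicCurve(float3 x)
    {
        const float A = 0.15, B = 0.50, C = 0.10, D = 0.20, E = 0.02, F = 0.30;
        return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F;
    }

    float3 FilmicToneMap(float3 hdrColor, float exposure, float whitePoint)
    {
        float3 mapped = FilmicCurve(hdrColor * exposure);
        float3 whiteScale = 1.0 / FilmicCurve(float3(whitePoint, whitePoint, whitePoint));
        return mapped * whiteScale; // still linear; LinearToSRGB comes afterwards
    }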

 

[Image: Tonemaps]

 

I have a toggle that lets me switch between these two tone-mapping algorithms and also a linear tone map (pixel luminance divided by max scene luminance).

It’s really nice to have proper HDR support, and attention paid to giving objects proper (real-world) albedos. With a proper lighting pipeline set up, I’m now free to work on improving my ambient lighting/global illumination. More on that in my next post.

 

 

 

 


7 comments on “Cleaning up the HDR pipeline”

  1. Very cool. Always very interesting to read your posts – everything is clear. Keep going! Btw, I prefer to convert textures to linear space to save a couple of heavy shader instructions.

  2. In our deferred game we use Gamma2Linear and Linear2Gamma helpers in the shaders to do the conversions. It works, but it is very easy to miss one and get wrong results that are hard to detect at times. It’s also pretty expensive instruction-wise.

    For this reason we’re working on adding SRGB support to MonoGame. The right way for DX11 and above is to use SRGB surface formats which automatically do the conversions using dedicated hardware. See https://github.com/mono/MonoGame/issues/1995.

