Real-Time Physically Based Rendering and BRDFs

[Image: a scene rendered in real time with the PBR shader developed in this article]

The topic of BRDFs can often be confusing, so I’m going to attempt to break down what they are and how they relate to physically based rendering as simply as possible. We will write GLSL for real-time PBR rendering as we go along; the image above was rendered using the finished shader.

For this guide I’ve gathered together information and images from disparate sources including Wikipedia, jMonkeyEngine, Marmoset, Unreal Engine and NVIDIA. I’ve referenced each source as carefully as I can, but if any of the content owners are unhappy with their usage then I will be happy to replace them right away.

Bidirectional Reflectance Distribution Functions

When a process is bidirectional it takes place in two, usually opposite, directions. A BRDF is a function that gives the reflectance of a point on a surface given two directions: the direction towards the viewer (Wr) and towards the light (Wi).

[Image: diagram of the BRDF geometry, with the incoming light direction Wi and outgoing viewer direction Wr over a surface point]

  • source: 1

A BRDF returns the ratio of light reflected from the surface towards the viewer, known as the radiance, relative to the light arriving at the surface. The BRDF inputs Wr and Wi are usually each defined as a pair of angles, azimuth (Φ) and zenith (θ), so the BRDF is considered a four-dimensional function.

Measuring and recording the reflectance of a real surface is a difficult and time-consuming process that uses a piece of equipment known as a gonioreflectometer. The MERL database records discretely sampled data in a BRDF lookup table for 100 different surfaces. Each MERL BRDF contains around 33 MB of data.

[Image: materials sampled from the MERL BRDF database]

  • source: 2
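As a rough sanity check of that figure: each MERL BRDF is commonly described as a table of three colour channels over a 90 × 90 × 180 grid of angle bins stored in double precision. Those resolution and precision figures are assumptions taken from descriptions of the dataset, not from this article, but the arithmetic lands on the quoted size:

```python
# Approximate the storage size of one MERL BRDF lookup table.
# Grid resolution (90 x 90 x 180) and double-precision storage are
# assumptions based on common descriptions of the dataset.
theta_h, theta_d, phi_d = 90, 90, 180  # sampled angle bins
channels = 3                            # RGB
bytes_per_sample = 8                    # double precision

total_bytes = theta_h * theta_d * phi_d * channels * bytes_per_sample
print(total_bytes / (1024 * 1024))      # roughly 33 MB
```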

Instead of having to create large amounts of data for each surface we want to render, analytical models have been developed which, given a few extra parameters, can approximate reflectance for a range of surfaces. Some well-known ones are:

  • Lambert
  • Phong
  • Blinn-Phong
  • Cook-Torrance
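To make the idea concrete, here is a minimal Python sketch of two of these models evaluated from a pair of directions. These are simplified textbook forms, not production code; Wi and Wr are unit vectors pointing away from the surface:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert_brdf(albedo, wi, wr):
    # Lambert diffuses uniformly: the ratio is a constant,
    # independent of both directions.
    return tuple(a / math.pi for a in albedo)

def blinn_phong_brdf(wi, wr, n, shininess):
    # Blinn-Phong peaks when the halfway vector between the light
    # and view directions lines up with the surface normal.
    h = normalize(tuple(a + b for a, b in zip(wi, wr)))
    n_dot_h = max(0.0, sum(a * b for a, b in zip(n, h)))
    return n_dot_h ** shininess

n = (0.0, 0.0, 1.0)
wi = normalize((1.0, 0.0, 1.0))         # light 45 degrees off normal
wr = normalize((-1.0, 0.0, 1.0))        # viewer in the mirror direction
print(blinn_phong_brdf(wi, wr, n, 32))  # peaks (~1.0) at the mirror direction
print(lambert_brdf((1.0, 1.0, 1.0), wi, wr))
```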

For our purposes we will treat Wr as a three-dimensional vector. Rather than lighting from a single point, we want to sample an integral of many incoming light vectors over a hemisphere or cone, essentially blurring the incoming light; the result is called the irradiance value.

[Image: hemisphere of incoming light directions over a surface point]

  • source: 1
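As a sketch of what that integral means, the Python below estimates the irradiance arriving at a point by Monte Carlo sampling the hemisphere. For a constant incoming radiance of 1 the cosine-weighted integral evaluates to π, which the estimate should land close to:

```python
import math
import random

def hemisphere_irradiance(radiance, samples=100000, seed=1):
    # Uniformly sample directions over the hemisphere and accumulate
    # radiance weighted by the cosine of the zenith angle; dividing
    # by the sample density (pdf = 1 / 2*pi) gives the irradiance
    # estimate. For uniform solid-angle sampling the cosine of the
    # zenith angle is itself uniformly distributed in [0, 1].
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        cos_theta = rng.random()
        total += radiance * cos_theta
    return (2.0 * math.pi / samples) * total

print(hemisphere_irradiance(1.0))  # close to pi
```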

Physically Based Rendering

PBR can be defined in many different ways, but in this article I will stick to the terms used in Epic Games’ Unreal Engine. PBR combines analytical BRDFs to more closely model real sampled BRDF reflection data, while retaining enough approximation to work in real time. A PBR shader can represent a range of materials and physical effects; in this case there are two main parameters that control these effects, the micro-surface roughness and the metalness. These are usually ratios stored in textures.

float roughness = texture(roughness_sampler, texcoord).r;
float metalness = texture(metalness_sampler, texcoord).r;

This image shows Fresnel reflectivity on a car bonnet and micro-surface diffusion on plastics.

[Image: Fresnel reflectivity on a car bonnet and micro-surface diffusion on plastics]

  • source: 3

Radiance

Two analytical BRDFs are required for the two different types of interaction of light with a surface: reflection and diffusion. Instead of using them to calculate final radiance values directly, ours will each give a radiance ratio, which then needs to be multiplied by irradiance to give the final values.

[Image: diagram of light diffusing into and reflecting off a surface]

  • source: 4

Diffusion Radiance Ratio:

Light which penetrates the surface and is diffused. The well-known Lambert BRDF models this and diffuses light uniformly. After computation the reflectance is multiplied by the light colour and also by the surface colour, otherwise known as the albedo, because the light has penetrated the surface. Because Lambertian radiance does not change with view angle, the ratio is simply a constant 1.

Reflection Radiance Ratio:

Light which is reflected in a mirror-like way. A BRDF suitable for this task is Cook-Torrance, which can mimic physical effects such as Fresnel reflectivity and micro-surface diffusion (blurry reflections), controlled by the roughness parameter.

[Image: diagram of mirror-like reflection from a rough micro-surface]

  • source: 4

The Cook-Torrance radiance ratio is difficult to compute, but the good people at Epic Games have open-sourced a cheap approximation. How it was arrived at is a topic in itself, but this version dispenses even with lookup tables, and so is particularly suitable for mobile.

half3 EnvBRDFApprox( half3 SpecularColor, half Roughness, half NoV )
{
    const half4 c0 = { -1, -0.0275, -0.572, 0.022 };
    const half4 c1 = { 1, 0.0425, 1.04, -0.04 };
    half4 r = Roughness * c0 + c1;
    half a004 = min( r.x * r.x, exp2( -9.28 * NoV ) ) * r.x + r.y;
    half2 AB = half2( -1.04, 1.04 ) * a004 + r.zw;
    return SpecularColor * AB.x + AB.y;
}

// The less mirror-like a reflection is, the more the albedo
// colour is blended in as the surface is penetrated further.
// Epic chose a lower boundary of 4% specular reflectance to
// represent non-metallic surfaces.
half3 specular_color = mix(vec3(0.04), albedo, metalness);

// view_normal points from the eye to the surface, so it is
// negated for the dot product with the surface normal.
float NoV = max(0.0, dot(surface_normal, -view_normal));

half3 reflective_radiance_ratio = EnvBRDFApprox(
    specular_color, roughness, NoV);

  • source: 5
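For anyone who wants to experiment with the approximation outside a shader, here is a direct Python transliteration of EnvBRDFApprox, with the half4 vector operations unrolled and a scalar specular colour for simplicity:

```python
def env_brdf_approx(specular_color, roughness, nov):
    # Straight port of Epic's EnvBRDFApprox: a polynomial fit in
    # roughness, attenuated at grazing angles by an exp2 term.
    c0 = (-1.0, -0.0275, -0.572, 0.022)
    c1 = (1.0, 0.0425, 1.04, -0.04)
    r = tuple(roughness * a + b for a, b in zip(c0, c1))
    a004 = min(r[0] * r[0], 2.0 ** (-9.28 * nov)) * r[0] + r[1]
    scale = -1.04 * a004 + r[2]
    bias = 1.04 * a004 + r[3]
    return specular_color * scale + bias

# A smooth dielectric viewed head-on keeps most of its specular
# colour, with only a small additive bias.
print(env_brdf_approx(0.04, 0.0, 1.0))  # a little above the 0.04 input
```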

Irradiance

Calculating irradiance directly is expensive because it means evaluating an integral over many rays. A more efficient way is to approximate the integral by pre-convolving an HDR cubemap so that values can be looked up by direction.

Diffuse (Lambertian) Irradiance Map.

[Image: cubemap pre-convolved into a diffuse irradiance map]

  • source: 6

Micro-surface Diffusion (Cook-Torrance) Irradiance Map.

The complete Cook-Torrance approximation requires splitting the calculation between the radiance function (EnvBRDFApprox) and an environment cube-map. The convolution is done in such a way as to encode the Cook-Torrance BRDF into it. This is called the “Split Sum Approximation”.

[Image: cubemap convolved at increasing roughness levels]

  • source: 6

Correctly convolving a cube-map is computationally expensive, so in this case we will trade off accuracy for performance by using a cube-map with standard mip-maps created with glGenerateMipmap.

Final Radiance Calculation

The two radiance ratio values output by the BRDFs now need to be multiplied by incoming irradiance.

Diffusion Radiance

Simply look up the irradiance by sampling the cubemap with the surface normal at its blurriest mipmap level; the albedo is blended in later, during the energy conservation step. Multiplying by the constant ratio of 1 drops out of the equation.

vec3 diffuse_radiance = textureLod(
    environment_sampler,
    surface_normal,
    MAX_MIPMAP_LEVEL).xyz;

Reflection Radiance

The heuristic to look up a mipmap level from roughness comes from Unreal Engine.

half RoughnessToMipmapLevel(half roughness)
{
    return max(0.0, MAX_MIPMAP_LEVEL - 1.0 - (1.0 - 1.2 * log2(roughness)));
}

vec3 reflection_radiance = reflective_radiance_ratio * textureLod(
    environment_sampler,
    reflection_normal,
    RoughnessToMipmapLevel(roughness)).xyz;
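A Python transliteration makes the heuristic easy to inspect; MAX_MIPMAP_LEVEL is assumed to be 8 here purely for illustration, since the real value depends on the cubemap resolution:

```python
import math

MAX_MIPMAP_LEVEL = 8.0  # illustrative; depends on cubemap size

def roughness_to_mipmap_level(roughness):
    # Low roughness clamps to the sharpest mip (level 0); high
    # roughness pushes the lookup into the blurrier levels.
    # Roughness must be greater than zero for the log to exist.
    return max(0.0, MAX_MIPMAP_LEVEL - 1.0 -
               (1.0 - 1.2 * math.log2(roughness)))

print(roughness_to_mipmap_level(1.0))    # blurriest end: 6.0
print(roughness_to_mipmap_level(0.01))   # sharp reflections: 0.0
```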

Energy Conservation

Diffuse and reflective radiance cannot simply be added together, because the surface would then be emitting more light than arrived on it. Instead, the diffuse radiance is scaled by the albedo blended towards zero using the metalness ratio.

diffuse_radiance = diffuse_radiance * mix(albedo, vec3(0.0), metalness);

[Image: energy conservation between the diffuse and reflective terms]

  • source: 4
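The blend behaviour at the extremes can be sanity-checked with a few lines of Python (mix here is a plain reimplementation of GLSL’s linear interpolation):

```python
def mix(a, b, t):
    # Component-wise linear interpolation, like GLSL's mix().
    return tuple(x * (1.0 - t) + y * t for x, y in zip(a, b))

albedo = (0.8, 0.2, 0.1)

# A pure dielectric (metalness 0) keeps its full albedo in the
# diffuse term; a pure metal (metalness 1) has no diffuse term at
# all, so its colour comes entirely from tinted reflections.
print(mix(albedo, (0.0, 0.0, 0.0), 0.0))  # (0.8, 0.2, 0.1)
print(mix(albedo, (0.0, 0.0, 0.0), 1.0))  # (0.0, 0.0, 0.0)
```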

A Word on Gamma

Gamma colour space is designed for display on monitors, which have non-linear brightness curves, and most file formats store data in gamma colour space. When sampling the albedo colour it must be converted from gamma to linear space before it can be used in any computation. After the final pixel value has been evaluated, it must then be translated back to gamma space to be displayed on a monitor. The article The Importance of Being Linear from NVIDIA gives an in-depth guide on this process.

[Image: the same scene lit with and without gamma correction]

  • source: 7

In the first image lighting is computed in the correct colour space, in the second no correction is performed.
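As a minimal sketch of the two conversions, using the common power-law approximation with an exponent of 2.2 (real sRGB uses a slightly different piecewise curve):

```python
def gamma_to_linear(c):
    # Decode a gamma-space channel value in [0, 1] to linear space.
    return c ** 2.2

def linear_to_gamma(c):
    # Encode a linear value back to gamma space for display.
    return c ** (1.0 / 2.2)

# Mid-grey in gamma space is much darker in linear space, which is
# why skipping the conversion washes out lighting calculations.
print(gamma_to_linear(0.5))                   # about 0.218
print(linear_to_gamma(gamma_to_linear(0.5)))  # round-trips to 0.5
```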

Partial GLSL

  • environment_sampler: Seamless HDR environment cubemap with mipmaps generated
  • roughness_sampler: Roughness ratio texture map
  • metalness_sampler: Metalness ratio texture map
  • albedo_sampler: Material colour texture map

half3 EnvBRDFApprox(half3 SpecularColor, half Roughness, half NoV)
{
    const half4 c0 = { -1, -0.0275, -0.572, 0.022 };
    const half4 c1 = { 1, 0.0425, 1.04, -0.04 };
    half4 r = Roughness * c0 + c1;
    half a004 = min(r.x * r.x, exp2(-9.28 * NoV)) * r.x + r.y;
    half2 AB = half2(-1.04, 1.04) * a004 + r.zw;
    return SpecularColor * AB.x + AB.y;
}

half RoughnessToMipmapLevel(half roughness)
{
    return max(0.0, MAX_MIPMAP_LEVEL - 1.0 - (1.0 - 1.2 * log2(roughness)));
}

out vec4 frag_colour;

void main()
{
    vec3 surface_normal; // from normal map
    vec3 view_normal;    // from eye position to surface
    vec3 reflection_normal = reflect(view_normal, surface_normal);

    float roughness = texture(roughness_sampler, texcoord).r;
    float metalness = texture(metalness_sampler, texcoord).r;
    vec3 albedo = GammaColorToLinearSpace(
        texture(albedo_sampler, texcoord).rgb);

    half3 specular_color = mix(vec3(0.04), albedo, metalness);
    float NoV = max(0.0, dot(surface_normal, -view_normal));

    half3 reflective_radiance_ratio = EnvBRDFApprox(
        specular_color, roughness, NoV);

    vec3 diffuse_radiance = textureLod(
        environment_sampler,
        surface_normal,
        MAX_MIPMAP_LEVEL).xyz;
    diffuse_radiance = diffuse_radiance * mix(albedo, vec3(0.0), metalness);

    vec3 reflection_radiance = reflective_radiance_ratio * textureLod(
        environment_sampler,
        reflection_normal,
        RoughnessToMipmapLevel(roughness)).xyz;

    vec3 final_radiance = diffuse_radiance + reflection_radiance;

    // Perform tone-mapping of your choice to convert
    // HDR final_radiance to LDR here

    frag_colour = vec4(LinearColorToGammaSpace(final_radiance), 1.0);
}

Final Result

[Images: final results rendered with the complete shader]

References

Written on January 1, 2018