Mar 20 2022

Warping - A Refraction Shader

Update March, 2022: Ellie and I will be participating in the Bright Moments Berlin event in April! Minting will be done in-person and via surrogate. If you’re able to, come out and support us!

More info at: brightmoments.io/cryptoberlin


Background

This post is part of a series on the Reverse Zoology project. Check out the previous post if you haven’t yet!

Like many of my projects, this one was based on modeling some physical phenomenon with computers. I was staring through a thick glass bowl at my dinner table, appreciating the way it bends the light coming in. I was roughly aware that different mediums bend light differently, and wanted to play around with Shadertoy to strengthen my parallel programming skills. It worked out surprisingly well that the Reverse Zoology project required a similar visual effect.

The idea, for Reverse Zoology, was to upload a video frame to the GPU. Then, using a fragment shader, set each pixel in an output image by tracing a single ray of light from the screen. The ray would hit some glass-like material, refract, and sample the initial texture behind the material. Essentially, it’s like looking at a picture through a wavy window.

Creating Glass

The Height Map

For the refractive material we imagine a thick piece of glass sitting on our image. The bottom of the glass, touching the image, is flat, but the top of the glass is bumpy. Because of this inconsistent height, light rays will travel in different directions, based on the angle of the surface of the glass.

An easy way to represent this glass is with a height map. A height map is an image where each pixel stores surface elevation data. One quick way to generate such a map is with gradient noise.

Given some noise function, with the following signature:

float noise(vec3 p) 

A height map may be created where p.x and p.y are proportional to the pixel coordinates and p.z allows for animation.

Here’s a simple Shadertoy example:

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
  // uv is normalized coordinate space
  // values are [0, 1)
  vec2 uv = fragCoord/iResolution.xy;

  // iTime is an always incrementing Shadertoy builtin
  // but it is a little too fast, so we scale it down
  float time = iTime * 0.25;

  // get noise at coordinate based on uv and current time
  vec3 pos = vec3(uv.xy, time);
  float n = noise(pos);

  // output to screen, 
  // RGB channels are set to n to make it greyscale
  fragColor = vec4(n, n, n, 1.0);
}

Gradient Noise

I’ve explicitly avoided implementing a noise function, and that’s for a few reasons. First, there is no single type of noise that is “correct”. Processing implemented its own Perlin noise, based on the original concept developed by Ken Perlin. A popular alternative is OpenSimplex, which has fewer visual artifacts, especially at higher dimensions. Another fun type of noise to try out is Worley. Any of these is perfectly fine and will yield distinct yet interesting results.
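That said, the snippets in this post call a noise() function that has to come from somewhere. If you just want something to paste in, here is a minimal hash-based 3D value noise (value noise, not gradient noise, so treat it as a stand-in; any of the functions above would work just as well):

// map a 3D lattice point to a pseudo-random value in [0, 1)
float hash(vec3 p)
{
    p = fract(p * 0.3183099 + 0.1);
    p *= 17.0;
    return fract(p.x * p.y * p.z * (p.x + p.y + p.z));
}

// minimal 3D value noise: smoothly interpolate hashed lattice corners
float noise(vec3 x)
{
    vec3 i = floor(x);
    vec3 f = fract(x);
    f = f * f * (3.0 - 2.0 * f); // smoothstep fade curve

    return mix(mix(mix(hash(i + vec3(0.0, 0.0, 0.0)), hash(i + vec3(1.0, 0.0, 0.0)), f.x),
                   mix(hash(i + vec3(0.0, 1.0, 0.0)), hash(i + vec3(1.0, 1.0, 0.0)), f.x), f.y),
               mix(mix(hash(i + vec3(0.0, 0.0, 1.0)), hash(i + vec3(1.0, 0.0, 1.0)), f.x),
                   mix(hash(i + vec3(0.0, 1.0, 1.0)), hash(i + vec3(1.0, 1.0, 1.0)), f.x), f.y), f.z);
}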

Beyond tweaking the underlying noise algorithm, there exists domain warping. Domain warping, abstractly, is mapping a function’s input space to a different input space. Inigo Quilez has an outstanding article on this exact concept and how it can be used to deform noise fields. The idea is simple: use the noise function to warp its own input space.

Here is the previous example, updated:

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
  // uv is normalized coordinate space
  // values are [0, 1)
  vec2 uv = fragCoord/iResolution.xy;
  
  // iTime is an always incrementing Shadertoy builtin
  // but it is a little too fast, so we scale it down
  float time = iTime * 0.05;
  
  // base lookup position: scaled uv plus current time
  vec3 pos = vec3(uv.xy * 3.0, time);
  
  // two noise samples at arbitrary offsets, used to warp the domain
  float v1 = noise(pos + vec3(0.0, 0.0, 0.0));
  float v2 = noise(pos + vec3(4.59, 3.33, 7.11));
  
  // sample again, with the input space shifted by the first two samples
  float n = noise(vec3(uv.xy, time) + vec3(v1, v2, time) * 4.0);

  // output to screen, 
  // RGB channels are set to n to make it greyscale
  fragColor = vec4(n, n, n, 1.0);
}

And this process can be repeated again, creating even more complex patterns.
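As a rough sketch (the offsets here are arbitrary), a second round could replace the final lookup in the example above:

// warp the domain a second time, feeding the first warp back in
float w1 = noise(pos + vec3(v1, v2, 0.0) * 4.0);
float w2 = noise(pos + vec3(v1, v2, 0.0) * 4.0 + vec3(1.7, 9.2, 0.0));

// final sample, shifted by the twice-warped values
float n = noise(vec3(uv.xy, time) + vec3(w1, w2, 0.0) * 4.0);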

The Differentiable Height Map

In order to properly calculate refraction, we need more than just a height map. We need the gradient of the surface, a measure of its slope, which we can turn into the normal: the unit vector perpendicular to the surface at each point. Why do we need the normal? For that, we look at the law of light refraction, also known as Snell’s Law.

Snell’s law describes the relationship between the angles of incidence and refraction when a light ray passes between two mediums. In effect, Snell’s law lets us calculate the angle a light ray will travel at after entering a medium, if we know the angle at which it hit the boundary. The result also depends on the materials themselves, via what is called an index of refraction; water refracts light less sharply than glass, for example, when the ray is leaving air. Wikipedia has a nifty list of refractive indices for common materials.
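In symbols, with θ1 the angle of incidence, θ2 the angle of refraction, and n1 and n2 the refractive indices of the two mediums:

n1 * sin(θ1) = n2 * sin(θ2)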

Now we need to calculate the normal of the height map at each point. One way to do this is by sampling nearby points in the noise field and using them to estimate the change in height. But that is slow: it requires querying the noise several extra times for each ray of light. Luckily, Inigo Quilez comes to the rescue, with an article that provides a gradient noise implementation, in 3D, that returns its analytic derivatives alongside the value, giving us the normal essentially for free.
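For reference, the brute-force version might look something like this sketch (the step size and the normal construction here are my own additions, not from the final shader, which uses the derivatives directly):

// finite-difference estimate of the height-field slope;
// costs two extra noise() calls on top of the base sample
vec2 gradientFD(vec3 p)
{
    const float e = 0.001; // arbitrary small step
    float h  = noise(p);
    float hx = noise(p + vec3(e, 0.0, 0.0));
    float hy = noise(p + vec3(0.0, e, 0.0));
    return vec2(hx - h, hy - h) / e; // (dh/dx, dh/dy)
}

// one common way to turn that slope into a unit surface normal
vec3 heightMapNormal(vec2 grad)
{
    return normalize(vec3(-grad.x, -grad.y, 1.0));
}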

Calculating Light Trajectory

Now that all the pieces are in place, to generate a warped image, we need to:

  1. Calculate the height of the surface, using gradient noise
  2. Determine the distance from the screen to the surface
  3. Refract the ray of light based on the surface normal (from the new noise function) and refraction indices, yielding a new direction for the light ray
  4. Calculate the distance the ray of light travels before it hits the image
  5. Sample the pixel hit

Here’s a cutaway of this process:

The first step is easy,

// normalize coordinates, i.e. map values to [0, 1)
vec2 uv = fragCoord/iResolution.xy;

// sample noise at the X and Y coordinates; Z animates over time
// (timeScale is an arbitrary constant that slows the animation)
vec4 noise = noised(vec3(uv.xy, iTime*timeScale));

// arbitrary height. higher values will warp images more
float height = 0.75;

// 3D position where this ray of light hits the surface
vec3 surfacePosition = vec3(uv, noise.x + height);

Now we refract the ray, using the handy OpenGL refract function. Reference and implementation can be found in the OpenGL docs.
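For reference, the docs define its behavior roughly as follows, where I is the incident ray, N the surface normal (both assumed to be unit length), and eta the ratio of refraction indices:

float k = 1.0 - eta * eta * (1.0 - dot(N, I) * dot(N, I));
vec3 R = (k < 0.0)
    ? vec3(0.0) // total internal reflection
    : eta * I - (eta * dot(N, I) + sqrt(k)) * N;

Applying it to our surface: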

// from the derivative noise; refract expects a unit-length normal
vec3 surfaceNormal = normalize(noise.yzw);

// ratio of indices of refraction,
// e.g. n1 = 1.0 (air) into n2 = 1.5 (glass)
float eta = n1 / n2;

// incoming ray is only moving in the Z axis
vec3 rayIncoming = vec3(0.0, 0.0, -1.0);
vec3 rayRefract = refract(rayIncoming, surfaceNormal, eta);

Next, we need to calculate how far the ray travels and where it hits the origin plane (where the image is located), then sample the pixel value there. I found this blog post, in which the author derives a formula for a ray-plane intersection.
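For a plane through point p0 with normal n, and a ray starting at l0 traveling in direction l, the distance along the ray works out to

t = dot(p0 - l0, n) / dot(l, n)

(if dot(l, n) is zero, the ray is parallel to the plane and never hits it).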

Applying it,

const vec3 imagePlanePos = vec3(0.0, 0.0, 0.0);
const vec3 imagePlaneNormal = vec3(0.0, 0.0, -1.0);
    
float travelAmt = dot(imagePlanePos - surfacePosition, imagePlaneNormal)
                / dot(rayRefract, imagePlaneNormal);
vec3 exitPoint = surfacePosition + rayRefract * travelAmt;

The last piece is to sample the original image. With Shadertoy, I just used one of the built-in textures. It is important to perform a bounds check here, as the warping effect may bend light rays past the edges of our image. One option is to wrap around, taking the coordinates modulo the width and height; see the sketch below. I chose, instead, to use a default color.
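For what it’s worth, the wrap-around option is a one-liner in normalized coordinates, since mod 1.0 is just fract:

// wrap out-of-bounds coordinates back into [0, 1)
vec4 color = texture(iChannel0, fract(exitPoint.xy));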

Here’s the final bit of code:

vec4 color = (exitPoint.x > 1.0 || exitPoint.y > 1.0 || exitPoint.x < 0.0 || exitPoint.y < 0.0) ?
	vec4(1.0, 0.0, 0.0, 1.0) :
	texture(iChannel0, exitPoint.xy);

Wrapping It Up

Here is a (mostly) finished example.

shadertoy.com/view/NtlyDr

The height, noise intensity, scale, and indices of refraction may all be parameterized. Playing around with extremes is quite rewarding!

But that’s not all. One significant thing that was skipped is the fact that light also refracts based on wavelength. That is fundamentally why a prism works: white light is a composite of many wavelengths, which disperse as they refract. I’ve found a few interesting write-ups about modeling this phenomenon, and I leave implementing it up to the reader; a rough starting point is sketched below. The Physically Based Rendering book also has a whole section on spectral representation in rendering.
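As a purely hypothetical starting point, one cheap trick is to refract once per color channel with slightly perturbed ratios (the 2% spread below is arbitrary):

// crude dispersion: give each color channel its own refraction ratio
vec3 rayR = refract(rayIncoming, surfaceNormal, eta * 0.98);
vec3 rayG = refract(rayIncoming, surfaceNormal, eta);
vec3 rayB = refract(rayIncoming, surfaceNormal, eta * 1.02);
// trace each ray to the image plane separately and sample its channel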

Something I’ve noticed, but not yet solved, is a tendency for the warping to pull the image down and to the right. I have no clue why this is happening. So if you, kind reader, understand why, please contact me. Or write a scathing follow-up on your own blog, explaining all the places my math was incorrect.

As always, feedback is greatly appreciated!