Realtime Distance Field Textures in Unreal Engine 4

For a long time I wanted to be able to generate Distance Field textures on the fly to create various visual effects.
Recently I needed this effect to replicate the Photoshop “glow” filter that can be applied to layers. The goal was to apply a “bloom” or “glow” effect to the UI I’m designing for my current project.
This article was written based on Unreal Engine 4.20.

Distance Field?

Quick recap for people who don’t know what distance fields are: a distance field is a grid that stores, in each cell, the shortest distance to the nearest cell with specific properties. In the case of a texture, a cell is simply a pixel (or texel). It can be used to generate Voronoi patterns, for example:



(This example was made with Substance Designer)

How does that help with shapes and text in a UI? Well, imagine a black and white texture where white is the text and black is the empty area around it. The algorithm marks the text itself as the important element to compute a distance against, giving the following result:


Now let’s make it easier to visualize, for example by dividing the distance into multiple steps:


This gives the same result as the “precise” mode in the Photoshop “glow” filter. If you need something softer, you can simply blur it afterward.
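As a side note, this banding can be sketched outside the engine. Here is a minimal Python illustration; the step count and the normalization range are arbitrary assumptions, not values taken from the Photoshop filter:

```python
import math

def band_distance(distance, num_steps=4, max_distance=1.0):
    # Normalize the distance, then snap it to one of num_steps discrete
    # levels, mimicking the hard rings of the "precise" glow mode.
    d = min(max(distance / max_distance, 0.0), 1.0)
    return math.ceil(d * num_steps) / num_steps
```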

How does it work?

Once understood, the method is actually quite simple to operate, but it may not be easy to grasp at first. I needed a few explanations myself before I could finally visualize it in my head.
The process relies on the Jump Flood method, which repeats a fixed set of operations to reach the final result. This has the advantage of a fixed cost (it is not affected by the content of the image being processed) and is also well suited to running on GPUs.

The process works in two main steps: creating an initial mask, then iterating on this mask with the Jump Flood algorithm.

  • The Jump Flood algorithm works by looking at the neighbors of a given pixel and writing into the current pixel the coordinates of the pixel with the smallest distance found. This means the process evaluates 9 pixels (the center pixel plus its 8 neighbors).
  • This process is then repeated (a new pass), but the distance to the neighbor pixels is halved each time, until the sampled neighbors are the pixels directly adjacent to the current one. The smallest distance is computed by comparing the current pixel’s UV position with the value stored in each neighbor pixel. Each time a smaller distance is found, the current pixel is updated with the new UV coordinates.


Because each pass reads the neighbor pixels’ values and writes down a result, information accumulates until every pixel ends up with the smallest distance to a given point of the original mask. The final distance result saved into the texture can then be interpreted to generate Voronoi patterns or other kinds of effects.

To summarize:

  1. Create a mask (valid values vs invalid ones)
  2. Find the closest position among the neighbor pixels and write its coordinates into the texture
  3. Repeat the process until the distance to the neighbors is a single pixel
  4. Extract the distance from the computed UV coordinates
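The four steps above can be turned into a CPU reference implementation to make the algorithm concrete. This is plain Python for illustration only; the function names and the binary-image input format are my own, not part of the UE4 setup described below:

```python
import math

BIG = 4096.0  # sentinel coordinate marking "invalid" (empty) pixels

def create_mask(image, size):
    # Step 1: valid pixels store their own coordinates, empty pixels a sentinel.
    return [[(float(x), float(y)) if image[y][x] else (BIG, BIG)
             for x in range(size)] for y in range(size)]

def jump_flood(seeds, size):
    # Steps 2-3: halve the sampling offset each pass, down to 1 pixel.
    step = size // 2
    while step >= 1:
        result = [row[:] for row in seeds]
        for y in range(size):
            for x in range(size):
                best = seeds[y][x]
                best_d = math.hypot(best[0] - x, best[1] - y)
                # Look at the 3x3 neighborhood at the current offset.
                for dy in (-step, 0, step):
                    for dx in (-step, 0, step):
                        nx, ny = x + dx, y + dy
                        if 0 <= nx < size and 0 <= ny < size:
                            sx, sy = seeds[ny][nx]
                            d = math.hypot(sx - x, sy - y)
                            if d < best_d:
                                best, best_d = (sx, sy), d
                result[y][x] = best
        seeds = result
        step //= 2
    return seeds

def distance_field(seeds, size):
    # Step 4: turn the stored coordinates back into distances.
    return [[math.hypot(seeds[y][x][0] - x, seeds[y][x][1] - y)
             for x in range(size)] for y in range(size)]
```

Note that these per-pass loops run on the CPU here; the whole point of the material version below is that each pass becomes a single full-screen draw on the GPU.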

Replicating the process in Unreal Engine 4

It is possible to replicate this algorithm in Unreal Engine 4 with regular shaders and Render Target textures (or more precisely Canvas Render Target 2D). The idea is to create a shader for each part of the process and render its result into a texture to store it. Then it’s just a matter of calling the right process at the right time. 🙂

We will need:

  • A shader for creating the mask
  • A shader for computing the Jump Flood
  • A shader for displaying the result
  • A blueprint actor to connect everything
  • 2 Canvas Render Target 2D assets

1 – Creating the assets

In the Content Browser of the Unreal Engine 4 editor, right-click to open the menu and choose “Materials & Textures”, then “Canvas Render Target”:

 

Name the file something like “RTT_A” (RenderTargetTexture_A), then double-click it to edit its properties in the Texture Editor. In the properties we want to make sure to change the following settings:

  • Size X and Y: 512
  • Address X and Y: Clamp
  • Render Target Format: RTF RG16f
  • Texture Group: 2D Pixels (unfiltered)

The size can be changed later if needed, but for this tutorial we will stick to this resolution. The address mode is set to clamp to avoid repeating the texture and computing incorrect distance values at the borders. The format is set to RG16f because we only need the Red and Green channels for the Jump Flood algorithm; however, we need 16-bit floating point precision (to store big numbers), so be sure to choose the right value. Finally, it is important to set the Texture Group to unfiltered, otherwise bilinear sampling would create unwanted side effects. Nearest sampling (implied by the unfiltered mode) guarantees we only read one pixel at a time when performing the algorithm.

 

Next we will create the empty Materials, still in the Content Browser via the right-click menu:

Create three materials. I suggest naming them something like “MAT_Distance_CreateMask” for the first one, “MAT_Distance_JumpFlood” for the second one, and “MAT_Distance_Preview” for the third one (used for previewing the result).

2 – Connecting things together

Before diving into the details of the materials, I suggest creating the blueprint actor that will handle the updates of the Render Targets. I use the Begin Play to set up the materials and then the Tick to render everything else.

In the Begin Play I create three Dynamic Material Instances in order to be able to connect the Render Target textures and tweak the parameters:

  • The first output of the sequence node is used for creating the dynamic material instance for the “Create Mask” material.
  • The second output of the sequence node is used for creating the dynamic material instance for the “Jump Flood” material.
  • The last output of the sequence node is used for creating the dynamic material instance for the “Preview” material which I assign on a plane mesh component inside the actor. I also connect the Render Target B (RTT_B) to the shader input because it will be the last Texture to be rendered.

Let’s continue with the Tick, which is split into two parts:

  • The first part, named “Setup Mask”, simply renders the “Create Mask” material instance into the Render Target A (RTT_A). I also have a boolean named “Use B”, defined globally on the actor and initialized to false. This boolean is used by the next part of the Tick sequence.
  • The Jump Flood part relies on a loop which alternates between the two render targets (click on the image for a bigger view of the node network).

    Here is the breakdown of this part:

    1. As we will see later, a 512×512 texture needs 9 iterations, which is why I have a For loop node going from 0 to 8.
    2. The next part is the update of the step distance. The step variable is defined globally and reset to 1 when the For loop finishes. Because Unreal handles texture coordinates as floating-point values between 0 and 1, we don’t have to specify the exact pixel distance and can use a float instead. That’s why, before drawing an iteration of the Jump Flood, I divide the step by two (to get half of the distance). A value of 0.5 equals 256 pixels for a 512 texture.
    3. Next is the branch node that reads the boolean mentioned before, which allows us to go back and forth between the two Render Targets.
    4. The top part of the graph, coming from the “true” output of the branch node, starts by setting the texture input of the Jump Flood material to the render target we want to read. Then we update the step distance with the float we defined just before. Finally we call the rendering update of the Render Target with the Jump Flood material.
    5. The last part simply toggles the value of the boolean (true/false).
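Put together, the Tick logic above behaves like this Python sketch, where draw_material is a hypothetical stand-in for the “Draw Material to Render Target” call and rtt_a / rtt_b stand for the two render targets:

```python
import math

def run_jump_flood(draw_material, rtt_a, rtt_b, size=512):
    # One pass per power of two: log2(512) = 9 iterations.
    iterations = int(math.log2(size))
    step = 1.0
    use_b = False
    last_write = rtt_a
    for _ in range(iterations):
        # Halve the UV-space offset before each draw: 0.5, 0.25, ...
        step *= 0.5
        # Read from one render target, write into the other, then swap roles.
        read, write = (rtt_b, rtt_a) if use_b else (rtt_a, rtt_b)
        draw_material(source=read, target=write, step_distance=step)
        use_b = not use_b
        last_write = write
    return last_write
```

The last call uses a step of 0.5 ** 9 = 1/512, i.e. exactly one pixel for a 512 texture, and with 9 passes the final write lands in the second render target, which is why the preview material reads RTT_B.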

That’s it!
The computation is done in the Tick only in case you need to regenerate the mask (when the source texture is dynamic and its content changes). If you only need to generate the Distance Field once, there is no need to update it again, and this could become a one-time function (in the Begin Play, for example).

3 – Creating the preview material

I will not spend too much time on this part, as we will modify it later. The idea here is to translate the output of the algorithm into more readable information. So let’s edit the material “MAT_Distance_Preview”:

Basically, what this material does is compute a distance between the original UV coordinates (TexCoord node) and the UV coordinates saved inside the final Render Target. Then we do a few math operations to visualize that distance. The result will be similar to this:
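Conceptually, the preview boils down to the following Python sketch; the scale factor is an arbitrary stand-in for the few math operations done in the graph:

```python
import math

def preview_value(uv, stored_uv, scale=10.0):
    # Distance between this pixel's UVs and the nearest-seed UVs written
    # by the Jump Flood passes, scaled into a visible 0..1 gradient.
    d = math.hypot(stored_uv[0] - uv[0], stored_uv[1] - uv[1])
    return min(d * scale, 1.0)
```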

4 – Creating the “Mask” material

Now let’s focus on how the Jump Flood actually works. The material “MAT_Distance_CreateMask” is used to compute the information about our source image that we will then feed to the Jump Flood algorithm. The idea is to create something similar to a mask: we want to separate valid from invalid values. Invalid values are the areas we want to fill with distance values, while valid values are the original shapes we want to preserve.

Because of the nature of how Distance Fields work, the border of the source mask can only be 0 or 1. This means that if you have an anti-aliased shape as your source, you will end up with aliasing in the mask. However, this is usually not an issue, since we can use the distance field to refine the shape later. This is why Distance Fields are used for low-resolution fonts!

Since this material is only used to render into textures, there is no need for all the regular material features; therefore we can optimize it a bit by switching the “Material Domain” setting to “User Interface”. This generates a material with fewer instructions than the regular “Surface” domain.

Now let’s create the node network:

As you can see, the graph is divided into two big parts:

  • The color distance: this part may not be necessary depending on your source image. In my case I have a colored image without an alpha channel, so to create a mask I want to isolate the black background from the other colors.
    I use a distance comparison rather than a grayscale conversion (dot product) since this method gives me a more accurate result. A grayscale conversion may introduce artifacts because of pixel transitions (black to color), which can be hard to refine.
    Also, dark colors may be impossible to isolate from black because of their very low luminosity. A distance is much more precise because it compares the colors as positions inside a 3D cube.
  • The filling: this outputs a value based on the mask. Black (0) areas will get a very big number (here 4096), while white (1) areas will get their original UV coordinates. This step will later allow us to directly process the value stored in the texture when we do the distance computation.
    Theoretically you could write very big numbers (the maximum supported by the RG16f format, for example) in the black areas so that the algorithm could work with any texture size; however, in practice I noticed some strange behavior when using values higher than 4096. Since in my project I stick with 512×512 textures, 4096 is more than enough.
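Here is a minimal Python sketch of what the two parts compute per pixel; the 0.1 threshold is a hypothetical value of mine, not one taken from the material:

```python
import math

INVALID = 4096.0  # big sentinel written into the "empty" (black) areas

def mask_value(color, uv, threshold=0.1):
    # Color distance to black inside the RGB cube; more robust than a
    # grayscale (dot product) conversion for keying out a dark background.
    dist_to_black = math.sqrt(sum(c * c for c in color))
    if dist_to_black > threshold:
        return uv                  # valid: keep this pixel's own UVs
    return (INVALID, INVALID)      # invalid: to be overwritten by Jump Flood
```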

5 – The “Jump Flood” Material

After the mask has been created, we have to implement the actual algorithm. This is done in the material “MAT_Distance_JumpFlood”. As explained before, we read the pixels to compute a distance and store the result into the texture. We repeat the process a number of times defined by the texture size. In Unreal this translates into the use of two render targets: we need one to read and one to write (then we switch their roles).

As for the Mask material, let’s make sure our Jump Flood material is set to the “User Interface” material domain to make it cheaper to run. Now let’s edit it with the following graph:

The graph is divided into the following parts:

  • The texture input node (“txt_input”), which will receive the mask (first pass) or the other iterations of the Jump Flood algorithm.
  • “TexCoord” (just the regular UV coordinates).
  • “StepDistance”, a parameter that will be controlled externally by the blueprint actor.
  • The “Custom” node, where we will put the Jump Flood HLSL code.
  • The Component Mask node: totally useless in this case, can be skipped. 😀

Now let’s focus on the Custom node, since this is where the actual work happens. First be sure to set it up as follows:

The output type needs to be Float 2, since we are rendering into Render Targets that only have a Red and a Green channel (RG16f). The rest are just the parameters that we will feed to the code of the node. As for the code itself, here is what it looks like:

// Track the smallest distance found so far and the seed coordinates behind it.
float BestDistance = 9999.0;
float2 BestCoord = float2(1.0, 1.0);

// Visit the 3x3 neighborhood, offset by the current step distance.
for( int y = -1; y <= 1; ++y )
{
	for( int x = -1; x <= 1; ++x )
	{
		// Position of the neighbor pixel to read.
		float2 SampleCoord = UV + float2(x,y) * Step;
		// Seed coordinates previously stored in that neighbor (RG channels).
		float2 SeedCoord = Tex.SampleLevel(TexSampler, SampleCoord, 0).rg;
		// Distance from the current pixel to that seed.
		float Dist = length(SeedCoord - UV);

		// Keep the closest seed found so far.
		if( Dist < BestDistance )
		{
			BestDistance = Dist;
			BestCoord = SeedCoord;
		}
	}
}

return BestCoord;

It’s rather simple; let’s dive into the details:

  • The first two lines initialize variables that will be reused; that’s why they are outside the scope of the loop. BestDistance is set to a high number because we expect to find a much smaller value while iterating in the loop.
  • The for loops are just a quick way to read all the neighbor pixels as described in the previous parts. That’s why the two loops are nested: they cover all the possible positions (corners, sides, top, bottom and center).
  • SampleCoord defines the position of the pixel we are going to read. The Step variable, given from outside the custom node, defines how far away it is.
  • SeedCoord is the value sampled from the input texture.
  • Dist is the distance between the current pixel position (UV coordinates) and the neighbor pixel’s stored value.
  • The if() compares the measured distance with the previously saved one.
  • If we found a smaller distance, we update the BestDistance and BestCoord variables.
  • Finally we output the best coordinates found for this pixel by saving the position inside the texture, which will be re-read by the next iteration of the Jump Flood algorithm.

This might be confusing: why output the UV coordinates instead of the distance? Mostly for two reasons. First, each iteration reads the pixel value, which contains the nearest pixel’s coordinates, and we need this position to do our comparisons. Second, this information will be more useful later, depending on the effect we want to achieve.
That’s also why in the Mask material we wrote a very high value: it is treated as “invalid” and therefore gets overwritten by any closer value.

Also, how many iterations are necessary to get the right result? This can be determined simply from the texture resolution. Since we halve the distance at each iteration, the last iteration should sample the pixels directly next to the current one. For a 512×512 pixel texture we can compute it with log2(512), which gives us 9 iterations:

  1. 256
  2. 128
  3. 64
  4. 32
  5. 16
  6. 8
  7. 4
  8. 2
  9. 1
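The same ladder of offsets can be derived from the resolution in a couple of lines of Python:

```python
import math

def jump_flood_offsets(size):
    # Pixel offsets halve from size/2 down to 1: log2(size) passes in total.
    offsets = []
    offset = size // 2
    while offset >= 1:
        offsets.append(offset)
        offset //= 2
    return offsets
```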

The progressive iterations should result in something like this (slowed down here for viewing purposes):

Dilation via UV advection

Since the Jump Flood computed the nearest pixels and stored their UV coordinates as the result, we end up with a UV advection texture. That means we can use that texture as our new UV coordinates to sample our original image and create a dilation of the original pixels:

Here txt_input is the Jump Flood result texture and txt_base is the original image.
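On the CPU, this advection step would look like the sketch below; for simplicity the Jump Flood result here stores integer pixel coordinates rather than 0..1 UVs:

```python
def dilate(base, jump_flood_result, size):
    # Every pixel takes the color of its nearest seed by sampling the
    # original image at the coordinates stored by the Jump Flood pass.
    out = [[None] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            seed_x, seed_y = jump_flood_result[y][x]
            out[y][x] = base[int(seed_y)][int(seed_x)]
    return out
```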

So here is what we get, going from left to right: the original image, the Jump Flood result, the UV advection result (dilation) and the combination (with the distance as a mask on top).

Note: the colors may not look uniform because the way I extracted the mask is not perfect, resulting in sampled pixels that blend to black and give non-uniform colors.

Bibliography

The first two links cover the Jump Flood algorithm, while the other two provide additional information (especially about good blur and bloom effects for realtime rendering):