Pixelate Filter in HDRP using Compute Shaders

HDRP Pixelate Example

In this article, we’ll complete the trifecta and port our Pixelate image filter to HDRP. If you haven’t already, I recommend reading the first post to understand the full context. This post focuses on the changes required to make the compute shader filter work in HDRP rather than on the compute shader itself.

CustomPostProcessVolumeComponent

For this HDRP version, we’ll use a Custom Post-Process Volume Component, HDRP’s system for extending the renderer with additional post-processing effects. HDRP also supports custom render passes, but those are intended for adding new ways to render objects rather than for fullscreen effects.
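
Before we get into the Pixelate-specific code, here’s roughly what the shell of a custom post-process volume component looks like. This is only a sketch of the hooks the base class gives us; the class name is illustrative, and we’ll build up the real effect over the rest of the post.

using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

// Sketch: the hooks a CustomPostProcessVolumeComponent exposes.
public sealed class SkeletonEffect : CustomPostProcessVolumeComponent, IPostProcessComponent
{
    // Lets the system skip the effect entirely when it isn't needed.
    public bool IsActive() => true;

    // Where in the frame HDRP injects the effect (more on this later).
    public override CustomPostProcessInjectionPoint injectionPoint =>
        CustomPostProcessInjectionPoint.AfterPostProcess;

    // Called before the effect is first rendered; allocate resources here.
    public override void Setup() { }

    // Called for each camera the effect applies to; record rendering commands here.
    public override void Render(CommandBuffer cmd, HDCamera camera, RTHandle source, RTHandle destination) { }

    // Called when the effect is destroyed; release resources here.
    public override void Cleanup() { }
}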

The CustomPostProcessVolumeComponent has a Render method that we can override. The Render method receives a CommandBuffer, an HDCamera, and two render texture handles for the source and destination. In it, we’ll set up the Pixelate Compute Shader variables and dispatch the shader. If you’ve been following along with the series, note that I’ve modified the Pixelate Compute Shader for this version. For one, I’ve added another RWTexture to hold the source texture, so we can read from the source and write to the destination. This saves us from having to blit from the source to the destination before dispatching the filter.

Additionally, HDRP provides macros to declare RWTextures and to index into them. Without these macros, the shader won’t run in DirectX. So here’s the modified Pixelate.compute file; notice the use of the RW_TEXTURE2D_X and COORD_TEXTURE2D_X macros.

#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"
#include "Packages/com.unity.render-pipelines.high-definition/Runtime/ShaderLibrary/ShaderVariables.hlsl"

#pragma kernel Pixelate

// HDRP's _X macros declare the textures so they work whether the camera renders
// to a plain texture or to a texture array.
RW_TEXTURE2D_X(float4, _ImageFilterSource);
RW_TEXTURE2D_X(float4, _ImageFilterResult);

int _BlockSize;
uint _ResultWidth;
uint _ResultHeight;

[numthreads(8,8,1)]
void Pixelate (uint3 id : SV_DispatchThreadID)
{
    if (id.x >= _ResultWidth || id.y >= _ResultHeight)
        return;

    // Each thread handles one block; startPos is the block's top-left pixel.
    const uint2 startPos = id.xy * _BlockSize;
    
    if (startPos.x >= _ResultWidth || startPos.y >= _ResultHeight)
        return;
    
    const int blockWidth = min(_BlockSize, _ResultWidth - startPos.x);
    const int blockHeight = min(_BlockSize, _ResultHeight - startPos.y);
    const int numPixels = blockHeight * blockWidth;
    
    // Average the colour of every pixel in the block.
    float4 colour = float4(0, 0, 0, 0);
    for (int x = 0; x < blockWidth; ++x)
    {
        for (int y = 0; y < blockHeight; ++y)
        {
            const uint2 pixelPos = uint2(startPos.x + x, startPos.y + y);
            colour += _ImageFilterSource[COORD_TEXTURE2D_X(pixelPos)];
        }
    }
    colour /= numPixels;

    // Write the averaged colour back to every pixel in the block.
    for (int i = 0; i < blockWidth; ++i)
    {
        for (int j = 0; j < blockHeight; ++j)
        {
            const uint2 pixelPos = uint2(startPos.x + i, startPos.y + j);
            _ImageFilterResult[COORD_TEXTURE2D_X(pixelPos)] = colour;
        }
    }
}

With that context, here’s the Render method from the CustomPostProcessVolumeComponent. FilterComputeShader here is the Pixelate compute shader asset; we’ll see how to supply it through a volume parameter in the next section.

public override void Render(CommandBuffer cmd, HDCamera camera, RTHandle source, RTHandle destination)
{
    var mainKernel = FilterComputeShader.FindKernel("Pixelate");
    FilterComputeShader.GetKernelThreadGroupSizes(mainKernel, out uint xGroupSize, out uint yGroupSize, out _);
    cmd.SetComputeTextureParam(FilterComputeShader, mainKernel, "_ImageFilterSource", source.nameID);
    cmd.SetComputeTextureParam(FilterComputeShader, mainKernel, "_ImageFilterResult", destination.nameID);
    cmd.SetComputeIntParam(FilterComputeShader, "_BlockSize", BlockSize.value);
    cmd.SetComputeIntParam(FilterComputeShader, "_ResultWidth", destination.rt.width);
    cmd.SetComputeIntParam(FilterComputeShader, "_ResultHeight", destination.rt.height);
    // Each thread covers a whole block, so divide the resolution by the block
    // size as well as the thread group size when computing the group counts.
    cmd.DispatchCompute(FilterComputeShader, mainKernel,
        Mathf.CeilToInt(destination.rt.width / (float) BlockSize.value / xGroupSize),
        Mathf.CeilToInt(destination.rt.height / (float) BlockSize.value / yGroupSize),
        1);
}
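
To make the dispatch math concrete, take a hypothetical 1920×1080 destination with a block size of 5: since the shader uses 8×8 thread groups and each thread averages and fills an entire block rather than a single pixel, we dispatch ceil(1920 / 5 / 8) = 48 groups in x and ceil(1080 / 5 / 8) = 27 groups in y.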

As you can see, we sequence the CommandBuffer with a series of commands to set all the Compute Shader variables and dispatch the shader. There’s no other boilerplate necessary this time. However, there are a few adjustable parameters that we should expose in the inspector, so we’ll cover that next.

Setting Parameters

A CustomPostProcessVolumeComponent can have public parameters that appear in the inspector. However, we must wrap them in a VolumeParameter class; otherwise, they won’t show up. Wrapping the parameters lets the volume system handle enabling and disabling effect overrides, and it forces us to set a reasonable default value. We’ll add an int to control the pixel block size, a bool to toggle the effect in the scene view, and a ComputeShader field. There are built-in classes like ClampedIntParameter and BoolParameter, so the first two are easy. CustomPostProcessVolumeComponent also has an overridable property that controls whether the effect is visible in the scene view, so let’s connect that as well.

public ClampedIntParameter BlockSize = new ClampedIntParameter(5, 2, 20);
public BoolParameter ShowInSceneView = new BoolParameter(false);

public override bool visibleInSceneView => ShowInSceneView.value;

Custom Parameters

To supply a ComputeShader, we need a new class that inherits from VolumeParameter. Let’s create the class and write a constructor.

using UnityEngine;
using UnityEngine.Rendering;

[System.Serializable]
public class ComputeShaderParameter : VolumeParameter<ComputeShader>
{
    public ComputeShaderParameter(ComputeShader value, bool overrideState = false)
        : base(value, overrideState)
    {
    }
}

Now jump back to the post-process volume component and add another parameter. We also want to disable the effect when no compute shader is supplied. To do that, we add an IsActive method, which comes from implementing the IPostProcessComponent interface.

public ComputeShaderParameter FilterComputeShaderParameter = new ComputeShaderParameter(null);

public bool IsActive() => FilterComputeShaderParameter.value != null;

Finally, we must specify the injection point, that is, the point in the render pipeline at which this effect runs. To do this, we override the injectionPoint property. In our case, we want to run last, so we’ll use AfterPostProcess.

public override CustomPostProcessInjectionPoint injectionPoint => CustomPostProcessInjectionPoint.AfterPostProcess;

This completes the image filter code. However, even if you create a Volume in your scene and add the effect, you won’t see it yet, because we also have to register it in the Project Settings. Open Edit > Project Settings, select the HDRP Default Settings tab, then scroll down to Custom Post Process Orders. Here we add our effect under the desired injection point, After Post Process. Now you can add the effect to a volume in your scene and bask in the high-def fuzzy pixel glory. Try automating the BlockSize to make amazing pixel wipe effects, like the sketch below.
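
Here’s one way you might automate BlockSize for that pixel wipe idea. It’s a separate, optional helper and only a minimal sketch: it assumes the PixelateImageFilter override has been added to a Volume in the scene, and the class name and timing values are purely illustrative.

using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical helper: ping-pongs the pixel block size on a Volume that has the
// PixelateImageFilter override, producing a simple pixel wipe.
[RequireComponent(typeof(Volume))]
public class PixelateWipe : MonoBehaviour
{
    public float Duration = 2f; // seconds for one sweep from smallest to largest blocks

    private PixelateImageFilter _pixelate;

    private void Awake()
    {
        // Look the override up on this volume's profile instance.
        GetComponent<Volume>().profile.TryGet(out _pixelate);
    }

    private void Update()
    {
        if (_pixelate == null)
            return;

        // Sweep the block size back and forth between its clamped bounds (2..20).
        float t = Mathf.PingPong(Time.time / Duration, 1f);
        _pixelate.BlockSize.overrideState = true;
        _pixelate.BlockSize.value = (int) Mathf.Lerp(2f, 20f, t);
    }
}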

Here’s the entire image filter code.

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;
using System;

[Serializable, VolumeComponentMenu("Post-processing/Custom/PixelateImageFilter")]
public sealed class PixelateImageFilter : CustomPostProcessVolumeComponent, IPostProcessComponent
{
    public ClampedIntParameter BlockSize = new ClampedIntParameter(5, 2, 20);
    public BoolParameter ShowInSceneView = new BoolParameter(false);
    public ComputeShaderParameter FilterComputeShaderParameter = new ComputeShaderParameter(null);
    
    public bool IsActive() => FilterComputeShaderParameter.value != null;

    public override bool visibleInSceneView => ShowInSceneView.value;

    public override CustomPostProcessInjectionPoint injectionPoint => CustomPostProcessInjectionPoint.AfterPostProcess;

    public override void Render(CommandBuffer cmd, HDCamera camera, RTHandle source, RTHandle destination)
    {
        var filterComputeShader = FilterComputeShaderParameter.value;
        var mainKernel = filterComputeShader.FindKernel("Pixelate");
        filterComputeShader.GetKernelThreadGroupSizes(mainKernel, out uint xGroupSize, out uint yGroupSize, out _);
        cmd.SetComputeTextureParam(filterComputeShader, mainKernel, "_ImageFilterSource", source.nameID);
        cmd.SetComputeTextureParam(filterComputeShader, mainKernel, "_ImageFilterResult", destination.nameID);
        cmd.SetComputeIntParam(filterComputeShader, "_BlockSize", BlockSize.value);
        cmd.SetComputeIntParam(filterComputeShader, "_ResultWidth", destination.rt.width);
        cmd.SetComputeIntParam(filterComputeShader, "_ResultHeight", destination.rt.height);
        cmd.DispatchCompute(filterComputeShader, mainKernel,
            Mathf.CeilToInt(destination.rt.width / (float) BlockSize.value / xGroupSize),
            Mathf.CeilToInt(destination.rt.height / (float) BlockSize.value / yGroupSize),
            1);
    }
}

So we’ve finally completed the trifecta: we can now write Compute Shader post-processing effects across all three rendering pipelines. Unfortunately, we had to use HDRP macros in this version, so the shader is no longer entirely cross-compatible. If you find yourself needing that compatibility, I suggest implementing your own macros that branch according to the render pipeline in use.

Check out the project here on GitHub. If you like my work, consider joining my mailing list, and I’ll email you whenever a new post is released.
