The example on this page describes how to create a custom Renderer Feature that performs a full screen blit.
Note: Unity no longer develops or improves the rendering path that doesn’t use the render graph API. Use the render graph API instead when developing new graphics features. This example uses the render graph API; for the Compatibility Mode (Render Graph Disabled) approach, refer to the link at the end of this page.
This example implements the following solution:
A custom Renderer Feature calls a custom Render Pass.
The Render Pass blits the camera color texture of the current renderer to a destination texture, then sets the destination texture as the new camera color target. The blit draws a full screen mesh for both eyes.
The example includes the shader that performs the GPU side of the rendering. The shader samples the color buffer using XR sampler macros.
To follow the steps in this example, you need a scene with the following GameObjects. Create a new scene, then:
Create a Cube. Ensure that the Cube is clearly visible from the main Camera.
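If you prefer to set up the scene from a script instead of the Editor, the following is a minimal sketch; the ExampleSceneSetup class name and the cube placement are illustrative assumptions, not part of the original example.

using UnityEngine;

// Minimal sketch: creates a Cube in front of the main Camera so that the
// full screen blit effect is clearly visible. The class name and the
// placement distance are illustrative assumptions.
public class ExampleSceneSetup : MonoBehaviour
{
    void Start()
    {
        var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        var mainCamera = Camera.main;
        if (mainCamera != null)
        {
            // Place the Cube 3 units in front of the main Camera.
            cube.transform.position = mainCamera.transform.position + mainCamera.transform.forward * 3f;
        }
    }
}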
Now you have the scene necessary to follow the steps in this example.
This section assumes that you created a scene as described in section Create example scene and GameObjects.
Follow these steps to create a custom Renderer Feature with a custom Render Pass.
Create a new C# script. Call it ColorBlitRendererFeature.cs. This script implements the custom Renderer Feature.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

internal class ColorBlitRendererFeature : ScriptableRendererFeature
{
    public Shader shader;
    [Range(0f, 1f)] public float intensity = 1f;

    Material m_Material;
    ColorBlitPass m_RenderPass;

    // Use this method to add one or more Render Passes into the rendering sequence of the renderer with the EnqueuePass method.
    public override void AddRenderPasses(ScriptableRenderer renderer,
        ref RenderingData renderingData)
    {
        if (m_Material == null)
            return;

        if (renderingData.cameraData.cameraType != CameraType.Game)
            return;

        m_RenderPass.SetIntensity(intensity);
        renderer.EnqueuePass(m_RenderPass);
    }

    // Use the Create method to initialize the resources. Unity calls this method when the Renderer Feature loads, and when you change its properties.
    // Use the Dispose method to clean up the resources before the renderer feature is disabled.
    public override void Create()
    {
        m_Material = CoreUtils.CreateEngineMaterial(shader);
        m_RenderPass = new ColorBlitPass(m_Material);
    }

    protected override void Dispose(bool disposing)
    {
        CoreUtils.Destroy(m_Material);
    }
}
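Because AddRenderPasses copies the public intensity field into the render pass every frame, a regular script can drive the effect at runtime. The following is a minimal sketch, not part of the original example; the ColorBlitIntensityAnimator class name and the Inspector-assigned reference to the Renderer Feature asset are assumptions.

using UnityEngine;

// Minimal sketch: animates the intensity of the ColorBlitRendererFeature at runtime.
// Assign the ColorBlitRendererFeature asset to the field in the Inspector.
// Both scripts must be in the same assembly, because the feature is declared internal.
public class ColorBlitIntensityAnimator : MonoBehaviour
{
    [SerializeField] ColorBlitRendererFeature colorBlitFeature;

    void Update()
    {
        if (colorBlitFeature == null)
            return;

        // Ping-pong the intensity between 0 and 1 over time.
        // Note: in the Editor, this changes the value stored in the Renderer Feature asset.
        colorBlitFeature.intensity = Mathf.PingPong(Time.time, 1f);
    }
}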
Create a new C# script. Call it ColorBlitPass.cs. This script implements the custom Render Pass that performs the custom blit draw call.
This render pass uses the AddBlitPass API to perform the blit operation.
Note: Do not use the cmd.Blit method in URP XR projects, because that method has compatibility issues with the URP XR integration. Using cmd.Blit might implicitly enable or disable XR shader keywords, which breaks XR Single Pass Instanced (SPI) rendering.
using UnityEngine;
using UnityEngine.Rendering.RenderGraphModule;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
using UnityEngine.Rendering.RenderGraphModule.Util;

public class ColorBlitPass : ScriptableRenderPass
{
    private const string k_PassName = "ColorBlitPass";

    private Material m_Material;
    private float m_Intensity;

    private static readonly int k_IntensityID = Shader.PropertyToID("_Intensity");

    public ColorBlitPass(Material mat)
    {
        m_Material = mat;
        renderPassEvent = RenderPassEvent.BeforeRenderingPostProcessing;
    }

    public void SetIntensity(float intensity)
    {
        m_Intensity = intensity;
    }

    // Use the RecordRenderGraph method to configure the input and output parameters for the AddBlitPass method and execute the AddBlitPass method.
    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        var resourceData = frameData.Get<UniversalResourceData>();

        // The following line ensures that the render pass doesn't blit
        // from the back buffer.
        if (resourceData.isActiveTargetBackBuffer)
        {
            Debug.LogError($"Skipping render pass. ColorBlitRendererFeature requires an intermediate ColorTexture, we can't use the BackBuffer as a texture input.");
            return;
        }

        var source = resourceData.activeColorTexture;

        // Define the texture descriptor for creating the destination render graph texture.
        var destinationDesc = renderGraph.GetTextureDesc(source);
        destinationDesc.name = $"CameraColor-{k_PassName}";
        destinationDesc.clearBuffer = false;
        destinationDesc.depthBufferBits = 0;

        // Create the texture.
        TextureHandle destination = renderGraph.CreateTexture(destinationDesc);

        // The AddBlitPass method adds the render graph pass that blits from the source to the destination texture.
        RenderGraphUtils.BlitMaterialParameters para = new(source, destination, m_Material, 0);
        para.material.SetFloat(k_IntensityID, m_Intensity);
        renderGraph.AddBlitPass(para, passName: k_PassName);

        // Use the destination texture as the camera texture to avoid the extra blit from the destination texture back to the camera texture.
        resourceData.cameraColor = destination;
    }
}
Create the shader that performs the blit operation. Call the shader file ColorBlit.shader. The vertex function outputs the full-screen quad position. The fragment function samples the color buffer and returns the color * float4(0, _Intensity, 0, 1) value to the render target.
Shader "ColorBlit"
{
SubShader
{
Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
ZWrite Off Cull Off
Pass
{
Name "ColorBlitPass"
HLSLPROGRAM
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
#include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"
#pragma vertex Vert
#pragma fragment Frag
float _Intensity;
float4 Frag(Varyings input) : SV_Target
{
// This function handles the different ways XR platforms handle texture arrays.
UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
// Sample the texture using the SAMPLE_TEXTURE2D_X_LOD function
float2 uv = input.texcoord.xy;
half4 color = SAMPLE_TEXTURE2D_X_LOD(_BlitTexture, sampler_LinearRepeat, uv, _BlitMipLevel);
// Modify the sampled color
return half4(0, _Intensity, 0, 1) * color;
}
ENDHLSL
}
}
}
Add the ColorBlitRendererFeature to the Universal Renderer asset.
For information on how to add a Renderer Feature, refer to the page How to add a Renderer Feature to a Renderer.
For this example, set the Intensity property to 1, the maximum value the Range(0, 1) attribute allows.
With this setting, Unity renders the Game view with a green tint, because the shader multiplies the camera color by float4(0, _Intensity, 0, 1) and keeps only the green channel.
Note: To visualize the example in XR, configure the project to use XR SDK. Add the MockHMD XR Plugin to the project. Set the Render Mode property to Single Pass Instanced.
The example is complete.
For more information on performing the blit operation in Compatibility Mode, refer to the Using textures section in the URP 14 (Unity 2022) documentation.
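For reference, a Compatibility Mode render pass typically overrides Execute and uses the Blitter API instead of cmd.Blit. The following is a minimal sketch only, assuming URP 14 or later with Compatibility Mode enabled; the ColorBlitCompatibilityPass class name, the temporary texture name, and the structure of the pass are assumptions based on the Blitter and RTHandle APIs, not code from this example.

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Minimal Compatibility Mode sketch: blits the camera color target through the
// ColorBlit material with the Blitter API, which is XR compatible, unlike cmd.Blit.
public class ColorBlitCompatibilityPass : ScriptableRenderPass
{
    Material m_Material;
    RTHandle m_TempTexture;
    float m_Intensity;
    static readonly int k_IntensityID = Shader.PropertyToID("_Intensity");

    public ColorBlitCompatibilityPass(Material material)
    {
        m_Material = material;
        renderPassEvent = RenderPassEvent.BeforeRenderingPostProcessing;
    }

    public void SetIntensity(float intensity) => m_Intensity = intensity;

    public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
    {
        // Allocate a temporary texture that matches the camera color target.
        var descriptor = renderingData.cameraData.cameraTargetDescriptor;
        descriptor.depthBufferBits = 0;
        RenderingUtils.ReAllocateIfNeeded(ref m_TempTexture, descriptor, name: "_ColorBlitTemp");
    }

    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        if (m_Material == null)
            return;

        CommandBuffer cmd = CommandBufferPool.Get("ColorBlitPass");
        RTHandle cameraColor = renderingData.cameraData.renderer.cameraColorTargetHandle;

        m_Material.SetFloat(k_IntensityID, m_Intensity);

        // Apply the material while copying the camera color to the temporary texture,
        // then copy the result back to the camera color target.
        Blitter.BlitCameraTexture(cmd, cameraColor, m_TempTexture, m_Material, 0);
        Blitter.BlitCameraTexture(cmd, m_TempTexture, cameraColor);

        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
    }

    public void Dispose()
    {
        m_TempTexture?.Release();
    }
}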