Physically based shading code has been refactored in Unity 5.5 (UnityStandardBRDF.cginc and related files). In most cases this does not affect your shader code directly, unless you are calling some of these functions manually. Notable changes are listed below.
There are now functions to convert between smoothness, roughness and perceptual roughness: PerceptualRoughnessToRoughness, RoughnessToPerceptualRoughness, SmoothnessToRoughness, RoughnessToSmoothness.
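For example, a fragment shader working with a Standard-style smoothness value could chain these conversions as in the sketch below. This is only an illustration: the _Glossiness property and variable names are assumptions, and it presumes UnityStandardBRDF.cginc is included.

half smoothness = _Glossiness;                                       // hypothetical material property
half roughness = SmoothnessToRoughness(smoothness);                  // roughness used by the BRDF terms
half perceptualRoughness = RoughnessToPerceptualRoughness(roughness); // back to the perceptual scale
// The visibility term (see below) expects roughness, not perceptualRoughness.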
The visibility term in UnityStandardBRDF.cginc takes roughness and not perceptualRoughness.
In older versions of Unity, it was possible to do a remapping to match Marmoset roughness. With the move to GGX this no longer matches, so the UNITY_GLOSS_MATCHES_MARMOSET_TOOLBAG2 definition and its associated code have been removed.
All reads and writes into the Gbuffer should go through the new functions UnityStandardDataToGbuffer / UnityStandardDataFromGbuffer.
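As an illustration, a deferred pass might pack and unpack its data roughly as follows. Treat this as a sketch: the UnityStandardData field names (diffuseColor, occlusion, specularColor, smoothness, normalWorld) reflect the built-in shader source as I understand it, and albedo, occlusion, specColor, smoothness and normalWorld are assumed inputs.

// Packing material data (outGBuffer0/1/2 are the shader's out half4 render targets).
UnityStandardData data;
data.diffuseColor  = albedo;
data.occlusion     = occlusion;
data.specularColor = specColor;
data.smoothness    = smoothness;
data.normalWorld   = normalWorld;
UnityStandardDataToGbuffer(data, outGBuffer0, outGBuffer1, outGBuffer2);

// Reading it back in a pass that samples the Gbuffer:
UnityStandardData unpacked = UnityStandardDataFromGbuffer(gbuffer0, gbuffer1, gbuffer2);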
Your shader code should call UnityGlossyEnvironmentSetup() to initialize a Unity_GlossyEnvironmentData struct instead of doing it manually. The roughness variable of Unity_GlossyEnvironmentData is actually “perceptual roughness”, but it has not been renamed, to avoid breaking existing shader code. Note: UnityGlossyEnvironmentSetup takes smoothness as a parameter and calculates perceptual roughness from it.
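A minimal sketch of the intended pattern is shown below; the parameter order of UnityGlossyEnvironmentSetup and the Unity_GlossyEnvironment call follow the built-in shader source as far as I know, so check them against your Unity version. worldViewDir, normalWorld, specColor and smoothness are assumed inputs.

// Let Unity initialize the struct instead of filling it in manually.
Unity_GlossyEnvironmentData g = UnityGlossyEnvironmentSetup(smoothness, worldViewDir, normalWorld, specColor);
// g.roughness now holds perceptual roughness, despite the name.
half3 specularGI = Unity_GlossyEnvironment(UNITY_PASS_TEXCUBE(unity_SpecCube0), unity_SpecCube0_HDR, g);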
The ndotl variable in UnityLight is now calculated on the fly, and any value written into the variable is ignored.
The UNITY_GI macro is deprecated and should not be used anymore.
Unity 5.5 now handles DX9 half-pixel offset rasterization in the background, which means you no longer need to fix DX9 half-pixel issues either in shaders or in code. See more details in this blog post. If you use any of these checks in your code, they can now be removed:
The way Unity solves this now is by inserting half-pixel adjustment code into all vertex shaders that are being compiled. As a result, vertex shader constant register c255 becomes reserved by Unity, two instructions are added to all shaders, and one more temporary register is used. This should not create problems, unless your vertex shaders already use all of the available resources (constant/temporary registers and instruction slots) to the absolute maximum.
The Z-buffer (depth buffer) direction has been inverted: the Z-buffer now contains 1.0 at the near plane and 0.0 at the far plane. Combined with a floating point depth buffer, this significantly increases depth buffer precision, resulting in less Z-fighting and better shadows, especially when using small near planes and large far planes.
Graphics API changes:
The following macros/functions handle reversed Z situations without any other steps. If your shader was already using them, then no changes are needed on your side:
However, if you are fetching the Z buffer value manually, you will need to write code similar to this:
float z = tex2D(_CameraDepthTexture, uv);
#if defined(UNITY_REVERSED_Z)
z = 1.0f - z; // undo the reversed direction so z is again 0.0 at the near plane and 1.0 at the far plane
#endif
For clip space depth you can use the following macro. Please note that this macro will not alter clip space on OpenGL/ES platforms, where it will remain [-near, far]:
float clipSpaceRange01 = UNITY_Z_0_FAR_FROM_CLIPSPACE(rawClipSpace);
_ZBufferParams now contains these values on platforms with a reversed depth buffer. See documentation on platform-specific rendering differences for more information.
x = -1+far/near
y = 1
z = x/far
w = 1/far
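As a worked example of how these values are used, the standard depth linearization helpers in UnityCG.cginc reduce to the expressions below. This is a sketch of the usual definitions; verify them against your version of UnityCG.cginc.

// Depth linearization in terms of _ZBufferParams
// (x = -1 + far/near, y = 1, z = x/far, w = 1/far on reversed-Z platforms).
inline float Linear01Depth(float z)  { return 1.0 / (_ZBufferParams.x * z + _ZBufferParams.y); }
inline float LinearEyeDepth(float z) { return 1.0 / (_ZBufferParams.z * z + _ZBufferParams.w); }
// Check: raw z = 1.0 (near plane) gives Linear01Depth = near/far and LinearEyeDepth = near;
// raw z = 0.0 (far plane) gives Linear01Depth = 1.0 and LinearEyeDepth = far.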
Z-bias is handled automatically by Unity, but if you are using a native code rendering plugin you will need to negate it in your C/C++ code on matching platforms.
All subfolders of the folder named “Editor” will be excluded from the build and will not load in Play mode in the Unity Editor. Previously a subfolder named “Resources” would have its assets included in the build. These assets are still loadable by calling Resources.Load() in your Editor scripts.
For example, these files are excluded from the build and will not load when in Play mode in the Editor, but will load if called from scripts:
These assets will load in all situations:
Previously the ‘Backface Tolerance’ parameter in Lightmap Parameters was not applied when using final gather for baked GI. It is now applied correctly. The parameter now affects both the realtime GI and baked GI stages (including the final gather).
Affected scenes are mainly ones with single sided geometry (like billboards), where it is important to be able to adjust the ‘Backface Tolerance’ in order to avoid invalidating texels that are seeing the backface of single sided geometry. In scenes that use billboards and final gather, the lightmaps can now be improved by adjusting ‘Backface Tolerance’; however, other scenes might also be affected if a non-default ‘Backface Tolerance’ is applied, since it is now correctly accounted for in the final gather stage.
BRDF2, the Standard shader type set on mobile platforms by default, now uses a GGX approximation instead of Blinn-Phong. This makes it look closer to BRDF1 (used on desktops by default) and improves visual quality.
Should you need to preserve the legacy approximation, modify the BRDF2 code in UnityStandardBRDF.cginc: the new implementation sits inside the #if UNITY_BRDF_GGX statement (which is also used by BRDF1 to pick GGX). Either change the definition in UnityStandardConfig.cginc, or change #if UNITY_BRDF_GGX to #if 0 in the BRDF2_Unity_PBS function.
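As a sketch of the second option (the surrounding code is abbreviated; only the preprocessor condition matters here):

// Inside BRDF2_Unity_PBS in UnityStandardBRDF.cginc
#if 0 // was: #if UNITY_BRDF_GGX -- forces the legacy Blinn-Phong path
    // ... GGX approximation ...
#else
    // ... legacy Blinn-Phong approximation ...
#endif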
You can now use Gradle to build for Android.
Gradle is stricter about errors than the existing Unity Android build system, which means that some existing projects may be hard to convert to Gradle. See documentation on Gradle troubleshooting to identify and solve these build failures.
The overload of the Instantiate function that takes a parameter for the original GameObject and one for a parent Transform has been changed to work differently. It no longer interprets the original GameObject’s position and rotation as a world space position and rotation, ignoring the position and rotation of the specified parent Transform.
It now interprets the position and rotation as a local position and rotation within the space of the specified parent Transform, by default. This is consistent with behavior in the Editor. Your scripts will not be automatically updated, so calls to this overload of Instantiate that have not been updated to account for this change may produce unexpected behavior.
Disabling a LOD Group component no longer disables the Renderers attached to it. The LOD Group settings only apply to the Renderers when the LOD Group component is enabled. Unity automatically applies this change when you upgrade your project, and the change cannot be reverted.