A better method to recalculate normals in Unity

A visible seam appearing after recalculating normals at runtime.

You might have noticed that, for some meshes, calling Unity’s built-in RecalculateNormals() method makes things look different (i.e. worse) than when normals are calculated through the import settings. A similar problem appears when recalculating normals after combining meshes, with obvious seams between them. In this post, I’m going to show you how RecalculateNormals() works in Unity, and how and why it is very different from normal calculation during model import. Moreover, I will offer you a fast solution that fixes this problem.

This article is also useful to those who want to generate 3D meshes dynamically during gameplay, not just those who have encountered this problem.

Some background…

I’m going to explain some basic concepts as briefly as I can, just to provide some context. Feel free to skip this section if you wish.

Directional Vectors

Directional vectors are not to be confused with point vectors, even though we use exactly the same representation in code. Vector3(0, 1, 1) as a point vector simply describes a single point along the X, Y and Z axes respectively. As a directional vector, it describes the direction we face when we stand at the origin point (Vector3(0, 0, 0)) and look towards the point Vector3(0, 1, 1).

If we draw a line from Vector3(0, 0, 0) towards Vector3(0, 1, 1), we will notice that this line has a length of 1.414214 units. This is called the magnitude or length of a vector, and it is equal to √(X² + Y² + Z²). A normalized vector is one that has a magnitude of 1. The normalized version of Vector3(0, 1, 1) is approximately Vector3(0, 0.707, 0.707), and we get it by dividing the vector by its magnitude. When using directional vectors, it is important to keep them normalized (unless you really know what you’re doing), because a lot of mathematical calculations depend on that.
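
Unity’s Vector3 exposes both of these as built-in properties; here is a quick sketch:

```csharp
Vector3 v = new Vector3(0, 1, 1);

float magnitude = v.magnitude;  // sqrt(0*0 + 1*1 + 1*1) ≈ 1.414214
Vector3 unit = v.normalized;    // v divided by its magnitude ≈ (0, 0.707, 0.707)
```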

Normals

Normals are mostly used to calculate the shading of a surface. They are directional vectors that define where a surface is “looking at”, i.e. a normal is a directional vector pointing away from a face (i.e. a surface) made up of three or more vertices (i.e. points). More accurately, normals are actually stored on the vertices themselves instead of the face. This means that a flat surface of three points actually has three identical normals.

A normal is not to be confused with a normalized vector, although normals are normalized – otherwise, light calculations would look wrong.

How Smoothing works

Smoothing works by averaging the normals of adjacent faces. What this means is that the normal of each vertex is not the same as that of its face, but rather the average of the normals of all the faces it belongs to. This also means that, in smooth shading, vertices of the same face do not necessarily have identical normals.
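
For example, for a vertex shared by three faces, the smooth normal is simply the normalized sum of the three face normals, which points in their average direction:

```csharp
// n1, n2, n3: normalized face normals of the three faces sharing this vertex.
Vector3 smoothNormal = (n1 + n2 + n3).normalized;
```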

A sphere with flat and smooth shading respectively.

How Unity recalculates normals at runtime

When you call the RecalculateNormals() method on a mesh in Unity, what happens is very straightforward.

Unity stores mesh information for a list of vertices in a few different arrays: one array for vertex positions, one for normals, another for UVs, and so on. All of these arrays have the same size, and the element at a given index in each array describes one single vertex. For example, mesh.vertices[N] and mesh.normals[N] are the position and the normal, respectively, of the Nth vertex in our list of vertices.

However, there is a special array called triangles, which describes the actual faces of the mesh. It is a sequence of integer values, where each integer is the index of a vertex and every three consecutive integers form a single triangular face. This means that the size of this array is always a multiple of 3.

As a side note: Unity assumes that all faces are triangles, so even if you import a model with faces of more than three points (as is the case with the sphere above), they’re automatically converted, upon import, into multiple smaller faces of three vertices each.
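
To make this layout concrete, here is a minimal sketch that builds a single quad out of these arrays (four vertices, two triangles that share two of them):

```csharp
var mesh = new Mesh();

// Four vertex positions; normals, UVs, etc. would be arrays of the same length.
mesh.vertices = new[] {
    new Vector3(0, 0, 0), new Vector3(1, 0, 0),
    new Vector3(0, 1, 0), new Vector3(1, 1, 0)
};

// Six indices describing two triangle faces; always a multiple of 3.
mesh.triangles = new[] { 0, 2, 1,  2, 3, 1 };
```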

The source code of RecalculateNormals() is not available but, judging from its output, the following pseudo-code (sloppily mixed with some real code) follows exactly the same algorithm and produces the same result.
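
A sketch along those lines, where CalculateSurfaceNormal(...) is a stand-in helper assumed to return the normalized cross product of two edge vectors of the triangle:

```csharp
// Assumed helper: the normalized cross product of two triangle edges.
Vector3 CalculateSurfaceNormal(Vector3 a, Vector3 b, Vector3 c)
{
    return Vector3.Cross(b - a, c - a).normalized;
}

Vector3[] vertices = mesh.vertices;
int[] triangles = mesh.triangles;
Vector3[] normals = new Vector3[vertices.Length];

// Add each triangle's face normal to all three of its vertices.
for (int i = 0; i < triangles.Length; i += 3)
{
    Vector3 faceNormal = CalculateSurfaceNormal(
        vertices[triangles[i]],
        vertices[triangles[i + 1]],
        vertices[triangles[i + 2]]);

    normals[triangles[i]]     += faceNormal;
    normals[triangles[i + 1]] += faceNormal;
    normals[triangles[i + 2]] += faceNormal;
}

// Normalizing the sums gives each vertex the average direction of the
// face normals of all the triangles it participates in.
for (int i = 0; i < normals.Length; i++)
    normals[i] = normals[i].normalized;

mesh.normals = normals;
```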

Notice how I don’t explicitly average the normals, but that’s because CalculateSurfaceNormal(...) is assumed to return a normalized vector. When we normalize a sum of normalized vectors, we get their average direction.

How Unity calculates normals while importing

The exact algorithm Unity uses in this case is more complicated than RecalculateNormals(). I can think of three reasons for that:

  1. This algorithm is much slower, so Unity avoids calling it at runtime.
  2. Unity has more information while importing.
  3. Unity combines vertices after importing.

The first reason is not the real reason, because Unity could still have provided an alternative method to calculate normals at runtime that performs fast enough. The second and third reasons are actually very similar, since they both come down to one thing: after Unity imports a mesh, it becomes a new entity, independent from its source model.

During mesh import, Unity may consider vertices shared among faces to exist multiple times: once for each face. This means that when importing a cube, which has 8 vertices, Unity actually sees 36 vertices (3 vertices for each of the 2 triangles of each of the 6 sides). We will refer to this as the expanded vertex list. However, it can also see the condensed 8-vertex list at the same time. If it can’t, then I’m guessing it first silently builds that structure simply by finding which vertices are at the same position as other vertices.

As you saw in the previous section, smoothing in RecalculateNormals() only works when vertices at the same position are assumed to be one and the same. With this new data at hand, a different algorithm can be used. In the following pseudo-code, vertex represents a vertex from the condensed list of vertices and vertexPoint represents a vertex from the expanded list. We also assume that any vertexPoint has direct access to the vertex it is associated with.
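
A sketch of what that algorithm could look like, assuming vertex.vertexPoints enumerates all the vertexPoints sharing that position and each vertexPoint stores the normal of its own face as faceNormal:

```csharp
// Pseudo-code: smooth a vertex point only with faces whose normals are
// within the import smoothing angle of its own face normal.
foreach (var vertexPoint in expandedVertexList)
{
    Vector3 sum = Vector3.zero;

    foreach (var other in vertexPoint.vertex.vertexPoints)
    {
        if (Vector3.Angle(vertexPoint.faceNormal, other.faceNormal) <= smoothingAngle)
            sum += other.faceNormal;
    }

    vertexPoint.normal = sum.normalized;
}
```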

As it turns out, the final result contains neither the expanded vertex list nor the condensed vertex list. It is an optimized version which takes all the identical vertices and merges them together. And by identical, I don’t mean just the position and normals; it also takes into account the UV coordinates, tangents, bone weights, colors, etc. – any information that is stored for a single vertex.

The problem explained

As you may have guessed from the previous section, the problem is that vertices which differ in one or more aspects are considered completely different vertices in the final mesh used at runtime, even if they share the same position. You will most likely encounter this problem when your model has UV coordinates. In fact, the first image of this article was produced precisely by a model with a UV seam, after RecalculateNormals() was called.

I will not get into much detail about UV coordinates, but let me just say that they’re used for texturing, which means this is actually a very common problem; if you ever recalculate normals at runtime, this situation is bound to appear sooner or later.

You can also see this problem if you’re merging two meshes together – say, two opposing halves of a sphere – even if they contain identical information. That’s because when merging two meshes, what actually happens is that we simply append the data of one mesh onto the other, which causes vertices common to both to remain distinct in the final mesh.

The technical cause behind the problem is, as we have pointed out, that RecalculateNormals()  does not know which distinct vertices in our list have the same position as others. Unity knows this while importing, but this information is now lost.

This is also the reason RecalculateNormals() does not take an angle threshold as a parameter, contrary to calculating normals at import time. If I import a model with a 0° tolerance, then it will be impossible to have any smooth shading at runtime.

My solution

My solution to this problem is not very simple, but I have provided the full source code. It is somewhat of a hybrid between the two extremes used by Unity. I use hashing to cluster vertices that exist at the same position, achieving nearly linear complexity (in the number of vertices of the expanded list) and avoiding a brute-force approach. It is therefore fast and asymptotically optimal.

Usage

By adding my code to your project (found at the end of this article), you will be able to recalculate normals based on an angle threshold.
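
A minimal usage sketch, assuming the code is added as an extension method on Mesh taking the smoothing angle in degrees:

```csharp
// Smooth normals between faces whose angle is less than 60 degrees.
Mesh mesh = GetComponent<MeshFilter>().mesh;
mesh.RecalculateNormals(60);
```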

As you probably know, comparing two floating-point numbers for equality is bad practice, because of the inherent imprecision of this data type. The usual approach is to check whether the floats differ by some very small amount (say, 0.0001), but this is not enough in this case, since I need to produce a hash key as well. I use a different approach that I loosely call “digit tolerance”, which compares floats by converting them to long integers after multiplying them by a power of 10. This makes it very easy for Vector3 values to have identical hash codes if they’re identical within a given tolerance.

My implementation multiplies by the constant 100,000 (for 5 decimal digits), which means that 4.000002 is considered equal to 4.000000. The result is rounded, so 4.000008 is considered equal to 4.00001. If we use a constant that is either too small or too large, we will probably get wrong results. 100,000 is a good value that you will rarely need to change; feel free to do so if you think another value works better.
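
A minimal illustration of the idea (the helper name here is mine, for illustration only):

```csharp
// Convert a float to a long key with 5 decimal digits of tolerance.
static long ToKey(float value)
{
    return (long)Mathf.Round(value * 100000f);
}

// ToKey(4.000002f) == ToKey(4.000000f)  -> both yield 400000
// ToKey(4.000008f) == 400001            -> rounded to the nearest digit
```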

Things to be aware of

One thing to keep in mind is that my solution might smoothen normals even between faces with an angle greater than the threshold; this only happens when we try to smoothen with an angle smaller than the one specified in the import settings. Don’t worry though, it is still possible to get the right result.

To explain why this happens, assume that two faces are adjacent with a 45° angle between their normals. Upon import with a 60° tolerance, their common vertices were merged. When we then try to recalculate normals at runtime using an angle tolerance of 30°, those vertices remain merged, so their faces will still be smoothed regardless of the angle between them!

I could have written an alternative method that splits those vertices at runtime, or one that recreates a flat mesh and then merges identical vertices after smoothing. However, I decided that this was more trouble than it was worth, and it would have been a much slower process.

To achieve a better result, you can import a mesh with a 0° tolerance. This will allow you to smoothen normals at any angle at runtime. However, since a 0° tolerance produces a larger model, you can alternatively import a mesh with the minimum tolerance you’re ever going to need: if you import at 30°, you can still smoothen correctly at runtime at any angle higher than 30°.

In any case, a 60° tolerance is good for most applications. You can use that for both import and runtime normal calculation.

The source code
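
Below is a condensed sketch of the approach described above, not the exact published listing: it assumes a single sub-mesh, and helper types like VertexEntry and VertexKey are named by me for illustration.

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class NormalSolver
{
    // Recalculates mesh normals, smoothing only between triangles whose
    // face normals are within 'angle' degrees of each other.
    public static void RecalculateNormals(this Mesh mesh, float angle)
    {
        float cosineThreshold = Mathf.Cos(angle * Mathf.Deg2Rad);

        Vector3[] vertices = mesh.vertices;
        int[] triangles = mesh.triangles;
        Vector3[] normals = new Vector3[vertices.Length];
        Vector3[] triNormals = new Vector3[triangles.Length / 3]; // one per face

        // Cluster vertex indices by hashed position, so coincident vertices
        // that were split apart on import count as the same point.
        var clusters = new Dictionary<VertexKey, List<VertexEntry>>(vertices.Length);

        for (int i = 0; i < triangles.Length; i += 3)
        {
            int tri = i / 3;
            triNormals[tri] = Vector3.Cross(
                vertices[triangles[i + 1]] - vertices[triangles[i]],
                vertices[triangles[i + 2]] - vertices[triangles[i]]).normalized;

            for (int j = 0; j < 3; j++)
            {
                int vert = triangles[i + j];
                VertexKey key = new VertexKey(vertices[vert]);
                List<VertexEntry> entries;
                if (!clusters.TryGetValue(key, out entries))
                {
                    entries = new List<VertexEntry>(4);
                    clusters.Add(key, entries);
                }
                entries.Add(new VertexEntry(tri, vert));
            }
        }

        // Each vertex point gets the normalized sum of the face normals of
        // every triangle in its position cluster within the angle threshold.
        foreach (List<VertexEntry> entries in clusters.Values)
        {
            foreach (VertexEntry lhs in entries)
            {
                Vector3 sum = Vector3.zero;
                foreach (VertexEntry rhs in entries)
                {
                    // The dot product of two unit vectors is the cosine of
                    // their angle; a face always passes against itself.
                    if (Vector3.Dot(triNormals[lhs.Tri], triNormals[rhs.Tri]) >= cosineThreshold)
                        sum += triNormals[rhs.Tri];
                }
                normals[lhs.Vert] = sum.normalized;
            }
        }

        mesh.normals = normals;
    }

    private struct VertexEntry
    {
        public readonly int Tri;  // triangle (face) index
        public readonly int Vert; // vertex index in the expanded list

        public VertexEntry(int tri, int vert)
        {
            Tri = tri;
            Vert = vert;
        }
    }

    // Positions hashed with a "digit tolerance" of 5 decimal places, so
    // nearly identical positions produce the same key.
    private struct VertexKey
    {
        private const long Tolerance = 100000;
        private readonly long _x, _y, _z;

        public VertexKey(Vector3 p)
        {
            _x = (long)Mathf.Round(p.x * Tolerance);
            _y = (long)Mathf.Round(p.y * Tolerance);
            _z = (long)Mathf.Round(p.z * Tolerance);
        }

        public override int GetHashCode() { return (_x * 7 ^ _y * 13 ^ _z * 27).GetHashCode(); }

        public override bool Equals(object obj)
        {
            VertexKey k = (VertexKey)obj;
            return _x == k._x && _y == k._y && _z == k._z;
        }
    }
}
```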

Happy smoothing!

Smoothie

19 thoughts on “A better method to recalculate normals in Unity”

  1. Hi Charis, really good post, thanks for the explanation and code.

    I am seeing incorrect shading on my dynamic mesh and I have tried this method to resolve my issue but it doesn’t seem to have fixed it. For each quad (two triangles) my lighting doesn’t seem to blend in with its neighbours:

    http://www.spannerworx.com/images/seams.png

    I am using just one White directional light and the default diffuse material.

    Do you have any ideas what I could try?

    Thanks

    1. Hello, I’m sorry for the late reply.

      Have you fixed your issue yet? What do you get when you call the default RecalculateNormals() method? If you’re talking about the seam, it might be caused by the fact that you have two meshes and not one. This code will only work on a single mesh.

      If you’re not talking about the seam but about the smoothness of your triangles, that actually seems correct. The triangles ARE smooth; it’s just that they’re very big, and that’s what you’d get with a default shader. Try smooth-subdividing your dynamic mesh to increase the poly count. I don’t think Unity has built-in methods to do that, but you might be able to find an algorithm online.

  2. Hey there, excuse my ignorance but I can’t get this to work… when I add the script, RecalculateNormals still shows the summary for the Unity version in Mono, and if I add a variable, i.e. RecalculateNormals(60), I get an error about overloaded methods… do I have to call it some other way?

  3. So this is great – leagues ahead of the standard function, and very useful in lots of scenarios, even if it’s slower (like when you are allowing character customization, for example).

    One question though: in the Unity manual for RecalculateNormals (http://docs.unity3d.com/ScriptReference/Mesh.RecalculateNormals.html) it says ‘Also note that RecalculateNormals does not generate tangents automatically thus bumpmap shaders will not work’ – again, excuse my ignorance of the jargon here, but does that mean that if I have a material that uses the ‘Standard Shader’ and I have a normal map in there, then after you call this, the normal map won’t have any effect?

    Also, I really think that UMA (an asset that allows you to create characters from mixed meshes at runtime and customise them – see here http://forum.unity3d.com/threads/uma-unity-multipurpose-avatar-on-the-asset-store.219175 and on the asset store) could do with some help with the sort of thing you are into here. I have noticed that if you adjust a UMA character (which they do using bone deformation), then the lighting is wrong. I have been trying to use your script to recalculate the normals after the deformation, but haven’t done so successfully yet… But the UMA project is very focused on creating multi-use armour that characters can wear, and obviously for that, having shiny plate armour that reflects light properly would make it look waaaay cooler.

    pop over if you have a spare minute 🙂

    1. Hi, I thought I replied, but apparently I didn’t.

      This method does not mess with tangents, so your model might still work if it already has tangents. But if you combine two meshes together, you might be able to take their tangents and merge them as well, although I’m not sure what will happen at the seams.

      In any case, you can always recalculate tangents as well, and there is code for that online. I’ll investigate when I have more time and maybe write some other tangent/bitangent method that can work with this one.

  4. Is it possible to apply this to a Skinned Mesh? I tried to do it, but it doesn’t seem to work. After recalculation my mesh just becomes black. /:

    1. I’ve never tried it with a Skinned Mesh before. If you get a black shape it probably means that the normals are somehow reset to {0, 0, 0}, but this shouldn’t be happening. Did you try other Skinned Meshes?

  5. Sorry for the noob question here, but I have some models I’m trying to set up for WebGL, and I am having normal smoothing issues that seem to match what you are talking about here. However, being a noob, I don’t know how to implement your script to fix the meshes that have smoothing issues at runtime. Can you just give me a quick step-by-step on how to use the scripts you have provided?
    Thanks!

  6. Thank you so much for this article; it was a perfect read in every way. You have answered even questions I didn’t expect to find answers for. Keep up the good work.
