You might have noticed that, for some meshes, calling Unity’s built-in RecalculateNormals() method makes things look different (i.e. worse) than when normals are calculated through the import settings. A similar problem appears when recalculating normals after combining meshes, producing obvious seams between them. In this post I’m going to show you how RecalculateNormals() works in Unity, how and why it is very different from normal calculation on importing a model, and offer a fast solution that fixes this problem.

This article is also useful to those who want to generate 3D meshes dynamically during gameplay, not just those who have encountered this problem.

# Some background…

I’m going to explain some basic concepts as briefly as I can, just to provide some context. Feel free to skip this section if you wish.

##### Directional Vectors

Directional vectors are not to be confused with point vectors, even though we use exactly the same representation in code. `Vector3(0, 1, 1)` as a *point* vector simply describes a single point along the X, Y and Z axes respectively. As a *directional* vector, it describes the direction we face when we stand at the origin point `Vector3(0, 0, 0)` and look towards the point `Vector3(0, 1, 1)`.

If we draw a line from `Vector3(0, 0, 0)` towards `Vector3(0, 1, 1)`, we will notice that this line has a length of 1.414214 units. This is called the magnitude or length of a vector, and it is equal to √(x² + y² + z²). A *normalized* vector is one that has a magnitude of 1. The normalized version of `Vector3(0, 1, 1)` is approximately `Vector3(0, 0.707, 0.707)`, and we get that by dividing the vector by its magnitude. When using directional vectors, it is important to keep them normalized (unless you really know what you’re doing), because a lot of mathematical calculations depend on that.
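The arithmetic above can be sketched in a few lines of Python; in Unity, `Vector3.magnitude` and `Vector3.normalized` do this work for you, so this is purely to show the math:

```python
import math

def normalize(v):
    """Scale a vector to magnitude 1 by dividing each component by the length."""
    mag = math.sqrt(sum(c * c for c in v))
    return tuple(c / mag for c in v)

v = (0.0, 1.0, 1.0)
magnitude = math.sqrt(sum(c * c for c in v))  # about 1.414214
n = normalize(v)                              # about (0, 0.7071, 0.7071)
```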

##### Normals

Normals are mostly used to calculate the shading of a surface. They are directional vectors that define which way a surface is facing, i.e. a normal is a directional vector pointing away from a face (i.e. a surface) made up of three or more vertices (i.e. points). More accurately, normals are stored on the vertices themselves rather than on the face. This means that a flat surface of three points actually has three identical normals.

A normal is not to be confused with a normalized vector, although normals *are* normalized – otherwise, light calculations would look wrong.

##### How Smoothing works

Smoothing works by averaging the normals of adjacent faces. What this means is that the normal of each vertex is not the same as that of its face, but rather the average value of the normals of all the faces it belongs to. This also means that, in smooth shading, vertices of the same face do not necessarily have identical normals.

# How Unity recalculates normals in runtime

When you call the RecalculateNormals() method on a mesh in Unity, what happens is very straightforward.

Unity stores mesh information for a list of vertices in a few different arrays: one array for vertex positions, one for normals, another for UVs, etc. All of these arrays have the same size, and the elements at the same index across the arrays describe one single vertex. For example, mesh.vertices[N] and mesh.normals[N] are the position and the normal, respectively, of the Nth vertex in our list of vertices.

However, there is a special array called triangles, which describes the actual faces of the mesh. It is a sequence of integer values, each of which is the index of a vertex; every three consecutive integers form a single triangle face. This means that the size of this array is always a multiple of 3.

*As a side note: Unity assumes that all faces are triangles, so even if you import a model with faces having more than three points (as is the case with the sphere above), they’re automatically converted, upon importing, to multiple smaller faces of three vertices each.*
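As a rough illustration of this layout (a Python sketch, not Unity code), here is how a simple quad would be laid out in these parallel arrays:

```python
# Unity-style mesh storage sketch: per-vertex parallel arrays plus a flat
# triangle index list. A quad is two triangle faces sharing two vertices.
vertices = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
triangles = [0, 1, 2,   # first triangle face
             0, 2, 3]   # second triangle face

face_count = len(triangles) // 3  # the array length is always a multiple of 3
```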

The source code of RecalculateNormals() is not available, but, guessing from its output, this pseudo-code (sloppily mixed with some real code) follows exactly the same algorithm and produces the same result:

```
initialize all normals to (0, 0, 0)

foreach three indices in triangles list of mesh:
    normal = CalculateSurfaceNormal(mesh.vertices[index1],
                                    mesh.vertices[index2],
                                    mesh.vertices[index3])
    mesh.normals[index1] += normal
    mesh.normals[index2] += normal
    mesh.normals[index3] += normal

foreach vertex in mesh:
    normalize vertex.normal
```

Notice how I don’t explicitly average the normals, but that’s because CalculateSurfaceNormal(...) is assumed to return a normalized vector. When normalizing a sum of normalized vectors, we get their average value.
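To make the pseudo-code concrete, here is a small, self-contained Python sketch of the same algorithm; the helper names (`cross`, `sub`, `normalize`, `recalculate_normals`) are mine, not Unity's:

```python
import math

def cross(a, b):
    """Cross product of two 3D vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def normalize(v):
    m = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
    return (v[0] / m, v[1] / m, v[2] / m)

def recalculate_normals(vertices, triangles):
    normals = [(0.0, 0.0, 0.0)] * len(vertices)
    for i in range(0, len(triangles), 3):
        i1, i2, i3 = triangles[i], triangles[i + 1], triangles[i + 2]
        # Surface normal: normalized cross product of two edges of the triangle
        n = normalize(cross(sub(vertices[i2], vertices[i1]),
                            sub(vertices[i3], vertices[i1])))
        for idx in (i1, i2, i3):
            normals[idx] = tuple(a + b for a, b in zip(normals[idx], n))
    # Normalizing each accumulated sum yields the average direction
    return [normalize(n) for n in normals]
```

On a flat quad every accumulated sum points the same way, so normalizing reproduces flat shading; on a curved surface, shared vertices end up with the averaged, smoothed direction.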

# How Unity calculates normals while importing

The exact algorithm Unity uses in this case is more complicated than RecalculateNormals(). I can think of three reasons for that:

- This algorithm is much slower, so Unity avoids calling it during runtime.
- Unity has more information while importing.
- Unity combines vertices after importing.

The first reason is not the real reason because Unity could have still provided an alternative method to calculate normals in runtime that still performs fast enough. The second and third reasons are actually very similar, since they both come down to one thing: After Unity imports a mesh, it becomes a new entity independent from its source model.

During mesh import, Unity may consider shared vertices among faces to exist multiple times: once for each face. This means that when importing a cube, which has 8 vertices, Unity actually sees 36 vertices (3 vertices for each of the 2 triangles of each of the 6 sides). We will refer to this as the expanded vertex list. However, it can also see the condensed 8-vertex list at the same time. If it can’t, then I’m guessing it first silently builds that structure simply by finding which vertices are at the same position as others.

As you saw in the previous section, smoothing in RecalculateNormals() only works when vertices at the same position are assumed to be one and the same. With this new data at hand, a different algorithm can be used. In the following pseudo-code, vertex represents a vertex from the condensed list of vertices and vertexPoint represents a vertex from the expanded list. We also assume that any vertexPoint has direct access to the vertex it is associated with.

```
threshold = ?

initialize all vertexPoint normals to (0, 0, 0)

foreach vertexPoint in mesh:
    foreach testFace in vertexPoint.vertex.faceList:
        if angle between vertexPoint.face.normal and testFace.normal < threshold:
            vertexPoint.normal += testFace.normal

foreach vertexPoint in mesh:
    normalize vertexPoint.normal
```
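The threshold test at the heart of this algorithm is just a dot product and an arccosine. A minimal Python sketch, with example values of my own choosing (two faces meeting at a right angle):

```python
import math

def angle_between(n1, n2):
    """Angle in degrees between two already-normalized face normals."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(n1, n2))))
    return math.degrees(math.acos(dot))

threshold = 60.0
n1, n2 = (0.0, 0.0, 1.0), (1.0, 0.0, 0.0)   # faces meeting at 90 degrees
smooth = angle_between(n1, n2) < threshold  # False: the edge stays hard
```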

As it turns out, the final result contains neither the expanded vertex list nor the condensed vertex list. It is an optimized version which takes all the identical vertices and merges them together. And by identical, I don’t mean just position and normals; it also takes into account all the UV coordinates, tangents, bone weights, colors, etc – any information that is stored for a single vertex.

# The problem explained

As you may have guessed from the previous section, the problem is that vertices that differ in one or more aspects are considered as completely different vertices in the final mesh that is used during runtime, even if they share the same position. You will most likely encounter this problem when your model has UV coordinates. In fact, the first image of this article was produced precisely by a model with a UV seam, after RecalculateNormals() was called.

I will not get into many details about UV coordinates, but let me just say that they’re used for texturing, which means it is actually a very common problem; if you ever recalculate normals at runtime then this situation is bound to appear sooner or later.

You can also see this problem if you’re merging two meshes together – say, two opposing halves of a sphere – even if they have identical information. That’s because when merging two meshes, what actually happens is that we simply append the data of one mesh onto the other, which leaves the vertices shared between the two as distinct vertices in the final mesh.
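A minimal Python sketch of what such a naive merge does; the dictionary-based mesh representation is an assumption for illustration, not Unity's API:

```python
def combine(mesh_a, mesh_b):
    """Naive mesh combine: append B's vertex data after A's and offset B's
    triangle indices. Vertices at the seam stay duplicated, which is
    exactly what breaks smoothing afterwards."""
    offset = len(mesh_a["vertices"])
    return {
        "vertices": mesh_a["vertices"] + mesh_b["vertices"],
        "triangles": mesh_a["triangles"] + [i + offset for i in mesh_b["triangles"]],
    }
```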

The technical cause behind the problem is, as we have pointed out, that RecalculateNormals() does not know which distinct vertices in our list have the same position as others. Unity knows this while importing, but this information is now lost.

This is also the reason RecalculateNormals() does not take any angle threshold as a parameter, contrary to calculating normals during import time. If I import a model with a 0° tolerance, then it will be impossible to have any smooth shading during run-time.

# My solution

My solution to this problem is not very simple, but I have provided the full source code. It is somewhat of a hybrid between the two extremes used by Unity. I use hashing to cluster vertices that share the same position, avoiding a brute-force search and achieving nearly linear complexity in the number of vertices in the expanded list. Therefore, it is fast and asymptotically optimal.

##### Usage

By adding my code to your project (found at the end of this article), you will be able to recalculate normals based on an angle threshold:

```csharp
var mesh = GetComponentInChildren<MeshFilter>().mesh;
mesh.RecalculateNormals(60);
```

As you probably know, comparing two floating-point numbers for equality is bad practice, because of the inherent imprecision of the data type. The usual approach is to check whether the floats differ by less than some very small value (say, 0.0001), but that is not enough in this case since I need to produce a hash key as well. I use a different approach that I loosely call “digit tolerance”, which compares floats by converting them to long integers after multiplying them by a power of 10. This makes it very easy for Vector3 values to have identical hash codes if they’re identical within a given tolerance.

My implementation multiplies by the constant 100,000 (for 5 decimal digits), which means that 4.000002 is considered equal to 4.00000. Values are rounded rather than truncated, so 4.000008 is considered equal to 4.00001. If we use a constant that is either too small or too large, we will probably get wrong results. 100,000 is a good default and you will rarely need to change it, but feel free to do so if you think another value works better.
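A Python sketch of the idea; the constant and the rounding behavior mirror the description above, and `vertex_key` is a hypothetical helper name:

```python
TOLERANCE = 100_000  # 5 decimal digits, matching the article's constant

def vertex_key(position):
    """Quantize a position to integer coordinates so that points equal
    within the tolerance produce identical, hashable keys."""
    return tuple(round(c * TOLERANCE) for c in position)
```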

##### Things to be aware of

One thing to keep in mind is that my solution might smoothen normals even with an angle higher than the threshold; this only happens when we try to smoothen with an angle *less* than the one specified by the import settings. Don’t worry though, it is still possible to get the right result.

To explain why this happens, assume that two faces are adjacent with a 45° angle between their normals. Upon import with a 60° tolerance, their common vertices were merged. When we try to recalculate normals at runtime using an angle tolerance of 30°, those vertices remain merged and so their faces will still have smoothing regardless of the angles between them!

I could have written an alternative method that splits those vertices at runtime, or one that recreates a flat mesh and then merges identical vertices after smoothing. However, I thought that this was much more trouble than it was worth, and it would be a much slower process.

To achieve a better result, you can import a mesh with a 0° tolerance. This will allow you to smoothen normals at any angle during runtime. However, since a 0° tolerance produces a larger model, you can alternatively just import a mesh with the minimum tolerance you’re ever going to need. If you import at 30°, you can still smoothen correctly at runtime for any degree that is higher than 30°.

In any case, a 60° tolerance is good for most applications. You can use that for both import and runtime normal calculation.

##### The source code

```csharp
/**
 * The following code was taken from: http://schemingdeveloper.com
 *
 * Visit our game studio website: http://stopthegnomes.com
 *
 * License: You may use this code however you see fit, as long as you give credit when
 * explicitly asked and as long as you include this notice without any modifications.
 *
 * You may not publish a paid asset on Unity store if its main function is based on
 * the following code, but you may publish a paid asset that uses this as part of a
 * larger suite. You may still publish a free asset whose main function is using this
 * script if you give us credit in the asset description.
 *
 * If you intend to use this in a Unity store asset, it would be appreciated, but
 * not required, if you let us know with a link to the asset.
 */

using System;
using System.Collections.Generic;
using UnityEngine;

public static class NormalSolver
{
    /// <summary>
    /// Recalculate the normals of a mesh based on an angle threshold. This takes
    /// into account distinct vertices that have the same position.
    /// </summary>
    /// <param name="mesh"></param>
    /// <param name="angle">
    /// The smoothing angle. Note that triangles that already share
    /// the same vertex will be smooth regardless of the angle!
    /// </param>
    public static void RecalculateNormals(this Mesh mesh, float angle)
    {
        var triangles = mesh.GetTriangles(0);
        var vertices = mesh.vertices;
        var triNormals = new Vector3[triangles.Length / 3]; // Holds the normal of each triangle
        var normals = new Vector3[vertices.Length];

        angle = angle * Mathf.Deg2Rad;

        var dictionary = new Dictionary<VertexKey, VertexEntry>(vertices.Length);

        // Goes through all the triangles and gathers up data to be used later
        for (var i = 0; i < triangles.Length; i += 3)
        {
            int i1 = triangles[i];
            int i2 = triangles[i + 1];
            int i3 = triangles[i + 2];

            // Calculate the normal of the triangle
            Vector3 p1 = vertices[i2] - vertices[i1];
            Vector3 p2 = vertices[i3] - vertices[i1];
            Vector3 normal = Vector3.Cross(p1, p2).normalized;
            int triIndex = i / 3;
            triNormals[triIndex] = normal;

            VertexEntry entry;
            VertexKey key;

            // For each of the three points of the triangle
            //   > Add this triangle as part of the triangles they're connected to.

            if (!dictionary.TryGetValue(key = new VertexKey(vertices[i1]), out entry))
            {
                entry = new VertexEntry();
                dictionary.Add(key, entry);
            }
            entry.Add(i1, triIndex);

            if (!dictionary.TryGetValue(key = new VertexKey(vertices[i2]), out entry))
            {
                entry = new VertexEntry();
                dictionary.Add(key, entry);
            }
            entry.Add(i2, triIndex);

            if (!dictionary.TryGetValue(key = new VertexKey(vertices[i3]), out entry))
            {
                entry = new VertexEntry();
                dictionary.Add(key, entry);
            }
            entry.Add(i3, triIndex);
        }

        // Foreach point in space (not necessarily the same vertex index!)
        // {
        //   Foreach triangle T1 that point belongs to
        //   {
        //     Foreach other triangle T2 (including self) that point belongs to and that
        //     meets any of the following:
        //       1) The corresponding vertex is actually the same vertex
        //       2) The angle between the two triangles is less than the smoothing angle
        //     {
        //       > Add to temporary Vector3
        //     }
        //     > Normalize temporary Vector3 to find the average
        //     > Assign the normal to corresponding vertex of T1
        //   }
        // }

        foreach (var value in dictionary.Values)
        {
            for (var i = 0; i < value.Count; ++i)
            {
                var sum = new Vector3();

                for (var j = 0; j < value.Count; ++j)
                {
                    if (value.VertexIndex[i] == value.VertexIndex[j])
                    {
                        sum += triNormals[value.TriangleIndex[j]];
                    }
                    else
                    {
                        float dot = Vector3.Dot(
                            triNormals[value.TriangleIndex[i]],
                            triNormals[value.TriangleIndex[j]]);
                        dot = Mathf.Clamp(dot, -0.99999f, 0.99999f);
                        float acos = Mathf.Acos(dot);
                        if (acos <= angle)
                        {
                            sum += triNormals[value.TriangleIndex[j]];
                        }
                    }
                }

                normals[value.VertexIndex[i]] = sum.normalized;
            }
        }

        mesh.normals = normals;
    }

    private struct VertexKey
    {
        private readonly long _x;
        private readonly long _y;
        private readonly long _z;

        // Change this if you require a different precision.
        private const int Tolerance = 100000;

        public VertexKey(Vector3 position)
        {
            _x = (long)(Mathf.Round(position.x * Tolerance));
            _y = (long)(Mathf.Round(position.y * Tolerance));
            _z = (long)(Mathf.Round(position.z * Tolerance));
        }

        public override bool Equals(object obj)
        {
            var key = (VertexKey)obj;
            return _x == key._x && _y == key._y && _z == key._z;
        }

        public override int GetHashCode()
        {
            return (_x * 7 ^ _y * 13 ^ _z * 27).GetHashCode();
        }
    }

    private sealed class VertexEntry
    {
        public int[] TriangleIndex = new int[4];
        public int[] VertexIndex = new int[4];

        private int _reserved = 4;
        private int _count;

        public int Count { get { return _count; } }

        public void Add(int vertIndex, int triIndex)
        {
            // Auto-resize the arrays when needed
            if (_reserved == _count)
            {
                _reserved *= 2;
                Array.Resize(ref TriangleIndex, _reserved);
                Array.Resize(ref VertexIndex, _reserved);
            }

            TriangleIndex[_count] = triIndex;
            VertexIndex[_count] = vertIndex;
            ++_count;
        }
    }
}
```

Hi Charis, really good post, thanks for the explanation and code.

I am seeing incorrect shading on my dynamic mesh and I have tried this method to resolve my issue but it doesn’t seem to have fixed it. For each quad (two triangles) my lighting doesn’t seem to blend in with its neighbours:

http://www.spannerworx.com/images/seams.png

I am using just one White directional light and the default diffuse material.

Do you have any ideas what I could try?

Thanks

Hello, I’m sorry for the late reply.

Have you fixed your issue yet? What do you get when you call the default RecalculateNormals() method? If you’re talking about the seam, that might be caused because you have two meshes and not one. This code will only work on a single mesh.

If you’re not talking about the seam but the smoothness of your triangles, that actually seems correct. The triangles ARE smooth, it’s just that they’re very big and it’s what you’d get with a default shader. Try smooth-subdividing your dynamic mesh to increase poly-count. I don’t think Unity has built-in methods to do that, but you might be able to find an algorithm for that online.

Excellent post, thank you so much for explanation and implementation.

Nice, thank you for this !

Fantastic! Amazing! Thank you so much.

Oh man, thank a lot, fantastic job!

Hey there, excuse my ignorance but I cant get this to work… when i add the script RecalculateNormals still shows the summary for the Unity version in mono- and if I add a variable i.e. RecalculateNormals(60) I get an error about overloaded methods…- do I have to call it some other way?

Ok I think I got it actually- I think the script needs to go in the ‘Standard Assets’ folder…

So this is great- leagues ahead of the standard function, and very useful in lots of scenarios, even if its slower (like when you are allowing character customization for example).

One question though, in the Unity manual for RecalculateNormals (http://docs.unity3d.com/ScriptReference/Mesh.RecalculateNormals.html) it says ‘Also note that RecalculateNormals does not generate tangents automatically thus bumpmap shaders will not work’ – again excuse my ignorance of the jargon here, but does that mean that if I have a material that uses the ‘Standard Shader’ and I have a normal map in there, that after you call this, the normal map wont have any effect?

Also I really think that UMA (an asset that allows you to chreate characters from mixed meshes at runtime and customise them -see here http://forum.unity3d.com/threads/uma-unity-multipurpose-avatar-on-the-asset-store.219175 and on the asset store) could do with some help with the sort of thing you are into here. I have noticed that if you adjust a UMA character (which they do using bone deformation) then the lighting is wrong. I have been trying to use your script to recalculate the normals after the deformation, but not done so successfully yet… But the UMA project is very focused on creating multi use Armour that characters can wear, and obviously for that having shiny plate armour that reflects light properly would make it look waaaay cooler.

pop over if you have a spare minute 🙂

Hi, I thought I replied, but apparently I didn’t.

This method does not mess with tangents, so your model might still work if it already has tangents. But if you combine two meshes together, you might be able to take their tangents and merge them as well, although I’m not sure what will happen at the seams.

In any case, you can always recalculate tangents as well, and there is code for that online. I’ll investigate when I have more time and maybe write some other tangent/bitangent method that can work with this one.

it works great! Thanks man!!

Is it possible to apply this to a Skinned Mesh? I tried to do it but it doesn’t seem to work. After recalculation my mesh just becomes black. /:

I’ve never tried it with a Skinned Mesh before. If you get a black shape it probably means that the normals are somehow reset to {0, 0, 0}, but this shouldn’t be happening. Did you try other Skinned Meshes?

I’m having the same issue. SkinnedMeshRenderer comes out black.

Yes, the Skinned Mesh is always black for me. ):

I’ll investigate this when I have more time. It’s possible that it just needs to be updated for the latest versions of Unity, as I haven’t done anything 3D (in Unity) since 4.6.

sorry for the nube question here, but I have some models I’m trying to setup for webgl, and I am having normal smoothing issues that seem match what you are talking about here. However, being a nube, I don’t know how to implement your script to fix the meshes that have smoothing issues at runtime. Can you just give me a quick step by step on how to use the scripts you have provided?

Thanks!

Thank You so much for this article, it was a perfect read in every way. You have answered even questions I did expect to find answers for. Keep up the good work.

Definitely don’t work for me and my Dual Contouring…