[Math] How to calculate the direction in which a set of normal vectors (3D) are least oriented

3dgeometry

I have an STL file with thousands of triangular planes with different orientations. What is the best way to calculate the direction in which the normal vectors of the triangles are least aligned?

I calculated the ‘average orientation of the normal vectors’ in MATLAB by taking the average of the absolute value of each component (Is it acceptable to take the absolute value?):

xcomp = sum(abs(normal(1,1:numberoftriangles)))/numberoftriangles;
ycomp = sum(abs(normal(2,1:numberoftriangles)))/numberoftriangles;
zcomp = sum(abs(normal(3,1:numberoftriangles)))/numberoftriangles;

And I normalise the length of this ‘average vector’:

wronglength = sqrt(xcomp^2+ycomp^2+zcomp^2);
xcomp = xcomp/wronglength;
ycomp = ycomp/wronglength;
zcomp = zcomp/wronglength;

Can I then just calculate the direction in which the normal vectors of the triangles are least aligned as:

x = 1-xcomp
y = 1-ycomp
z = 1-zcomp

and normalise again?

I am looking for a more accurate algorithm.

Best Answer

Taking the absolute values is probably not a good idea, and the $x\to1-x$ transform at the end also doesn't look promising.

I'll assume that by "least aligned" you mean that ideally your vector would be orthogonal to all the normals (and thus lie in the planes of all the triangles), not that it should point opposite to the normals (and thus into the planes). In the latter case you would just add up all the normals and use the direction opposite to the sum.
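For that latter reading, a minimal sketch (in Python/NumPy rather than MATLAB; `normals` is a hypothetical 3×N array with one unit normal per column, matching the layout of `normal` in the question):

```python
import numpy as np

# Two example normals, one per column (3 x N layout as in the question).
normals = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [0.0, 0.0]])

s = normals.sum(axis=1)                # sum of all the normals
opposite = -s / np.linalg.norm(s)      # unit vector opposite to the sum
```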

A first attempt might be to add up the normals and then take a vector orthogonal to the sum, but then you would have to choose arbitrarily among the vectors in the plane orthogonal to the sum, and any information about which of these is best would be lost.

Here's a systematic way of doing this. You want your vector $x$ to be orthogonal to the normals $n_i$, so you want the absolute value of the dot product $x\cdot n_i$ to be as small as possible. A natural ansatz would then be to minimize the sum of the squares of the dot products:

$$ \sum_i(x\cdot n_i)^2\to\min\;, $$

subject to the constraint that $x$ is a unit vector. We can write this more suggestively:

$$ \sum_ix^\top n_in_i^\top x=x^\top\left(\sum_i n_in_i^\top \right)x\to\min\;. $$

Adding a Lagrange multiplier term for the normalization constraint yields the objective function

$$ x^\top\left(\sum_i n_in_i^\top \right)x-\lambda x^\top x\;, $$

and requiring the gradient with respect to $x$ to vanish yields

$$ \left(\sum_i n_in_i^\top \right)x-\lambda x=0\;. $$

This is the eigenvalue problem for the positive semidefinite matrix $\sum_in_in_i^\top$. That's just a $3\times3$ matrix, so you can readily find its eigensystem. The direction you want is the eigenvector corresponding to its least eigenvalue.
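The whole computation is only a few lines. Here is a sketch in Python/NumPy (the question uses MATLAB; `normals` is assumed to be a 3×N array with one unit normal per column, as in the question's `normal` array), using the fact that `np.linalg.eigh` returns eigenvalues in ascending order:

```python
import numpy as np

def least_aligned_direction(normals):
    """Return the unit vector x minimizing sum_i (x . n_i)^2.

    normals: array of shape (3, N), one unit normal per column.
    """
    M = normals @ normals.T               # 3x3 matrix  sum_i n_i n_i^T
    eigvals, eigvecs = np.linalg.eigh(M)  # eigenvalues in ascending order
    return eigvecs[:, 0]                  # eigenvector of the least eigenvalue

# Example: normals scattered in the xy-plane, so the least-aligned
# direction should come out as +/- z.
theta = np.linspace(0.0, np.pi, 50)
normals = np.vstack([np.cos(theta), np.sin(theta), np.zeros_like(theta)])
x = least_aligned_direction(normals)
```

Note that the result is only determined up to sign, since $x$ and $-x$ give the same value of the objective.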