Suppose there are $m (m\geq 4)$ planes, for plane number $k$ we know a point $\mathbf{x}_k = (x_{k,1},x_{k,2},x_{k,3})$ on the plane, and the unit normal vector for plane number $k$ is $\mathbf{n}_k$.
I am not sure whether you mean "a point on each plane" or "a common point for all planes", so both cases are discussed below.
First, assume all planes pass through a common point $\mathbf{x}_0$ on their line of intersection. Then this line can be represented using the point $\mathbf{x}_0$ and a direction $\mathbf{b} = (b_1,b_2,b_3)$:
$$
\mathbf{x} = \mathbf{x}_0 + t\mathbf{b} \quad\text{ for } t\in \mathbb{R}
$$
Then the problem does not need a least-squares treatment at all, since we are not solving an overdetermined system: $\mathbf{b}$ can be obtained simply by taking the cross product of any two normals,
$$
\mathbf{b} = \mathbf{n}_1\times\mathbf{n}_2, \tag{1}
$$
where $\times$ is the cross product of vectors. Because the normal vectors of all the planes are coplanar (which is exactly the condition for the planes to intersect in a single line), $\mathbf{b}\perp \mathbf{n}_1, \mathbf{n}_2,\ldots, \mathbf{n}_m$, and we are done.
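A minimal numerical sketch of (1) in Python with NumPy; the two normals here are made-up example values:

```python
import numpy as np

# Hypothetical unit normals of two of the planes (example values).
n1 = np.array([1.0, 0.0, 0.0])
n2 = np.array([0.0, 1.0, 0.0])

# Direction of the common line: b = n1 x n2, normalized to unit length.
b = np.cross(n1, n2)
b /= np.linalg.norm(b)

print(b)  # perpendicular to both n1 and n2
```

Note that this only works when the two chosen normals are not parallel; otherwise the cross product is zero.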
If we only know one point on each plane, and we do not know whether the planes all pass through one common line, things get more complicated, and the least-squares problem may not be well-posed. We want the line to be perpendicular to every normal vector (as much as possible), and to pass through a point that has the minimal distance to all the planes. The equation system for $\mathbf{x} = (x_1,x_2,x_3)$ is:
$$
\mathrm{dist}(\mathbf{x},P_k) = |\mathbf{n}_k \cdot (\mathbf{x} - \mathbf{x}_k )| = 0
$$
written in matrix form
$$
\mathbf{N}\mathbf{x}:= \begin{pmatrix}n_{1,1} &n_{1,2} &n_{1,3} \\
n_{2,1} &n_{2,2} &n_{2,3} \\ \vdots&\vdots&\vdots \\
n_{m,1} &n_{m,2} &n_{m,3}\end{pmatrix}\begin{pmatrix}x_1 \\x_2 \\x_3\end{pmatrix} = \mathbf{y} := \begin{pmatrix}\mathbf{n}_1 \cdot \mathbf{x}_1 \\
\mathbf{n}_2 \cdot \mathbf{x}_2 \\ \vdots \\
\mathbf{n}_m \cdot \mathbf{x}_m\end{pmatrix}\tag{2}
$$
where $\mathbf{N}$ is an $m\times 3$ matrix whose rows are the normal vectors. The least-squares formulation of problem (2) is to minimize:
$$
\min_{\mathbf{x}} \|\mathbf{N}\mathbf{x}-\mathbf{y}\|,
$$
where the norm is the Euclidean norm; the solution is the point $\mathbf{x}_0$ satisfying the normal equations:
$$
\mathbf{N}^{T}\mathbf{N}\mathbf{x}_0 = \mathbf{N}^{T}\mathbf{y}. \tag{$\dagger$}
$$
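A quick numerical sketch of this step; the planes below are made-up examples, and NumPy's `lstsq` solves the same minimization directly (and is better conditioned than forming $\mathbf{N}^T\mathbf{N}$ explicitly):

```python
import numpy as np

# Hypothetical example: four planes whose pairwise intersection lines are
# all parallel to the z-axis, each passing through the point (1, 2, 3).
N = np.array([[1.0,  0.0, 0.0],
              [0.0,  1.0, 0.0],
              [1.0,  1.0, 0.0],
              [1.0, -1.0, 0.0]])
N /= np.linalg.norm(N, axis=1, keepdims=True)   # make the normals unit length
points = np.tile([1.0, 2.0, 3.0], (4, 1))       # x_k on plane k

y = np.einsum("ij,ij->i", N, points)            # y_k = n_k . x_k
# Minimum-norm least-squares solution of N x = y; when N^T N is invertible
# this coincides with solving the normal equations directly.
x0, *_ = np.linalg.lstsq(N, y, rcond=None)
print(x0)
```

Here $\mathbf{N}$ has rank 2 (all normals lie in the $xy$-plane), so `lstsq` picks the minimum-norm point on the common line, $(1, 2, 0)$.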
$(\dagger)$ is the first matrix equation you want to solve. After we obtain this $\mathbf{x}_0$, we want to find a direction $\mathbf{b}$. There are two cases:
Either all of these planes intersect in one line, and we can find the $\mathbf{b}$ of this line exactly.
Or they do not intersect in one line, but every two of them intersect in a line (no parallel planes; you can rule out parallel planes by checking for normals that are equal componentwise, or equal up to an overall sign). Then we use least squares to find $\mathbf{b}$.
For the first case, we know that $\mathbf{b}$ satisfies:
$$
\mathbf{b}\cdot \mathbf{n}_j = 0
$$
for every plane number $j$, because the line lying on every plane means it is perpendicular to every normal vector $\mathbf{n}_j$. In this case, simply computing (1) is enough.
For the second case, the least square system produced by $\mathbf{b}\cdot \mathbf{n}_j = 0$ is:
$$
\min_{\mathbf{b}} \|\mathbf{N}\mathbf{b}\|,
$$
and the unconstrained minimizer is simply $\mathbf{b}=\mathbf{0}$, which is useless. (One standard fix is to impose $\|\mathbf{b}\|=1$; the constrained minimizer is then the right singular vector of $\mathbf{N}$ associated with its smallest singular value.) One possible way is to compute the mean of the vectors $\mathbf{n}_i\times\mathbf{n}_j$, each of which is the direction of the intersection line of one pair of planes, and set this mean as $\mathbf{b}$. Another way (essentially the same, but more least-squares-ish) is to solve the minimization:
$$
\min_{\mathbf{b}} \sum_{i,j}\|\mathbf{b} - \mathbf{n}_i\times\mathbf{n}_j\|^2,
$$
so that the resulting $\mathbf{b}$ is, in the least-squares sense, closest to the intersection direction of every pair of planes. In this minimization you need to be careful with signs, because $\mathbf{n}_i\times\mathbf{n}_j = -\mathbf{n}_j\times\mathbf{n}_i$: you want all the intersection direction vectors to point roughly the same way, i.e., $(\mathbf{n}_i\times\mathbf{n}_j)\cdot (\mathbf{n}_k\times\mathbf{n}_l) \geq 0$. The final minimizer is the mean of the $\mathbf{n}_i\times\mathbf{n}_j$; for the case of four planes:
$$
\begin{aligned}
\mathbf{b} =& (\mathbf{n}_1\times\mathbf{n}_2 + \mathbf{n}_1\times\mathbf{n}_3 + \mathbf{n}_1\times\mathbf{n}_4
\\
&+\mathbf{n}_2\times\mathbf{n}_3 + \mathbf{n}_2\times\mathbf{n}_4
\\
&+ \mathbf{n}_3\times\mathbf{n}_4)/6,
\end{aligned}
$$
assuming that, viewed along the intersection line, the directions $\mathbf{n}_1$, $\mathbf{n}_2$, $\mathbf{n}_3$, $\mathbf{n}_4$ rotate counterclockwise. For example, in your picture you would want to flip the fourth normal vector to the opposite direction.
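A sketch of this averaging in code, with the sign convention handled by flipping any cross product that points against the first one; the helper name `mean_direction` and the normals are my own example:

```python
import numpy as np
from itertools import combinations

def mean_direction(normals):
    """Average the pairwise cross products n_i x n_j, flipping signs so
    that they all point roughly the same way, then normalize."""
    crosses = [np.cross(ni, nj) for ni, nj in combinations(normals, 2)]
    ref = crosses[0]
    aligned = [c if np.dot(c, ref) >= 0 else -c for c in crosses]
    b = np.mean(aligned, axis=0)
    return b / np.linalg.norm(b)

# Four planes whose pairwise intersection lines are all parallel to the z-axis.
normals = [np.array([1.0, 0.0, 0.0]),
           np.array([0.0, 1.0, 0.0]),
           np.array([1.0, 1.0, 0.0]) / np.sqrt(2),
           np.array([1.0, -1.0, 0.0]) / np.sqrt(2)]
print(mean_direction(normals))  # all six cross products align with the z-axis
```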
For even larger data sets, please refer to: http://en.wikipedia.org/wiki/Principal_component_analysis
If you write your systems of equations as a matrix as follows:
$$A \vec{x} = \begin{bmatrix} 1 & -3 & 2 \\ 1 & 3 & -2 \\ 0 & -6 & 4 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} -2 \\ 5 \\ 3\end{bmatrix} = \vec{b}$$
then here is a (perhaps) quicker way to determine if the picture looks like the triangle. Note: I don't know how comfortable you are with basic linear algebra concepts, but you only need them to understand the proof of why this is correct. You can apply the method without any understanding of them.
$1$. If all three normal vectors of the planes are multiples of the same vector, then you can immediately conclude you have three parallel
planes (and not the triangle).
$2$. If exactly two normal vectors are multiples of the same vector, then you can immediately conclude you don't have the triangle.
Instead, you have one plane that is cut by two parallel planes.
$3$. If none of the normal vectors are multiples of each other, then it's possible you have the triangle. As you noted, the normal vectors
must be in the same plane, i.e. linearly dependent, so it must follow
that $\det(A) = 0$. If this isn't the case, then you can immediately
conclude that the planes intersect in one point.
$4$. If there is a solution, then $\vec{b}$ should be a linear combination of two linearly independent columns of $A$. (This is because $A \vec{x}$ is just a linear combination of $A$'s columns. If there is a
solution to $A \vec{x} = \vec{b}$ and $A$ has two linearly independent
columns, then $\vec{b}$ should be able to be written as a linear
combination of just those two columns.) Thus, if we replace a linearly
dependent column (i.e. one that can be expressed as a linear
combination of the others) of $A$ with the vector $\vec{b}$ to create
the matrix $A'$, for there to be no solution (i.e. the "triangle"
configuration) it must be the case that $\det(A') \neq 0$. If
$\det(A') = 0$, then you can conclude you have three planes
intersecting in one line (the second picture you've posted).
Fortunately, choosing a linearly dependent column is easy. You
just need to make sure to a) replace a zero column with $\vec{b}$ if
$A$ has a zero column or b) if there are two columns that are (nonzero)
multiples of each other, then replace one of them with $\vec{b}$. And
if none of a) or b) is the case, then you can choose any column.
Example: I'll work through the steps above with the example you've written.
Steps $1$ and $2$. I can immediately notice that none of the normal vectors of the planes are parallel. So we proceed to step $3$.
Step $3$. We can calculate
$$\det(A) = (1)(12 - 12) - (-3)(4 - 0) + 2(-6 - 0) = 0$$
so we proceed to step $4$. Note that if you were able to observe that the third row of $A$ was a linear combination of the first and second row (the third row is simply the first row minus the second row) or that the third column was a multiple of the second column, you could immediately skip to step $4$.
Step $4$. We can notice that none of the columns are zeroes (case a), but in fact the last two columns are multiples of each other. So case b) applies here, and we have to exchange one of the last two columns with $\vec{b}$ for the process to be correct. Let's replace the last column of $A$ with $\vec{b}$ to obtain $A'$:
$$A' = \begin{bmatrix} 1 & -3 & -2 \\ 1 & 3 & 5 \\ 0 & -6 & 3 \end{bmatrix}$$
and we can calculate
$$\det (A') = (1)(9 + 30) - (-3)(3 - 0) + (-2)(-6 - 0) = 39 + 9 + 12 = 60 \neq 0$$
and hence we can conclude we have the "triangle" configuration.
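The four steps above can be sketched in code. The helper name `classify_planes` is my own, the planes are taken from the worked example, and the choice of column to replace is hard-coded to match it (steps 1 and 2, the parallel-normal checks, are assumed to have been done separately):

```python
import numpy as np

def classify_planes(A, b, tol=1e-9):
    """Classify three planes given as rows of A x = b, assuming no two
    normals are parallel."""
    if abs(np.linalg.det(A)) > tol:
        return "single point"        # step 3: normals linearly independent
    # Step 4: replace a linearly dependent column of A with b.  In this
    # example the last two columns are multiples of each other, so we may
    # replace the last column.
    A2 = A.copy()
    A2[:, 2] = b
    if abs(np.linalg.det(A2)) > tol:
        return "triangle"            # no solution: three pairwise lines
    return "common line"             # planes intersect in one line

A = np.array([[1.0, -3.0,  2.0],
              [1.0,  3.0, -2.0],
              [0.0, -6.0,  4.0]])
b = np.array([-2.0, 5.0, 3.0])
print(classify_planes(A, b))  # prints "triangle", matching the example
```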
Conclusion: I think this method is somewhat easier than calculating the three intersection lines. It requires you to calculate two determinants of $3 \times 3$ matrices instead.
Best Answer
The general equation of a plane in 3-D is given by $$(\mathbf{p}-\mathbf{p_0})\cdot\mathbf{n}=0$$ where $\mathbf{p}$ is any general point on the plane, and $\mathbf{p_0}$ is any known point on the plane. $\mathbf{n}$ is a vector normal to the plane.
The equation of a line is given by $$\mathbf{p} = \mathbf{l_0}+t\mathbf{l}$$ where $\mathbf{l_0}$ is any point on the line and $\mathbf{l}$ is its direction vector.
If the line lies in the plane, it must satisfy two conditions:
It must be perpendicular to the normal to the plane, i.e. $\mathbf{l}\cdot\mathbf{n}=0$
$\mathbf{l_0}$ must lie in the plane, i.e. satisfy the plane's equation. So, $(\mathbf{l_0}-\mathbf{p_0})\cdot\mathbf{n} = 0$
You can calculate $\mathbf{p_0,l_0}$ very easily with the information you have. $\mathbf{p_0}$ can be calculated by choosing any two of $x,y,z$ and finding the third to satisfy the equation of the plane. $\mathbf{l_0}$ is precisely $\vec O$ that you already know. You can use them to cross-check whether the fit is good or not. But they do not depend on $\mathbf{l}$. So, they are secondary, and may be used as a sanity check later on.
Suppose you are given the equation of a plane as $ax+by+cz+d = 0$. You can recast it as $[(x,y,z)-\mathbf{p_0}]\cdot(a,b,c) = 0$. So, $(a,b,c)$ is your normal vector.
Suppose you have $m$ planes with normals $\mathbf{n_1,n_2,\ldots,n_m}$. Then, your overall constraints are $$\mathbf{l}\cdot\mathbf{n_1} = 0 \\ \mathbf{l}\cdot\mathbf{n_2} = 0 \\ \vdots \\ \mathbf{l}\cdot\mathbf{n_m}=0$$
It is a system of linear equations with 3 unknowns and $m$ equations. You need to find $\mathbf{l}$, which can be determined up to a constant factor by homogeneous least squares: minimize $\|\mathbf{N}\mathbf{l}\|$ subject to $\|\mathbf{l}\|=1$, where $\mathbf{N}$ stacks the normals as rows; the minimizer is the right singular vector of $\mathbf{N}$ with the smallest singular value. This works provided at least two of the normals are linearly independent, which should not be a worry for you. If you code carefully enough, you can implement all of it in matrix terms.
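A sketch of this in code, using the SVD of the stacked normals; the function name `line_direction` and the normals are made-up examples:

```python
import numpy as np

def line_direction(normals):
    """Direction l minimizing ||N l|| subject to ||l|| = 1: the right
    singular vector of N associated with the smallest singular value."""
    N = np.asarray(normals, dtype=float)
    _, _, Vt = np.linalg.svd(N)   # rows of Vt sorted by singular value
    return Vt[-1]                 # unit vector, determined up to sign

# Hypothetical normals of planes all containing the z-axis.
normals = [[1.0,  0.0, 0.0],
           [0.0,  1.0, 0.0],
           [1.0,  1.0, 0.0],
           [1.0, -1.0, 0.0]]
print(line_direction(normals))  # +- (0, 0, 1)
```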
Hope it helps.