You seem to have gone wrong right off the bat in computing the projection matrix. The kernel and image that you’ve come up with for it are correct, but that matrix doesn’t represent orthogonal projection onto the given plane, nor, for that matter, any projection whatsoever. The matrix obviously has rank 1, but since the image of the projection must be the given plane, it should have rank 2. As well, one of the defining properties of a projection $P$ is that $P^2=P$, but that doesn’t hold for the matrix in your question.
It looks like you might’ve computed the projection onto the plane’s normal and then scaled it incorrectly: the common denominator $196$ is equal to $14^2$, i.e., the square of the squared length of the normal that can be read off from the plane’s equation. Correctly working out the projection onto the normal is a good start toward the matrix that projects onto the plane, but you then have to subtract this projection from the original vector, i.e., subtract the resulting matrix from the identity.
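This construction is easy to check numerically. A minimal numpy sketch, where the plane $x+2y+3z=0$ is a hypothetical choice (picked only because its normal $(1,2,3)$ has squared length $14$, matching the denominator $196 = 14^2$ above):

```python
import numpy as np

# Hypothetical plane x + 2y + 3z = 0, chosen for illustration so that
# |n|^2 = 14, matching the denominator 196 = 14^2 discussed above.
n = np.array([1.0, 2.0, 3.0])

# Projection onto the normal line: Q = n n^T / (n^T n).
Q = np.outer(n, n) / n.dot(n)

# Projection onto the plane: subtract Q from the identity.
P = np.eye(3) - Q

# Sanity checks: P is idempotent, has rank 2, and kills the normal.
assert np.allclose(P @ P, P)
assert np.linalg.matrix_rank(P) == 2
assert np.allclose(P @ n, np.zeros(3))
```

Note that $Q$ alone has rank 1 and $Q^2 = Q$, while $P = I - Q$ has the rank-2 image required here.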
That said, there’s no need to construct any matrix whatsoever to solve this problem. By definition, the image of a projection onto a plane is the plane itself. The kernel of any orthogonal projection is the orthogonal complement of the image, which in this case is the set of vectors normal to the plane.
You are right: $(1,1,1)$ is orthogonal to $V$. Therefore, $A.(1,1,1)=(0,0,0)$. Now, consider the vectors $(1,-1,0)$ and $(1,0,-1)$. Since they both belong to $V$, you must have $A.(1,-1,0)=(1,-1,0)$ and $A.(1,0,-1)=(1,0,-1)$.
Now, since$$(1,0,0)=\frac13(1,1,1)+\frac13(1,-1,0)+\frac13(1,0,-1),$$you must have$$A.(1,0,0)=\frac13(1,-1,0)+\frac13(1,0,-1)=\left(\frac23,-\frac13,-\frac13\right).$$So, the entries of the first column of the matrix of $\operatorname{proj}_V$ with respect to the standard basis will be $\frac23$, $-\frac13$ and $-\frac13$. Can you take it from here?
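The three conditions above determine $A$ completely, so you can also recover the whole matrix at once. A short numpy sketch for the plane $V$ given by $x+y+z=0$, as in this answer:

```python
import numpy as np

# Basis vectors and their prescribed images under proj_V
# (plane x + y + z = 0, with normal (1, 1, 1)).
B = np.column_stack([(1, 1, 1), (1, -1, 0), (1, 0, -1)]).astype(float)
images = np.column_stack([(0, 0, 0), (1, -1, 0), (1, 0, -1)]).astype(float)

# A B = images  =>  A = images B^{-1}
A = images @ np.linalg.inv(B)

# The first column matches (2/3, -1/3, -1/3) computed above,
# and A is idempotent, as any projection must be.
assert np.allclose(A[:, 0], [2/3, -1/3, -1/3])
assert np.allclose(A @ A, A)
```

Solving $AB = \text{images}$ for $A$ is exactly the column-by-column computation done by hand above, packaged as one matrix inversion.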
Let's restrict our attention to subspaces $V$ of $\mathbb{R}^3$ rather than $\mathbb{R}^n$. Once this case is understood, you can try to generalize it. It is important to think slowly from the definitions. Geometric intuition will come afterwards (and be correct). I will not recall the definition of orthogonal projection onto a subspace for you, you can look that up in your notes/textbook.
You appear to be confusing several concepts. Let me try to clarify them for you.
Fix a subspace $V \subseteq \mathbb{R}^3$ (this could be the origin, a line through the origin, a plane containing the origin, or the entire space $\mathbb{R}^3$). Let $T_V\colon \mathbb{R}^3 \rightarrow \mathbb{R}^3$ be the linear transformation defined by orthogonal projection onto the subspace $V$.
Any linear transformation has a kernel and an image. They are defined for $T_V$ as follows:
$$\text{image}(T_V) = \left\{ y \in \mathbb{R}^3 \colon \exists x \in \mathbb{R}^3 \text{ such that } T_V(x) = y \right\} $$
$$\text{kernel}(T_V) = \left\{ x \in \mathbb{R}^3 \colon T_V(x) = 0 \right\}$$
(you may note that both the image and the kernel of $T_V$ are subspaces of $\mathbb{R}^3$).
From the first definition, we can show that $$\text{image}(T_V) = V.$$ The proof uses two key facts: the definition of the image of a linear transformation, and the definition of the map $T_V$.
Proof that $\text{image}(T_V) = V$: In order to do this, we show that $\text{image}(T_V) \subseteq V$ and $V \subseteq \text{image}(T_V)$:
For any vector $x \in \mathbb{R}^3$, the orthogonal projection of $x$ onto $V$ is an element of $V$. Thus $\text{image}(T_V) \subseteq V$.
On the other hand, if $x$ is an element of $V$, then $T_V(x) = x$ (the orthogonal projection of a vector in $V$ onto $V$ is itself), so $V\subseteq \text{image}(T_V)$. This completes the proof. $\square$
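Both inclusions can be checked numerically. In the sketch below, the plane $x+y+z=0$ and the formula $P = I - nn^{\mathsf T}/\lVert n\rVert^2$ for the matrix of $T_V$ are assumptions chosen for illustration, not part of the proof above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Concrete choice for illustration: V is the plane x + y + z = 0.
n = np.array([1.0, 1.0, 1.0])
P = np.eye(3) - np.outer(n, n) / n.dot(n)   # matrix of T_V

# image(T_V) ⊆ V: the projection of any vector satisfies the plane equation.
x = rng.standard_normal(3)
assert np.isclose((P @ x).sum(), 0.0)        # (P x) · (1,1,1) = 0

# V ⊆ image(T_V): vectors already in V are fixed by T_V.
v = np.array([2.0, -3.0, 1.0])               # 2 - 3 + 1 = 0, so v ∈ V
assert np.allclose(P @ v, v)
```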
Thus: $$\text{image}(T_V) = V.$$
Now we would like to describe the second space, $\text{kernel}(T_V)$. In order to do this, it is useful to recall that the orthogonal complement of a subspace $V$ is a new subspace defined in the following way: $$V^{\perp} = \left\{ y\in \mathbb{R}^3 : \forall x\in V, \langle x,y\rangle = 0 \right\}.$$ In plain English, $V^{\perp}$ is the set of all vectors that are orthogonal to every vector in $V$.
You should think about why the following statements are true (note that they only make sense if $V$ is a subspace of $\mathbb{R}^3$):

- $V^{\perp}$ is itself a subspace of $\mathbb{R}^3$;
- $V \cap V^{\perp} = \{0\}$;
- $\dim V + \dim V^{\perp} = 3$.
You should also try to draw pictures of some examples.
The following statement contains the intuition you are after. I will leave the proof of this to you.
$$V^{\perp} = \text{kernel}(T_V).$$
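As a numerical sanity check of this statement, here is a small sketch; taking $V$ to be the plane $x+y+z=0$ (so that $V^{\perp}$ is the line spanned by $n = (1,1,1)$) is an assumption made only for illustration:

```python
import numpy as np

# Concrete V for illustration: the plane x + y + z = 0,
# whose orthogonal complement is spanned by n = (1, 1, 1).
n = np.array([1.0, 1.0, 1.0])
P = np.eye(3) - np.outer(n, n) / n.dot(n)   # matrix of T_V

# Every multiple of n lies in the kernel of T_V ...
assert np.allclose(P @ (7 * n), np.zeros(3))

# ... and by rank–nullity the kernel is 1-dimensional (rank P = 2),
# so kernel(T_V) is exactly span{n} = V^perp.
assert np.linalg.matrix_rank(P) == 2
```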