My answer from the MO thread:
A matrix is just a list of numbers, and you're allowed to add and multiply matrices by combining those numbers in a certain way. When you talk about matrices, you're allowed to talk about things like the entry in the 3rd row and 4th column, and so forth. In this setting, matrices are useful for representing things like transition probabilities in a Markov chain, where each entry indicates the probability of transitioning from one state to another. You can do lots of interesting numerical things with matrices, and these interesting numerical things are very important because matrices show up a lot in engineering and the sciences.
In linear algebra, however, you instead talk about linear transformations, which are not (I cannot emphasize this enough) a list of numbers, although sometimes it is convenient to use a particular matrix to write down a linear transformation. The difference between a linear transformation and a matrix is not easy to grasp the first time you see it, and most people would be fine with conflating the two points of view. However, when you're given a linear transformation, you're not allowed to ask for things like the entry in its 3rd row and 4th column because questions like these depend on a choice of basis. Instead, you're only allowed to ask for things that don't depend on the basis, such as the rank, the trace, the determinant, or the set of eigenvalues. This point of view may seem unnecessarily restrictive, but it is fundamental to a deeper understanding of pure mathematics.
Let $\|u\|=1$ and $Q=I-2uu^\top$.
Of course, the first condition above says that $u^\top u=1$. Secondly, the meaning of "$v$ is perpendicular to $u$" is that $v^\top u= u^\top v=0$.
"Compute $Qu$ and simplify as much as possible." Does this just mean moving the equation around to get $Qu$?
No, $Qu$ is a matrix product. Starting with the above, you have $Qu=(I-2uu^\top)u=Iu-2uu^\top u$. What does this reduce to?
Suppose $v$ is orthogonal to $u$. Compute $Qv$. I'm not sure how this is done.
Again, it's just a matrix product. $Qv=(I-2uu^\top)v=Iv-2uu^\top v$. What does this reduce to?
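Both computations are easy to check numerically. Here is a sketch using NumPy with a hypothetical unit vector $u$ (any unit vector works; these particular values are not from the original problem):

```python
import numpy as np

u = np.array([3.0, 4.0]) / 5.0          # a sample unit vector, ||u|| = 1
Q = np.eye(2) - 2 * np.outer(u, u)      # Q = I - 2 u u^T

# Qu = u - 2u(u^T u) = u - 2u = -u
print(np.allclose(Q @ u, -u))           # True

v = np.array([-4.0, 3.0]) / 5.0         # perpendicular to u: u^T v = 0
# Qv = v - 2u(u^T v) = v - 0 = v
print(np.allclose(Q @ v, v))            # True
```

The check agrees with the algebra: $u^\top u=1$ makes $Qu$ collapse to $-u$, and $u^\top v=0$ makes $Qv$ collapse to $v$.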
Then I'm asked to explain in plain English which subspace $Q$ is reflecting across.
This is an interesting question which is not as mechanical as the rest of the problem. For one of the two parts above, you'll discover that $Qx=-x$. Geometrically, this means that $Q$ just reversed the direction of that vector.
The other computation is going to come out to $Qy=y$, meaning that $Q$ didn't alter the vector at all! But remember that the above had you assume that $x$ and $y$ are perpendicular to each other, so this means that one direction was reversed, and all perpendicular directions were left alone. If you fix a vector $x$, what does the collection of all perpendicular vectors look like?
Compute the reflection matrix $Q_1=I-2u_1u_1^\top$ where $u_1=(0,1)$. Compute $Q_1x_1$, where $x_1=(0,1)$, and sketch the vectors $u_1$, $x_1$, and $Q_1x_1$ in the plane.
This one is easy to start because it begins with "do this computation." What part is holding you back? Putting the givens into the equations? The matrix addition and multiplication?
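If it helps to check your hand computation, here is a minimal numerical sketch of the same steps (using the values given in the problem, where $x_1$ happens to equal $u_1$):

```python
import numpy as np

u1 = np.array([0.0, 1.0])
Q1 = np.eye(2) - 2 * np.outer(u1, u1)   # Q1 = I - 2 u1 u1^T
x1 = np.array([0.0, 1.0])

print(Q1)          # [[1, 0], [0, -1]]
print(Q1 @ x1)     # [0, -1]: x1 is reversed, since x1 = u1
```

Since $x_1=u_1$ here, the sketch should show $Q_1x_1$ pointing opposite to $u_1$, consistent with $Qu=-u$ from before.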
Best Answer
The way I think of a difference matrix is to start with the identity matrix:
$$\begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 1 \end{bmatrix}$$
Multiplying it by the vector $x$ just returns $x$ itself:
$$\begin{bmatrix} x_{1}\\ x_{2}\\ x_{3} \end{bmatrix}$$
But if we want to end up with each component of $x$ being the difference between itself and the preceding component, we want to end up with something like this:
$$\begin{bmatrix} x_{1}\\ x_{2} - x_{1}\\ x_{3} - x_{2} \end{bmatrix}$$
We can do that easily by modifying the identity matrix so that it picks up the previous $x_{i}$ as a negative. Here's the modified identity matrix. Notice the positions of the $-1$.
$$\begin{bmatrix} 1 & 0 & 0\\ -1 & 1 & 0\\ 0 & -1 & 1 \end{bmatrix}$$
That is your difference matrix.
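A quick numerical check of the construction above, with a hypothetical sample vector (the values are only for illustration):

```python
import numpy as np

# The 3x3 difference matrix built from the identity above
D = np.array([[ 1,  0,  0],
              [-1,  1,  0],
              [ 0, -1,  1]])

x = np.array([5, 7, 10])    # a sample vector
print(D @ x)                # [5, 2, 3], i.e. [x1, x2 - x1, x3 - x2]
```

Each row picks up the current component with a $1$ and subtracts the previous one via the $-1$ just below the diagonal.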
If you want a centered difference matrix instead, we want to end up with:
$$\begin{bmatrix} x_{2}\\ x_{3} - x_{1}\\ -x_{2} \end{bmatrix}$$
Notice we returned to $x_{2}$ in the bottom row, but as a negative. That's because for the last row we're taking "nothing" minus $x_{2}$, as there are no more components left in the vector, just as we took $x_{3} - x_{1}$ above it. Another way to look at this:
$$\begin{bmatrix} x_{2} - 0\\ x_{3} - x_{1}\\ 0 - x_{2} \end{bmatrix}$$
Now let's go back to our identity matrix. Before we start plugging in $-1$s, note that we need to shift the components so that we start with $x_{2}$, not $x_{1}$ as we did with the difference matrix.
$$\begin{bmatrix} 0 & 1 & 0\\ 0 & 0 & 1\\ 0 & 0 & 0 \end{bmatrix}$$
Now for the $-1$s, we need to skip a term, so you'll see gaps of $0$ between the $1$s and $-1$s.
$$\begin{bmatrix} 0 & 1 & 0\\ -1 & 0 & 1\\ 0 & -1 & 0 \end{bmatrix}$$
And there you have your centered difference matrix.
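As with the ordinary difference matrix, this one can be checked numerically with a hypothetical sample vector:

```python
import numpy as np

# The 3x3 centered difference matrix built above
C = np.array([[ 0,  1,  0],
              [-1,  0,  1],
              [ 0, -1,  0]])

x = np.array([5, 7, 10])    # a sample vector
print(C @ x)                # [7, 5, -7], i.e. [x2, x3 - x1, -x2]
```

The first and last rows show the boundary behavior described above: "nothing" on one side of the difference, so the result is just $x_2$ and $-x_2$.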