Knowing the set of orthogonal pairs of vectors fixes an inner product up to a constant positive factor. It is clear that such a factor doesn't change the concept of orthogonality. Conversely, we can prove that two inner products with the same set of orthogonal pairs must be related through a constant factor:
Suppose we are given two inner products $\langle -,-\rangle$ and $[-,-]$ that agree about which vectors are orthogonal. Let $v_1,\ldots,v_n$ be an orthonormal basis with respect to $\langle -,-\rangle$. By assumption $[-,-]$ agrees that the $v_i$s are mutually orthogonal, so knowing the value of $[v_i,v_i]$ for each $i$ fixes all of $[-,-]$ by bilinearity.
Now for $i\ne j$ we have
$$\langle v_i+v_j, v_i-v_j\rangle = \langle v_i,v_i\rangle - \langle v_j,v_j\rangle = 1 - 1 = 0 $$
and therefore it must also hold that $[v_i+v_j, v_i-v_j] = 0$; expanding by bilinearity, this gives $[v_i,v_i]-[v_j,v_j] = 0$. Since $i$ and $j$ were arbitrary, all the $[v_i,v_i]$s must be equal, so $[v,w]=a\langle v,w\rangle$ for all $v$, $w$, where $a=[v_1,v_1]$.
If the vector space is infinite-dimensional (such that there is not necessarily any orthogonal Hamel basis for all of it), this argument can be repeated for each finite-dimensional subspace to reach the same conclusion.
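A quick numerical sketch of the conclusion (using NumPy; the scale factor $a$ and dimension $4$ are arbitrary choices for illustration): a rescaled inner product has the same orthogonal pairs as the original, the vectors $v_i+v_j$ and $v_i-v_j$ are orthogonal under both products, and the ratio between the two products is the constant $[v_1,v_1]$.

```python
import numpy as np

rng = np.random.default_rng(0)

a = 2.5  # arbitrary positive scale factor, chosen for illustration
inner1 = lambda v, w: v @ w          # the standard inner product on R^4
inner2 = lambda v, w: a * (v @ w)    # a rescaled product: same orthogonal pairs

e = np.eye(4)  # rows are the standard (orthonormal) basis vectors

# The step from the argument: [e_i + e_j, e_i - e_j] vanishes for i != j.
for i in range(4):
    for j in range(4):
        if i != j:
            assert abs(inner2(e[i] + e[j], e[i] - e[j])) < 1e-12

# Hence all [e_i, e_i] are equal, and inner2 = a * inner1 with a = [e_1, e_1]:
v, w = rng.standard_normal(4), rng.standard_normal(4)
assert abs(inner2(v, w) - inner2(e[0], e[0]) * inner1(v, w)) < 1e-9
```

The check passing for random $v$, $w$ reflects that once the two products agree on the basis, bilinearity forces agreement everywhere.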
As for the second question, the Gram--Schmidt process shows that every inner product on $\mathbb R^n$ is the same, modulo a change of basis. But in situations where a preferred basis is imposed on us externally, it certainly makes sense to consider different inner products.

The most intuitively vivid examples come from differential geometry, where "custom" inner products are used to connect arbitrary, not-necessarily-rectilinear coordinate systems with geometric reality. For example, consider geographic coordinates on the Earth. If we have two lines on a map (of a not too large area) given by coordinates in degrees, it makes sense to ask for the angle between them. We can compute that angle using an inner product -- but because a degree of longitude is shorter than a degree of latitude (except at the equator), it needs to be a non-standard inner product in order to give geometrically meaningful results.
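The geographic example can be made concrete with a short sketch (the latitude $60^\circ$ and the two map directions are illustrative choices): for small displacements $(\Delta\text{lat}, \Delta\text{lon})$ measured in degrees, a degree of longitude covers roughly $\cos(\text{lat})$ times the ground distance of a degree of latitude, so the geometrically meaningful inner product uses the matrix $\operatorname{diag}(1, \cos^2(\text{lat}))$ rather than the identity.

```python
import numpy as np

lat = np.radians(60.0)  # reference latitude; cos(60 deg) = 0.5
# Non-standard inner product for (d_lat, d_lon) displacements in degrees:
G = np.diag([1.0, np.cos(lat) ** 2])

def angle(u, v, M):
    """Angle in degrees between u and v under the inner product x^T M y."""
    cos_t = (u @ M @ v) / np.sqrt((u @ M @ u) * (v @ M @ v))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

u = np.array([1.0, 0.0])   # due north on the map
v = np.array([1.0, 1.0])   # "diagonal" in coordinate degrees

print(angle(u, v, np.eye(2)))  # naive map angle: 45 degrees
print(angle(u, v, G))          # geometric angle on the ground: about 26.6 degrees
```

The standard inner product reports $45^\circ$, but on the ground the eastward leg is only half as long as it looks in coordinates, so the true angle is noticeably smaller.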
Usually the term "orthogonal matrix" is reserved for matrices whose columns are not only mutually perpendicular, but also unit vectors. So if you were to divide each entry in your matrix above by $2$, it would be an orthogonal matrix.
There appears to be no standard term for a matrix whose columns are just orthogonal without any restriction on their norms.
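The question's original matrix isn't reproduced here, so as a stand-in, the following sketch uses a $4\times 4$ matrix of $\pm 1$ entries (a Hadamard-type matrix) whose columns are mutually perpendicular and all have length $2$; dividing by $2$ normalizes the columns and yields an orthogonal matrix in the standard sense, which can be checked via $Q^{\mathsf T}Q = I$.

```python
import numpy as np

# Columns are mutually perpendicular but each has length 2, not 1
# (a hypothetical stand-in for the matrix in the question):
A = np.array([[1.0,  1.0,  1.0,  1.0],
              [1.0, -1.0,  1.0, -1.0],
              [1.0,  1.0, -1.0, -1.0],
              [1.0, -1.0, -1.0,  1.0]])

# A^T A is diagonal (perpendicular columns) but equals 4*I, not I:
assert np.allclose(A.T @ A, 4 * np.eye(4))

# Dividing every entry by 2 normalizes the columns to unit length,
# giving an orthogonal matrix: Q^T Q = I.
Q = A / 2
assert np.allclose(Q.T @ Q, np.eye(4))
```

The test $Q^{\mathsf T}Q = I$ packages both conditions at once: off-diagonal zeros say the columns are perpendicular, and diagonal ones say each column is a unit vector.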
I think what is confusing you is that the adjectives "orthogonal" and "perpendicular" are synonyms in ordinary English. In mathematics their precise meanings depend on the context.
For nonzero vectors they mean the same thing.
An orthogonal matrix is a matrix whose columns (which are vectors) are orthogonal to each other (and each individual column has length $1$).
PS. The product of two matrices is not their "inner product".