Vectors are often written column-wise as if they were $n\times 1$ matrices:
$$
\mathbf{v} := \begin{bmatrix}
1 \\ 2 \\ 3
\end{bmatrix}
$$
This notation implicitly identifies the vector $\mathbf{v}\in \mathbf{R}^3$ with the $3\times 1$ matrix above, which in turn represents a linear operator
$$
v: \mathbf{R}^1 \to \mathbf{R}^3 \\
v(t) := \begin{bmatrix}
t \\ 2t \\ 3t
\end{bmatrix}
$$
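To make the identification concrete, here is a small NumPy sketch (illustrative only, not part of the original question): applying the operator $v$ to a scalar $t$ is the same as multiplying the $3\times 1$ matrix $\mathbf{v}$ by the $1\times 1$ matrix $[t]$.

```python
import numpy as np

# v viewed as a 3x1 matrix, i.e. a linear map R^1 -> R^3
v = np.array([[1], [2], [3]])

t = 2.0
result = v @ np.array([[t]])   # (3x1) @ (1x1) -> (3x1), the column (t, 2t, 3t)

# The matrix product agrees with ordinary scalar multiplication:
assert np.array_equal(result, t * v)
```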
Thus, identifying a real scalar $\lambda\in\mathbf{R}$ with its corresponding $1$-vector $[\lambda]$, it would seem natural to write scalar multiplication of $\mathbf{v}$ by $\lambda$ as
$$
\mathbf{v}\lambda
$$
to match the usual notation for matrix-vector multiplication, where the operator is written on the left. However, it is more common to see
$$
\lambda \mathbf{v}
$$
where the expression cannot be read as a matrix product, because the $1\times 1$ dimensions of the scalar $\lambda$ are incompatible with the $3\times 1$ matrix $\mathbf{v}$. Why is this second notation, with the scalar on the left, more common?
Best Answer
We do not want to pigeonhole ourselves into thinking of scalars $\lambda$ as their corresponding $1\times 1$ matrices $[\lambda]$. It is always legal to scale any $m\times n$ matrix $A$ by $\lambda$, but not always legal to multiply $A$ by $[\lambda]$ on either side. It is better to think of scalars as being distinct from vectors and matrices.
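As a quick NumPy sketch of this point (my illustration, not from the original answer): scaling a matrix by a scalar always works, but multiplying it by the $1\times 1$ matrix $[\lambda]$ fails on both sides whenever neither dimension is $1$.

```python
import numpy as np

lam = 5.0
A = np.arange(6).reshape(2, 3)   # an arbitrary 2x3 matrix
lam_mat = np.array([[lam]])      # the scalar pigeonholed as a 1x1 matrix

# Scaling by the scalar is always legal, whatever the shape of A:
scaled = lam * A

# Matrix multiplication by [lam] fails on both sides here, since
# neither dimension of A equals 1:
failures = 0
for product in (lambda: lam_mat @ A, lambda: A @ lam_mat):
    try:
        product()
    except ValueError:
        failures += 1
# failures == 2: both orders raise a shape-mismatch error
```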
I do not have a good explanation for why $\lambda \mathbf{v}$ is more common than $\mathbf{v}\lambda$, but it should not stem from thinking of scalars as $1\times 1$ matrices. Perhaps it is related to the convention of putting the coefficient before the monomial when writing polynomials, e.g. $5x^2$. (I see now that Rob Arthan already made this observation in a comment.)