In optimization, it is common to see the so-called $\operatorname{diag}$ function.
Given a vector $x \in \mathbb{R}^n$, $\operatorname{diag}(x)$ is the $n \times n$ diagonal matrix with the components of $x$ on the diagonal.
For example, optimization problems that involve inverting $\operatorname{diag}(x)$.
The reason for using $\operatorname{diag}$ is that it is available on several platforms such as MATLAB, and people generally understand what the function is supposed to do.
Is there a more linear-algebraic, step-by-step way of converting a vector $x \in \mathbb{R}^n$ into a diagonal matrix with its components on the diagonal, without having to define a function that directly performs the task?
I.e., given $x$, we find a sequence of functions/steps $f_2 \circ f_1(x)$ which gives us the same matrix as $\operatorname{diag}(x)$.
Best Answer
There is a closed form.
$\operatorname{diag}(x)=I_n\circ (xu)$, where $\circ$ is the Hadamard product, $I_n$ is the identity matrix, and $u=[1,\cdots,1]$ is the all-ones row vector, so that $xu$ is the rank-one outer product whose every column equals $x$.