If you want to find the proximal operator of $\|x\|_{\infty}$, you don't want to compute the subgradient directly. Rather, as the previous answer mentioned, we can use the Moreau decomposition:
$$ v = \textrm{prox}_{f}(v) + \textrm{prox}_{f^*}(v)$$
where $f^*$ is the convex conjugate, given by:
$$ f^*(x) = \underset{y}{\sup}\;(x^Ty - f(y))$$
In the case of norms, the convex conjugate is the indicator function of the dual-norm unit ball: if $f(x) = \|x\|_p$ for $p \geq 1$, then $f^*(x) = 1_{\{\|x\|_q \leq 1\}}(x)$, where $1/p + 1/q = 1$ (with the convention that $p = \infty$ pairs with $q = 1$), and the indicator function is:
\begin{equation}
1_S(x)=\begin{cases}
0, & \text{if $x \in S$}.\\
\infty, & \text{if $x \notin S$}.
\end{cases}
\end{equation}
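As a quick one-dimensional sanity check (my own example, not from the original), take $f(x) = |x|$, i.e. $p = 1$, $q = \infty$:
\begin{equation}
f^*(x) = \sup_y\,(xy - |y|) = \begin{cases}
0, & \text{if $|x| \leq 1$}\\
\infty, & \text{if $|x| > 1$}
\end{cases} = 1_{\{|x| \leq 1\}}(x),
\end{equation}
since for $|x| \leq 1$ we have $xy - |y| \leq 0$ with equality at $y = 0$, while for $|x| > 1$ taking $y = t\,\mathrm{sign}(x)$ with $t \to \infty$ makes $xy - |y| = t(|x| - 1) \to \infty$.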
For your particular question, $f(x) = \|x\|_{\infty}$, so $f^*(x) = 1_{\{\|x\|_1\leq 1\}}(x)$.
We know
$$\textrm{prox}_{f}(x) = x - \textrm{prox}_{f^*}(x)$$
Thus we need to find
$$\textrm{prox}_{f^*}(x) = \underset{z}{\arg\min} \; \left(1_{\{\|z\|_1 \leq 1\}}(z) + \frac{1}{2}\|z - x\|_2^2 \right)$$
But this is simply the Euclidean projection onto the $L_1$ ball, so the prox of the infinity norm is given by:
$$ \textrm{prox}_{\|\cdot\|_{\infty}}(x) = x - \textrm{Proj}_{\{\|\cdot\|_1 \leq 1\}}(x)$$
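To make this concrete, here is a small numerical sketch (my own illustration, not part of the original answer) that computes $\operatorname{prox}_{\|\cdot\|_{\infty}}$ via the Moreau decomposition, using the standard sort-based $L_1$-ball projection; the function names are mine:

```python
import numpy as np

def proj_l1_ball(x, radius=1.0):
    """Euclidean projection onto the L1 ball (sort-based method)."""
    if np.sum(np.abs(x)) <= radius:
        return x.copy()  # already inside the ball
    u = np.sort(np.abs(x))[::-1]          # magnitudes, descending
    css = np.cumsum(u)
    # largest k with u[k] * (k+1) > css[k] - radius
    k = np.nonzero(u * np.arange(1, len(x) + 1) > css - radius)[0][-1]
    theta = (css[k] - radius) / (k + 1.0)
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def prox_linf(x):
    """prox of the infinity norm via Moreau: x - Proj_{L1 ball}(x)."""
    return x - proj_l1_ball(x)

x = np.array([3.0, -1.0, 0.5])
p = prox_linf(x)   # -> [2., -1., 0.5]: the largest entry is pulled down
```

Note that if $\|x\|_1 \leq 1$, the projection is $x$ itself, so the prox is $0$ (consistent with $\partial\|\cdot\|_{\infty}$ at $0$ being the $L_1$ unit ball).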
The best reference for this is Neal Parikh and Stephen Boyd, Proximal Algorithms.
I'll attempt to explain the intuition here.
An affine minorant of $h$ with slope $y$ is a function $x \mapsto \langle y, x \rangle - \alpha$ that lies below $h$ everywhere. There may be many affine minorants of $h$ with a given slope $y$, but we only care about the best one (the one with the smallest $\alpha$):
\begin{align}
&h(x) \geq \langle y , x \rangle - \alpha \quad \text{for all } x \\
\iff & \alpha \geq \langle y, x \rangle - h(x) \quad \text{for all } x \\
\iff & \alpha \geq \sup_x \, \langle y, x \rangle - h(x) \\
\iff & \alpha \geq h^*(y).
\end{align}
Thus, the best choice of $\alpha$ is $h^*(y)$.
(If there is no affine minorant of $h$ with slope $y$, then $h^*(y) = \infty$.)
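As a quick numerical sketch of the definition (my own illustration; the function name is mine), we can approximate $h^*(y) = \sup_x\,(\langle y, x\rangle - h(x))$ by brute-force maximization over a grid, and check it on $h(x) = x^2/2$, which is self-conjugate:

```python
import numpy as np

def numeric_conjugate(h, y, grid):
    """Approximate h*(y) = sup_x (x*y - h(x)) by maximizing over a finite grid."""
    return np.max(grid * y - h(grid))

grid = np.linspace(-10.0, 10.0, 100001)
h = lambda x: 0.5 * x**2   # h(x) = x^2/2 satisfies h* = h, i.e. h*(y) = y^2/2

vals = [numeric_conjugate(h, y, grid) for y in (-2.0, 0.0, 1.5)]
# each entry is (numerically) y^2/2: [2.0, 0.0, 1.125]
```

The grid trick only works when the supremum is attained inside the grid; for $|y|$ too large the true conjugate may be $\infty$ while the grid maximum stays finite.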
Suppose that
\begin{equation}
v \in \partial h(u).
\end{equation}
This means: there exists some affine minorant of $h$ with slope $v$ which is exact at $u$ (i.e. it attains equality with $h$ at $u$).
Of all affine minorants of $h$ with slope $v$, the best one (the closest one) is $a(x) = \langle v, x \rangle - h^*(v)$.
Since $a$ is the best affine minorant of $h$ with slope $v$, and since some affine minorant with slope $v$ is exact at $u$, it follows that $a$ is exact at $u$:
\begin{equation}
h(u) = \langle v, u \rangle - h^*(v)
\end{equation}
Otherwise $a$ would not be the best.
Hence
\begin{align}
h^*(v) &= \langle u,v \rangle - h(u) \\
&= \langle u, v \rangle - h^{**}(u),
\end{align}
using $h = h^{**}$ (valid when $h$ is closed, convex, and proper), and we know that $\langle u, v \rangle - h^{**}(u)$ is an affine minorant of $h^*$ (apply the discussion above to $h^*$, whose conjugate is $h^{**}$).
Thus we have found an affine minorant of $h^*$ with slope $u$ which is exact at $v$. This means that
\begin{equation}
u \in \partial h^*(v).
\end{equation}
In summary, note the beautiful symmetry that allowed our key step:
\begin{equation}
h(u) = \langle v, u \rangle - h^*(v) \qquad \text{ " $v$ is a subgradient of $h$ "}
\end{equation}
becomes
\begin{equation}
h^*(v) = \langle u, v \rangle - h(u) \qquad \text{ " $u$ is a subgradient of $h^*$ "}.
\end{equation}
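The symmetry above can be tested numerically via the Fenchel–Young equality $h(u) + h^*(v) = \langle u, v \rangle$, which holds exactly when $v \in \partial h(u)$ (equivalently $u \in \partial h^*(v)$). A minimal sketch of my own for $h(u) = |u|$, whose conjugate is the indicator of $[-1, 1]$:

```python
# h(u) = |u|  =>  h*(v) = 0 if |v| <= 1, else infinity (indicator of [-1, 1]).

def h(u):
    return abs(u)

def h_star(v):
    return 0.0 if abs(v) <= 1 else float("inf")

def fenchel_equality(u, v):
    """True iff h(u) + h*(v) == u*v, i.e. v in dh(u), equivalently u in dh*(v)."""
    return h(u) + h_star(v) == u * v

# v = 1 is a subgradient of |.| at u = 2 (and indeed 2 lies in the normal
# cone of [-1, 1] at v = 1, which is [0, infinity)):
ok = fenchel_equality(2.0, 1.0)       # True
bad = fenchel_equality(2.0, 0.5)      # False: 0.5 is not a subgradient at 2
at_zero = fenchel_equality(0.0, 0.3)  # True: any |v| <= 1 works at u = 0
```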
Best Answer
Given a norm function $ f \left( x \right) = \left\| x \right\| $ (any norm), its prox is given by:
$$ \operatorname{Prox}_{\lambda f \left( \cdot \right)} \left( v \right) = v - \lambda \operatorname{Proj}_{ \mathcal{B}_{ \left\| \cdot \right\|_{\ast} } } \left( \frac{v}{\lambda} \right) $$
where $ \operatorname{Proj}_{ \mathcal{B}_{ \left\| \cdot \right\|_{\ast} } } \left( \cdot \right) $ is the orthogonal (Euclidean) projection operator onto $ \mathcal{B}_{ \left\| \cdot \right\|_{\ast} } $, the unit ball of the dual norm.
In your case we're dealing with the $ {L}_{2} $ norm, which is self-dual.
Moreover, the projection onto the $ {L}_{2} $ unit ball is given by:
$$ \operatorname{Proj}_{ \mathcal{B}_{ \left\| \cdot \right\|_{2} } } \left( x \right) = \begin{cases} \frac{x}{ \left\| x \right\|_{2} } & \text{ if } \left\| x \right\|_{2} > 1 \\ x & \text{ if } \left\| x \right\|_{2} \leq 1 \end{cases} $$
In summary:
$$ \operatorname{Prox}_{\lambda \left\| \cdot \right\|_{2}} \left( v \right) = v - \lambda \operatorname{Proj}_{ \mathcal{B}_{ \left\| \cdot \right\|_{2} } } \left( \frac{v}{\lambda} \right) = \begin{cases} v - \lambda \frac{ \frac{v}{\lambda} }{ \left\| \frac{v}{\lambda} \right\|_{2} } & \text{ if } \left\| \frac{v}{\lambda} \right\|_{2} > 1 \\ v - \lambda \frac{v}{\lambda} & \text{ if } \left\| \frac{v}{\lambda} \right\|_{2} \leq 1 \end{cases} = \left(1 - \frac{\lambda}{ \left\| v \right\|_{2} } \right)_{+} v $$
Where $ \left( x \right)_{+} = \max \left\{ 0, x \right\} $.
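A short numerical sketch of the resulting operator, often called block soft thresholding (my own illustration; the function name is mine):

```python
import numpy as np

def prox_l2_norm(v, lam):
    """prox of lam * ||.||_2: shrink v toward the origin by lam,
    snapping to 0 when ||v||_2 <= lam (block soft thresholding)."""
    norm = np.linalg.norm(v)
    if norm <= lam:
        return np.zeros_like(v)
    return (1.0 - lam / norm) * v

v = np.array([3.0, 4.0])      # ||v||_2 = 5
p = prox_l2_norm(v, 1.0)      # (1 - 1/5) * v = [2.4, 3.2]
z = prox_l2_norm(v, 6.0)      # lam >= ||v||_2, so the prox is 0
```

As a sanity check on the nonzero case, $v - p = \lambda\, v / \|v\|_2$, which is exactly $\lambda$ times a subgradient of $\|\cdot\|_2$ at $p \neq 0$, as the prox optimality condition requires.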