The intuitive definition is only intuitive; mathematically it is wrong, since it cannot distinguish between $\delta$ and $5 \delta$.
Formally, $\delta$ is understood as a function of functions:
$$\delta(f)=f(0)$$
Your relation
\begin{equation}
\frac{\mathrm d^2}{\mathrm dx^2} \big| x \big| = 2 \delta .
\end{equation}
means that for all infinitely differentiable functions $f$ with compact support we have
$$
\int_{-\infty}^\infty f''(x) |x| dx = 2 f(0) = 2 \delta(f)
$$
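To make this concrete, here is a numerical check of the identity above (a sketch, assuming Python with sympy and scipy is available; the particular bump function is my choice, any smooth $f$ supported in $(-1,1)$ would do):

```python
# Numerical check: for a smooth compactly supported test function f,
# the integral of f''(x)|x| equals 2 f(0).
import sympy as sp
from scipy.integrate import quad

x = sp.symbols('x')
bump = sp.exp(-1 / (1 - x**2))          # smooth bump, supported in (-1, 1)
f = sp.lambdify(x, bump, 'numpy')
f2 = sp.lambdify(x, sp.diff(bump, x, 2), 'numpy')

# Integrate f''(x)|x| over the support; split at the kink x = 0 and
# stay slightly inside (-1, 1) to avoid division by zero in the formula.
lhs = quad(lambda t: f2(t) * abs(t), -1 + 1e-9, 1 - 1e-9,
           points=[0.0], limit=200)[0]
rhs = 2 * f(0.0)
print(lhs, rhs)   # both ≈ 2/e ≈ 0.7358
```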
Added clarification: If $g$ is a twice differentiable function with $g''$ continuous, then for all infinitely differentiable functions $f$ with compact support we have
$$
\int_{-\infty}^\infty g''(x) f(x) dx =\int_{-\infty}^\infty g(x) f''(x) dx
$$
Note that this identity doesn't necessarily hold if $f$ is not compactly supported. This is the reason why we always test distributions on compactly supported functions.
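A quick numerical sanity check of this double integration by parts (a sketch, assuming Python with sympy and scipy; the choices $g(x)=e^x$ and the standard bump as $f$ are mine):

```python
# Verify ∫ g''(x) f(x) dx = ∫ g(x) f''(x) dx for a smooth g and a
# compactly supported smooth test function f.
import sympy as sp
from scipy.integrate import quad

x = sp.symbols('x')
f_expr = sp.exp(-1 / (1 - x**2))     # smooth bump, supported in (-1, 1)
g_expr = sp.exp(x)                   # a C^2 function; here g'' = e^x

f  = sp.lambdify(x, f_expr, 'numpy')
f2 = sp.lambdify(x, sp.diff(f_expr, x, 2), 'numpy')
g  = sp.lambdify(x, g_expr, 'numpy')
g2 = sp.lambdify(x, sp.diff(g_expr, x, 2), 'numpy')

a, b = -1 + 1e-9, 1 - 1e-9           # stay inside the support of f
lhs = quad(lambda t: g2(t) * f(t), a, b)[0]
rhs = quad(lambda t: g(t) * f2(t), a, b)[0]
print(lhs, rhs)                      # the two integrals agree
```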
The idea of distributions is the following: consider a linear functional
$$u : C_c^\infty(\mathbb R ^d) \to \mathbb C$$
which is also continuous (we will not go into the topology involved here).
$\delta$ is one such distribution. If we have a locally integrable function $g$ (for instance, any continuous function), we can interpret it as a distribution in the following way:
$$u_g(f) = \int_{-\infty}^\infty f(x) g(x) dx $$
This definition makes sense because $f$ has compact support.
Now, if $g$ is a function which is differentiable everywhere, integration by parts (using that $f$ has compact support) gives:
$$\int_{-\infty}^\infty f(x)g'(x) dx =- \int_{-\infty}^\infty f'(x)g(x) dx$$
or
$$u_{g'}(f)=-u_g(f')$$
This suggests that we can differentiate any distribution by defining:
$$u'(f):= -u(f')$$
With this definition, $\delta$ is the derivative of the Heaviside function $H$, defined by $H(x)=1$ when $x \geq 0$ and $H(x)=0$ when $x < 0$. Moreover, the derivative of $\delta$ is
$$f \to -f'(0)$$
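These definitions can be sketched in a few lines of code (a toy construction of my own, not a standard library API; the derivative of the test function is approximated by central differences, so the results are approximate):

```python
# A distribution is a functional on test functions; its derivative
# acts by u'(f) := -u(f').
import math
from scipy.integrate import quad

def derivative(u):
    """Distributional derivative: u'(f) := -u(f')."""
    def fprime(f, h=1e-6):
        return lambda t: (f(t + h) - f(t - h)) / (2 * h)
    return lambda f: -u(fprime(f))

delta = lambda f: f(0.0)                 # delta(f) = f(0)

# Heaviside step as a distribution: u_H(f) = ∫_0^∞ f(x) dx.
# The test function below decays fast, so a finite upper limit suffices.
H = lambda f: quad(f, 0.0, 10.0)[0]

f = lambda t: math.exp(-(t - 1.0) ** 2)  # rapidly decaying test function

print(derivative(H)(f), delta(f))   # H' = delta: both ≈ f(0)
print(derivative(delta)(f))         # delta'(f) = -f'(0)
```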
For differentiable functions, the distributional derivative coincides with the classical derivative. But in this more general sense, many new functions become differentiable.
Note that much of what I wrote only makes sense because we use compactly supported test functions.
Finally, there is a theorem which says that if $g$ has a continuous derivative on $(0, \infty)$ and $(-\infty, 0)$ and has a jump discontinuity at $x=0$, then in the distributional sense the derivative is
$$g'+C \delta$$
where $g'$ is the classical derivative on $\mathbb R \backslash \{ 0\}$ and $C$ is the size of the jump at $0$. Applying this result twice to $|x|$ gives the formula above.
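Spelling out the two applications of this theorem to $|x|$: first, $|x|$ is continuous at $0$ (jump $C = 0$) with classical derivative $\operatorname{sgn}(x)$ away from $0$, so
$$\frac{\mathrm d}{\mathrm dx}|x| = \operatorname{sgn}(x) + 0 \cdot \delta = \operatorname{sgn}(x);$$
second, $\operatorname{sgn}$ has classical derivative $0$ on $\mathbb R \backslash \{0\}$ and jumps from $-1$ to $1$ at $0$ (jump $C = 2$), so
$$\frac{\mathrm d}{\mathrm dx}\operatorname{sgn}(x) = 0 + 2 \delta = 2 \delta.$$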
If you draw $|x|$, you will see that at $x=0$ the slope depends on the direction from which you approach the point, so trying to assign the derivative a single numerical value there does not make sense. Your expressions are correct, although you could try adopting the convention that the derivative is infinite at $x=0$.
For your second point, the chain rule is missing from your calculation:
$$
\frac{d\theta(-x)}{dx} = \left.\frac{d(-x)}{dx} \frac{d\theta(y)}{dy}\right\vert_{y=-x} = - \delta(y)\vert_{y=-x} = -\delta(-x)=-\delta(x) .
$$
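The sign can also be confirmed directly from the distributional definition $u'(f) = -u(f')$: $\theta(-x)$ acts on a test function as $\int_{-\infty}^0 f(x)\, dx$, so its derivative sends $f$ to $-\int_{-\infty}^0 f'(x)\, dx = -f(0)$, i.e. $-\delta$. A numerical sketch (assuming Python with scipy; the test function is my choice):

```python
import math
from scipy.integrate import quad

# A rapidly decaying test function and its exact derivative
f = lambda t: math.exp(-(t - 1.0) ** 2)
fp = lambda t: -2.0 * (t - 1.0) * math.exp(-(t - 1.0) ** 2)

# u = theta(-x) acts as ∫_{-∞}^0 f; f is negligible below -10,
# so a finite lower limit suffices.
lhs = -quad(fp, -10.0, 0.0)[0]   # u'(f) = -u(f')
rhs = -f(0.0)                    # (-delta)(f)
print(lhs, rhs)                  # both ≈ -f(0)
```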
Best Answer
No, they are not. The absolute value function is differentiable at $x$ if and only if $x\ne0$, and when that happens, the derivative is indeed $\operatorname{sgn}(x)$ or $\frac x{|x|}$. But since the derivative of the absolute value function is undefined at $0$, while $\operatorname{sgn}$ is defined at that point, the two functions have distinct domains and therefore cannot possibly be equal.
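The failure at $x = 0$ can be seen concretely from the one-sided difference quotients of $|x|$ there (a quick Python sketch):

```python
# One-sided difference quotients of |x| at 0: the limit from the
# right is 1 and from the left is -1, so the two-sided limit that
# would define the derivative at 0 does not exist.
hs = [10.0 ** -k for k in range(1, 8)]
right = [(abs(0.0 + h) - abs(0.0)) / h for h in hs]
left = [(abs(0.0 - h) - abs(0.0)) / (-h) for h in hs]
print(right[-1], left[-1])   # 1.0 -1.0
```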