I'm not sure, but I think there are two problems with the formula you used to approximate $f''(x_0)$:
1. It uses the same discretization step for the approximations of the first and the second derivative. (It's like computing a single directional derivative of a function of two variables: that derivative might exist, but its existence does not imply that the differential exists.)
2. It's a centered finite-difference formula, which therefore vanishes for any function that is odd around $x_0$ (or even diverges to infinity if $f$ is odd around $x_0$ but $f(x_0)\neq0$, in which case $f$ is certainly discontinuous at $x_0$).
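Both failure modes of the centered scheme can be seen numerically. This is a small sketch (the helper name `centered_second_diff` is mine, not from the original post): $f(x)=x|x|$ is odd around $0$ and has no second derivative there, yet the centered formula reports exactly $0$; a sign-type function with $f(0)\neq 0$ makes the quotient blow up like $-2/h^2$.

```python
def centered_second_diff(f, x0, h):
    """Centered second-order difference: (f(x0+h) - 2 f(x0) + f(x0-h)) / h^2."""
    return (f(x0 + h) - 2 * f(x0) + f(x0 - h)) / h**2

# f(x) = x|x| is odd around 0 and f''(0) does not exist (f'(x) = 2|x|),
# yet the centered formula returns 0 for every h.
f_odd = lambda x: x * abs(x)
for h in (1e-1, 1e-3, 1e-6):
    print(centered_second_diff(f_odd, 0.0, h))   # 0.0 every time

# g(x) = sign(x) for x != 0 with g(0) = 1 is odd away from 0 but g(0) != 0,
# so the numerator is the constant -2 and the quotient diverges like -2/h^2.
g = lambda x: (1.0 if x > 0 else -1.0) if x != 0 else 1.0
for h in (1e-1, 1e-3):
    print(centered_second_diff(g, 0.0, h))
```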
But I think the idea would work if the increments used in the approximations of the first and second derivatives were different, and the discretization formulas were non-centered. Namely
$$f''(x)\simeq \frac{f'(x+h)-f'(x)}{h}$$
and then
$$f'(x+h)\simeq \frac{f(x+h+k)-f(x+h)}{k}$$
$$f'(x)\simeq \frac{f(x+k)-f(x)}{k}$$
which gives
$$f''(x)\simeq \frac{\dfrac{f(x+h+k)-f(x+h)}{k}-\dfrac{f(x+k)-f(x)}{k}}{h}=\frac{f(x+h+k)-f(x+h)-f(x+k)+f(x)}{hk}.$$
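For a smooth function this two-increment scheme does converge as $h$ and $k$ shrink independently. Here is a quick check (Python, with a hypothetical helper name `second_diff`) against $f=\sin$, whose second derivative is $-\sin$:

```python
import math

def second_diff(f, x, h, k):
    """Two-increment forward scheme:
    (f(x+h+k) - f(x+h) - f(x+k) + f(x)) / (h k)."""
    return (f(x + h + k) - f(x + h) - f(x + k) + f(x)) / (h * k)

x = 1.0
exact = -math.sin(x)  # f''(x) for f = sin
for h, k in [(1e-3, 1e-3), (1e-3, 1e-5), (1e-4, 1e-4)]:
    approx = second_diff(math.sin, x, h, k)
    print(h, k, approx, abs(approx - exact))  # error shrinks with h and k
```

Note that $h$ and $k$ cannot be taken too small in floating point, since rounding error grows like $1/(hk)$.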
Now, for the example reported in the link you gave, this formula does not give a finite result as $h,k$ go to $0$ independently.
To summarize, I would say that $f''(x)$ exists and equals $L$ if
$$\forall \varepsilon>0\ \exists \delta>0 : \forall \underline{h}=(h_1,h_2)\in \mathcal{B}(\underline{0},\delta)\ \text{with}\ h_1h_2\neq0,$$
$$\left|\frac{f(x+h_1+h_2)-f(x+h_1)-f(x+h_2)+f(x)}{h_1h_2}-L\right|<\varepsilon.$$
I'm not $100\%$ sure of this statement (in particular of the fact that the two increments have to be independent), but it looks right to me. You certainly need non-centered schemes, though.
To understand this, you need to think of the intuition behind the $\epsilon$-$\delta$ definition. We say that $\lim_{x\to a}f(x)=L$ if we can make $f(x)$ as close to $L$ as we like by making $x$ sufficiently close to $a$. Worded differently, we might say that:
$\lim_{x\to a}f(x)=L$ if, given any neighborhood $U$ of $L$, there is a neighborhood $V$ of $a$ such that every element of $V$ (except possibly $a$ itself) is mapped by $f$ into $U$.
In this context, a "neighborhood" of a point $p$ should be understood to mean "points sufficiently close to $p$". Let's make that precise by defining what we mean by "close". For $\epsilon>0$ (assumed, but not required, to be very small) define
$$B(x,\epsilon):=\{y\,:\,|x-y|<\epsilon\},$$
the ball of radius $\epsilon$ about $x$. For our purposes, we say $U$ is a neighborhood of $x$ if $U=B(x,\epsilon)$ for some $\epsilon>0$. (The usual definition only requires that $U$ contains such a ball.) Assuming $\epsilon>0$ is very small, this agrees with our intuition of what closeness should mean. Now if we go back to our neighborhood "definition" of a limit, you should be able to think about it for a bit and convince yourself that it is equivalent to the usual definition.
How does this relate to the problem with infinity? Given that infinity is not a real number (and things like distance from infinity do not make sense), we must revise what it means to be "close" to infinity. So for $M>0$ (assumed this time to be very large) define
$$B(+\infty,M):=\{y\,:\,y>M\},\quad B(-\infty,M):=\{y\,:\,y<-M\},$$
the neighborhoods of $\pm\infty$. Hopefully you can see why these make sense as definitions; a number should be close to infinity if it is very large (with the correct sign), so a neighborhood of infinity should contain all sufficiently large numbers.
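The unified notion of "neighborhood" can be sketched as a single membership test; this is an illustrative snippet (the function name `in_neighborhood` is my own), where the second argument is the center and the third is either the radius $\epsilon$ or the cutoff $M$:

```python
import math

def in_neighborhood(y, center, radius):
    """Membership test for B(center, radius).
    Finite center: |center - y| < radius (the usual ball).
    center = +/-inf: radius plays the role of the cutoff M."""
    if center == math.inf:
        return y > radius            # B(+inf, M) = {y : y > M}
    if center == -math.inf:
        return y < -radius           # B(-inf, M) = {y : y < -M}
    return abs(center - y) < radius  # B(x, eps) = {y : |x - y| < eps}

print(in_neighborhood(3.001, 3.0, 0.01))      # True: within eps of 3
print(in_neighborhood(1e7, math.inf, 1e6))    # True: larger than M = 1e6
print(in_neighborhood(-1e7, -math.inf, 1e6))  # True: below -M
```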
Now we extend our neighborhood definition of limits to include the case where $a$ or $L$ can be $\pm\infty$. It is an exercise similar to the one before to verify that this definition agrees with the usual ones; the payoff is that the finite and infinite cases are now unified under a single notion of neighborhood.
Best Answer
You do know the definition of a limit, right? So just apply it. We say that the derivative of $f$ at $x$ is $L$ if $$\forall \epsilon > 0 ~\exists \delta > 0 \;\forall \Delta x: \\ 0<|\Delta x| < \delta \implies \left|\frac{f(x + \Delta x) - f(x)}{\Delta x}-L\right| < \epsilon.$$ Also, the derivative as a function, $f^\prime(x)$, is simply a function which takes a point in and spits out the derivative of $f$ at that point. So you can apply the definition of the derivative at each point $c$ and collect all those values into a function $f^\prime$.
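The quantifier string can be spot-checked numerically. This is a sketch only (the name `derivative_limit_holds` is mine), and a finite sample can at best refute, never prove, the $\epsilon$-$\delta$ condition:

```python
def derivative_limit_holds(f, x, L, eps, delta, samples=1000):
    """For sampled 0 < |dx| < delta, check that
    |(f(x+dx) - f(x))/dx - L| < eps at every sample point."""
    for i in range(1, samples + 1):
        dx = delta * i / (samples + 1)   # strictly inside (0, delta)
        for d in (dx, -dx):              # test both signs of the increment
            if abs((f(x + d) - f(x)) / d - L) >= eps:
                return False
    return True

# f(x) = x^2 at x = 3: the quotient is 6 + dx, so delta = eps suffices.
print(derivative_limit_holds(lambda t: t * t, 3.0, 6.0, eps=1e-3, delta=1e-3))  # True
print(derivative_limit_holds(lambda t: t * t, 3.0, 5.0, eps=1e-3, delta=1e-3))  # False: 5 is not the derivative
```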