(This used to be a comment, but I think it deserves to be an answer, after mulling over it a bit.)
I don't think your criteria are quite correct. Some counterexamples:
Let $f(x) = -x^2$ and $x(0) = -1$. The ODE $x' = f(x)$ then blows up in finite time toward $-\infty$ (the solution is $x(t) = 1/(t-1)$). But $\int_{-1}^\infty dx / f(x)$ diverges, because of the non-integrable singularity of $1/f$ at $x = 0$.
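As a sketch, this counterexample can be checked numerically in plain Python (the closed form $x(t) = 1/(t-1)$ and the helper names are my own):

```python
# Sanity check of the counterexample x' = -x^2, x(0) = -1.
# The exact solution is x(t) = 1/(t - 1), which blows up to -inf as t -> 1-.

def x(t):
    return 1.0 / (t - 1.0)

def f(x):
    return -x * x

# x(0) = -1 and x'(t) = f(x(t)): compare a central difference to f(x(t)).
eps = 1e-6
for t in [0.0, 0.5, 0.9]:
    numeric_deriv = (x(t + eps) - x(t - eps)) / (2 * eps)
    assert abs(numeric_deriv - f(x(t))) < 1e-4

# Blow-up toward -infinity in finite time:
assert x(0.999) < -999

# Meanwhile 1/f(x) = -1/x^2 is not integrable across x = 0:
# the partial integrals from -1 toward 0 grow without bound
# (an antiderivative of 1/f = -1/x^2 is 1/x).
def partial_integral(upper):  # integral_{-1}^{upper} dx / f(x), upper < 0
    return 1.0 / upper - 1.0 / (-1.0)

assert partial_integral(-1e-6) < -1e5   # diverges to -infinity
```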
Similarly, for any $f(x)\geq 0$ with $f(0) = 0$ and any initial value $x(0) < 0$, the solution satisfies $x(0) \leq x(t) \leq 0$ for all $t > 0$: it is nondecreasing and can be continued by the constant solution $0$ once it reaches the stationary point. Hence we have global existence (no blow-up). But if we take $f(x) = \sqrt{|x|}$ for $|x| \leq 1$ and $f(x) = x^2$ for $|x| > 1$, then $1/f(x)$ is integrable, and in particular $\int_{x_0}^\infty dx/f(x) < \infty$ for any $x_0$.
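For this concrete $f$, a global solution can be written down explicitly and the finiteness of the integral checked numerically (a sketch; `x_sol` and the crude midpoint quadrature are my own choices):

```python
import math

# f from the example: f(x) = sqrt(|x|) for |x| <= 1, f(x) = x^2 for |x| > 1.
def f(x):
    return math.sqrt(abs(x)) if abs(x) <= 1 else x * x

# One global solution with x(0) = -1: x(t) = -(1 - t/2)^2 reaches 0 at t = 2,
# then continues as the constant solution 0.  It stays trapped in [-1, 0].
def x_sol(t):
    return -(1 - t / 2) ** 2 if t < 2 else 0.0

eps = 1e-6
for t in [0.0, 0.5, 1.0, 1.9]:
    d = (x_sol(t + eps) - x_sol(t - eps)) / (2 * eps)
    assert abs(d - f(x_sol(t))) < 1e-4   # x' = f(x) holds
    assert -1 <= x_sol(t) <= 0           # trapped: no blow-up

# Yet the integral test is passed: int_{-1}^1 dx/sqrt(|x|) = 4 and
# int_1^inf dx/x^2 = 1, so int_{-1}^inf dx/f(x) = 5 < inf.
N = 200000
total = 0.0
for k in range(N):
    xm = -1 + 2 * (k + 0.5) / N          # midpoints on [-1, 1], avoiding 0
    total += (2 / N) / f(xm)
total += 1.0                             # int_1^inf dx/x^2 = 1 exactly
assert abs(total - 5.0) < 0.05
```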
One key point in your examples $f(x) = x^2$ and $f(x) = \sqrt{x}$ with positive initial data is that there are no stationary points: for any positive initial datum the solution eventually tends toward $x\to \infty$, so the question of blow-up reduces to how fast that happens. For that question, the integral test is the right one.
Another way of saying this is that you wanted to use the equality
$$ \int_0^s dt = \int_{x(0)}^{x(s)} \frac{dx}{f(x)} $$
but you falsely presupposed that the end state is necessarily $x(s) = \infty$.
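When the end state really is $x(s)\to\infty$, the identity does compute the blow-up time. A sketch for $f(x) = x^2$, $x(0) = 1$ (exact solution and names are my own):

```python
# For f(x) = x^2 with x(0) = 1 the solution does escape to infinity,
# and the identity s = int_{x(0)}^{x(s)} dx / f(x) gives blow-up time
# T = int_1^inf dx/x^2 = 1.
def x(t):
    return 1.0 / (1.0 - t)   # exact solution of x' = x^2, x(0) = 1

# Closed form of the dx-integral: int_1^{x(s)} dx/x^2 = 1 - 1/x(s).
for s in [0.1, 0.5, 0.9]:
    integral = 1.0 - 1.0 / x(s)
    assert abs(integral - s) < 1e-12   # the identity holds

assert x(0.999) > 999   # blow-up as s -> T = 1
```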
This is a partial solution. Assume that $f$ is analytic on $I\times\mathbb{R}^n$, so that solutions are analytic as well, and that the space $E$ of solutions has the same dimension $n$ as the vectors $y$ and $f$.
We represent solutions $y$ and the function $f$ in the right hand side as column vectors, as usual.
Let $y_1,\ldots,y_n$ be a basis in $E$. Then every
vector $y\in E$ can be written as a linear combination
$$y=c_1y_1+\ldots+c_ny_n.\quad\quad\quad\quad\quad\quad\quad\quad (1)$$
Solving (1) by Cramer's rule, we obtain $c_k$ as a ratio of two determinants formed from the coordinates of $y$ and of the $y_j$. Since the coordinates of $y$ occupy one column of the numerator, this ratio is a linear function of the coordinates of $y$, so we can write
$$c_j=a_j(t)y,\quad\quad\quad\quad\quad\quad\quad\quad\quad\quad\quad\quad (2)$$
for some row vector $a_j$ which depends on $t$.
Now, since all elements of $E$ satisfy your non-linear equation
we must have for every $y\in E$ of the form (1):
$$y'=\sum_{j=1}^n c_jy_j^\prime=\sum_{j=1}^nc_jf(t,y_j).$$
Substituting our formula (2) for $c_j$, we obtain
$$y'=A(t)y,\quad\mbox{where}\quad A(t)=\sum_jf(t,y_j(t))a_j(t)$$
(column vectors $f(t,y_j(t))$ times row vectors $a_j$ give you
$n\times n$ matrices.)
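As a sketch, the construction can be tested on a case where the answer is known in advance, namely a system that is already linear (the concrete $A_0$, basis solutions, and helper names below are my own choices):

```python
import math

# Test system: y' = A0 y with A0 = [[0, 1], [-1, 0]], whose basis solutions
# are y1(t) = (cos t, -sin t) and y2(t) = (sin t, cos t).
A0 = [[0.0, 1.0], [-1.0, 0.0]]

def mat_vec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

t = 0.7
y1 = [math.cos(t), -math.sin(t)]
y2 = [math.sin(t), math.cos(t)]

# Right-hand sides f(t, y_j) = A0 y_j (column vectors):
f1, f2 = mat_vec(A0, y1), mat_vec(A0, y2)

# Cramer's rule: the coefficients in y = c1 y1 + c2 y2 are c_j = a_j y,
# where the rows a_j(t) form the inverse of the matrix Y(t) = [y1 | y2].
det = y1[0] * y2[1] - y2[0] * y1[1]
a1 = [ y2[1] / det, -y2[0] / det]   # row a_1(t)
a2 = [-y1[1] / det,  y1[0] / det]   # row a_2(t)

# A(t) = f1 a1 + f2 a2: outer products (column times row) give 2x2 matrices.
A = [[f1[i] * a1[j] + f2[i] * a2[j] for j in range(2)] for i in range(2)]

# The construction recovers A0, as the argument predicts:
assert all(abs(A[i][j] - A0[i][j]) < 1e-12 for i in range(2) for j in range(2))
```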
There is a small problem with the denominator in Cramer's rule: it may vanish at some points. But if all the functions $f,y_j$ are analytic, this can happen only at isolated
points. Away from these isolated points $t_k$ both equations
$y'=f(t,y)$ and $y'=A(t)y$ have the same set of solutions, therefore
$f(t,y)=A(t)y$ on some open set of $(t,y)$, and since all functions
are analytic they must coincide.
No. Consider $x'=g(x,y)$, $y'=h(x,y)$. Take $g(x,y)=2|x|^{1/2}$ on the parabola $y=x^2$, and similarly $h(x,y)=4|x|^{3/2}$ there. Then one can check directly that $x=t^2$, $y=t^4$, $t\ge 0$, and $x=y=0$ are both solutions with initial value $(0,0)$, independently of how we define $g,h$ off the parabola.
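The direct check is mechanical; here it is as a sketch in plain Python (the function names are mine, and only the values on the parabola are used):

```python
# On the parabola y = x^2: g(x, x^2) = 2|x|^{1/2} and h(x, x^2) = 4|x|^{3/2}.
def g_on_parabola(x):
    return 2 * abs(x) ** 0.5

def h_on_parabola(x):
    return 4 * abs(x) ** 1.5

# The curve (x, y) = (t^2, t^4), t >= 0, satisfies x' = g, y' = h:
eps = 1e-6
for t in [0.3, 1.0, 2.5]:
    x = t ** 2
    dx = ((t + eps) ** 2 - (t - eps) ** 2) / (2 * eps)   # = 2t
    dy = ((t + eps) ** 4 - (t - eps) ** 4) / (2 * eps)   # = 4t^3
    assert abs(dx - g_on_parabola(x)) < 1e-6
    assert abs(dy - h_on_parabola(x)) < 1e-4

# And g(0,0) = h(0,0) = 0, so x = y = 0 is a second solution from (0,0):
assert g_on_parabola(0.0) == 0.0 and h_on_parabola(0.0) == 0.0
```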
To define $g,h$ everywhere, we interpolate linearly, starting from the value on the parabola, by letting $g(ta,ta^2)=tg(a,a^2)$, $a\in\mathbb R$, $-1\le t\le 1$, and we also set $g(x,0)=g(0,y)=0$.
So far, $g$ has been defined on the closed set $\{ |y|\ge x^2\}\cup \{ (x,0)\}$, and it is continuous there. To verify continuity on the $y$ axis, note that if $(ta,ta^2)\to (0,b)$ with $b\not= 0$, then $|a|\to\infty$, so $tg(a,a^2)=2t|a|^{1/2}\to 0$ (and also $t|a|^{3/2}\to 0$, which we need in the corresponding step for $h$).
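The limit in that continuity check can also be seen numerically (a sketch; the sample values of $a$ and $b$ are arbitrary choices of mine):

```python
# Approach (0, b) along points (ta, ta^2) with ta^2 = b held fixed,
# so t = b/a^2 and the point is (b/a, b).  Then the interpolated value is
# g(ta, ta^2) = t * 2|a|^{1/2} = 2b/|a|^{3/2}, which -> 0 as |a| -> infinity.
b = 3.0
values = []
for a in [10.0, 100.0, 1000.0]:
    t = b / a ** 2
    g_val = t * 2 * abs(a) ** 0.5    # the interpolated value t * g(a, a^2)
    h_scale = t * abs(a) ** 1.5      # the quantity t|a|^{3/2} needed for h
    values.append((g_val, h_scale))

# Both quantities shrink toward 0, matching the claimed continuity at (0, b):
assert values[-1][0] < 0.01 and values[-1][1] < 0.2
```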
The existence of all directional derivatives at the origin is already guaranteed, because the current domain contains a line segment in every direction through the origin and $g$ is linear on each such segment. Finally, we extend $g$ to $\{ 0<|y|<x^2\}$; we can make $g$ smooth away from the origin.