Let us begin with a lemma: if $g \in L^1([0,1])$ and $t \in [0,1]$ then, for every $n \ge 1$,
$$\int _0 ^t \int _0 ^{s_1} \dots \int _0 ^{s_{n-1}} g(s_1) \ g(s_2) \dots g(s_n) \ \mathrm d s_n \dots \mathrm d s_2 \ \mathrm d s_1 = \frac 1 {n!} \left( \int _0 ^t g(s) \ \mathrm d s \right) ^n \ .$$
The proof is a standard symmetrization argument; its core idea is that the cube $[0,t]^n$ (which is the domain of integration of the right-hand side) is the union of the $n!$ simplices obtained by permuting the coordinates of $\{(s_1, \dots, s_n) \in \mathbb R^n \mid 0 \le s_n \le \dots \le s_1 \le t\}$ (which is the domain of integration of the left-hand side), and these simplices overlap only on their boundaries, which have measure zero. Furthermore, the factors in $g(s_1) \dots g(s_n)$ all commute, so each simplex contributes the same amount to the integral over the cube.
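A quick numerical sanity check of the lemma for $n=2$ can be done with a Riemann sum; the choices $g(s) = e^s$, $t = 1$ and the grid size are arbitrary illustrations, not part of the proof:

```python
import math

# Sanity check of the lemma for n = 2 with g(s) = exp(s) on [0, t].
# Left side: iterated integral over the simplex 0 <= s2 <= s1 <= t.
# Right side: (1/2!) * (integral of g over [0, t])**2.

def g(s):
    return math.exp(s)

t = 1.0
N = 2000
h = t / N
mid = [(k + 0.5) * h for k in range(N)]  # midpoints of the grid cells

# Simplex (left-hand) side: sum over pairs with s2 < s1.
lhs = sum(g(s1) * g(s2) * h * h
          for i, s1 in enumerate(mid)
          for s2 in mid[:i])

# Cube (right-hand) side: square the single integral, divide by 2!.
single = sum(g(s) * h for s in mid)
rhs = single ** 2 / math.factorial(2)

print(abs(lhs - rhs))  # small discretization error
```

The two sides agree up to the discretization error of the grid, as the lemma predicts.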
Now, on to the main result: we want to prove that if $y(0) = 0$ and $y$ is absolutely continuous, then $y=0$ on $[0,1]$. To this end, notice that integrating the differential equation $\dot y = g\, y$ gives $y(t) = \int _0 ^t g(s_1)\, y(s_1) \ \mathrm d s_1$. Replacing the inner $y(s_1)$ by its corresponding integral, one gets $y(t) = \int _0 ^t \int _0 ^{s_1} g(s_1) \ g(s_2)\, y(s_2) \ \mathrm d s_2 \ \mathrm d s_1$, and inductively
$$y(t) = \int _0 ^t \int _0 ^{s_1} \dots \int _0 ^{s_{n-1}} g(s_1) \ g(s_2) \dots g(s_n) \ y(s_n) \ \mathrm d s_n \dots \mathrm d s_2 \ \mathrm d s_1 \ .$$
Making now repeated use of the usual inequality $\left| \int _a ^b f \right| \le \int _a ^b |f|$, we bring the absolute value inside all the integrals, getting
$$|y(t)| \le \int _0 ^t \int _0 ^{s_1} \dots \int _0 ^{s_{n-1}} | g(s_1) \ g(s_2) \dots g(s_n) \ y(s_n) | \ \mathrm d s_n \dots \mathrm d s_2 \ \mathrm d s_1 \le \dots$$
Since $y$ is continuous on the compact interval $[0,1]$, $|y|$ has a finite upper bound $M$ there, which allows us to continue the above (also using the lemma):
$$\dots \le M \int _0 ^t \int _0 ^{s_1} \dots \int _0 ^{s_{n-1}} | g(s_1) \ g(s_2) \dots g(s_n) | \ \mathrm d s_n \dots \mathrm d s_2 \ \mathrm d s_1 = \frac M {n!} \left( \int _0 ^t |g(s)| \ \mathrm d s \right) ^n \ .$$
To conclude, if $I = \int _0 ^1 |g(s)| \ \mathrm d s$, we may deduce the inequality
$$|y(t)| \le M \frac {I^n} {n!} \quad \forall n \ge 1 \ .$$
But this is great: the right-hand side converges to $0$ as $n \to \infty$, so $y(t) = 0$ for all $t \in [0,1]$, which proves uniqueness!
Just in case you do not know why $\lim _n \frac {I^n} {n!} = 0$ when $I \ge 0$, notice that if $x_n = \frac {I^n} {n!}$ then $x_{n+1} = \frac {I} {n+1}\, x_n$, which shows that the sequence $(x_n)$ is decreasing for $n \ge I-1$ (since then $\frac{I}{n+1} \le 1$). Since $x_n \ge 0$ for all $n$, we deduce that $(x_n)$ has a finite limit $L \ge 0$. Passing to the limit in the recurrence relation gives $L = 0 \cdot L$, whence $L=0$.
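If you prefer to see it numerically, here is a tiny script; the value $I = 10$ is an arbitrary illustration:

```python
import math

# x_n = I^n / n! decreases as soon as n + 1 > I, since x_{n+1} = I/(n+1) * x_n.
I = 10.0
x = [I**n / math.factorial(n) for n in range(60)]

# Decreasing past n = I - 1, exactly as the ratio argument predicts.
assert all(x[n + 1] < x[n] for n in range(10, 59))
print(x[59])  # vanishingly small
```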
Note first that a first-order ODE takes only one initial condition. Prescribing two initial values instead yields a condition on the coefficients of the right-hand side, for instance $$c=\dot u(0)^2-au(0)^2-bu(0).$$
By your equation, the points $(u, \dot u)$ lie on a quadratic curve: an ellipse for $a<0$ or a hyperbola for $a>0$. In either case you can achieve a "normal form" with $b=0$ by a constant shift of $u$: setting $\tilde u=u+\frac{b}{2a}$ gives the equation $$\dot{\tilde u}^2=a\tilde u^2+c-\frac{b^2}{4a}.$$
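The shift is just completing the square (note $\dot{\tilde u} = \dot u$, since the shift is constant):
$$a\tilde u^2 + c - \frac{b^2}{4a} = a\left(u + \frac{b}{2a}\right)^2 + c - \frac{b^2}{4a} = au^2 + bu + \frac{b^2}{4a} + c - \frac{b^2}{4a} = au^2 + bu + c \ .$$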
Renaming the variables, we can consider the reduced or normalized case
$$
\dot u^2=au^2+c
$$
Depending on the signs of the coefficients, a different parametrization of the quadratic curve must be chosen. This then results in a simplified ODE.
Case $a=-w^2<0$. Then necessarily $c=r^2>0$ (for $c=0$ only $u\equiv 0$ solves the equation, and for $c<0$ there is no real solution). Parametrize $$\dot u^2+(wu)^2=r^2$$ with trigonometric functions, $wu(t)=r\sin v(t)$, $\dot u(t)=r\cos v(t)$. This parametrization has to be compatible with the differential relation, thus for $\cos v(t)\ne 0$ $$wr\cos v=w\dot u=r\cos(v)\dot v\implies \dot v=w.$$ So you get a solution $u(t)=\frac rw\sin(wt+\phi)$. At points where $\cos v(t)=0$ the solution can branch, for instance stay stationary for some time.
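A throwaway spot check of this case, with arbitrarily chosen parameter values:

```python
import math

# Check that u(t) = (r/w) sin(w t + phi) satisfies u'^2 = a u^2 + c
# with a = -w^2 and c = r^2; w, r, phi are arbitrary sample values.
w, r, phi = 1.7, 2.3, 0.4
a, c = -w**2, r**2

for t in [0.0, 0.5, 1.0, 2.5]:
    u = (r / w) * math.sin(w * t + phi)
    du = r * math.cos(w * t + phi)        # closed-form derivative of u
    residual = du**2 - (a * u**2 + c)
    assert abs(residual) < 1e-12
print("trigonometric case checks out")
```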
Case $a=w^2>0$, $c=r^2>0$. This repeats the above method, only with hyperbolic instead of trigonometric functions: parametrize $\dot u^2-(wu)^2=r^2$ as $wu=r\sinh v$, $\dot u=r\cosh v$, which again forces $\dot v=w$, so $u(t)=\frac{r}{w}\sinh(wt+\phi)$.
Case $a=w^2>0$, $c=-r^2<0$. Now $r^2=w^2u^2-\dot u^2$ has to be parametrized as $wu=r\cosh v$, $\dot u=r\sinh v$, leading to branching points whenever $v=0$; else $$wr\sinh v=w\dot u=r\sinh(v)\dot v\implies \dot v=w,$$ so that one concludes again $u(t)=\frac{r}{w}\cosh(wt+\phi)$.
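The same spot check for the two hyperbolic cases, again with arbitrary parameter values:

```python
import math

# Verify the hyperbolic solutions against their respective equations.
w, r, phi = 1.3, 0.8, -0.2

for t in [0.0, 0.7, 1.5]:
    # Case a = w^2 > 0, c = r^2 > 0: u = (r/w) sinh(wt + phi).
    u, du = (r / w) * math.sinh(w * t + phi), r * math.cosh(w * t + phi)
    assert abs(du**2 - (w**2 * u**2 + r**2)) < 1e-12
    # Case a = w^2 > 0, c = -r^2 < 0: u = (r/w) cosh(wt + phi).
    u, du = (r / w) * math.cosh(w * t + phi), r * math.sinh(w * t + phi)
    assert abs(du**2 - (w**2 * u**2 - r**2)) < 1e-12
print("hyperbolic cases check out")
```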
Here is a verification that the function satisfies the DE. Uniqueness depends on the function $f$ being "nice enough" (e.g. Lipschitz continuous, as in the Picard–Lindelöf theorem).
When you differentiate the integral, you have to treat it like a product, since the variable you are differentiating with respect to ($t$) shows up both in the integrand and in the limits of integration.
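For reference, the general Leibniz integral rule (valid for sufficiently smooth $f$, $a$, $b$) makes both contributions explicit:
$$\frac{\mathrm d}{\mathrm d t} \int _{a(t)} ^{b(t)} f(t,s) \ \mathrm d s = f\big(t, b(t)\big)\, b'(t) - f\big(t, a(t)\big)\, a'(t) + \int _{a(t)} ^{b(t)} \frac{\partial f}{\partial t}(t,s) \ \mathrm d s \ .$$
The first two terms come from the moving limits, the last one from the $t$-dependence of the integrand.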