$\newcommand{\CC}{{\mathbb{C}}}\newcommand{\RR}{{\mathbb{R}}}\newcommand{\ra}{\rightarrow}\newcommand{\ds}{\displaystyle}$The implication is not true without additional conditions upon the functions.
It is not even true for $f,h$ independent of a function $u$. Therefore the problem is not
(only) the compactness in the space of bounded functions.
Let me first recall a counterexample of stability theory that impressed me very much as a student.
Consider $r : \RR \times [0,1] \ra \RR$ given by
$$r(t,a) = \frac{1+2a t^4}{1+t^2+a^3 t^6} \ \ \mbox{ for }t \in\RR, a \in [0,1].$$
Then $r$ is a positive function of class ${\mathcal C}^\infty$ and we have
$\ds\lim_{t \ra +\infty} r(t,a) = 0\mbox{ for }a \in [0,1].$
The convergence is not uniform with respect to $a \in [0,1]$, however, because
$\ds\lim_{a \ra 0} a^{1/2} r(a^{-3/4},a) = 1.$
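A quick numerical sanity check of these two limits (a sketch; the sample values of $t$ and $a$ below are arbitrary choices):

```python
# Numerical sanity check of the two limits above (the sample values of
# t and a are arbitrary choices for illustration):

def r(t, a):
    return (1 + 2 * a * t**4) / (1 + t**2 + a**3 * t**6)

# Pointwise decay in t for fixed a:
for a in (0.0, 0.1, 1.0):
    print(a, r(1e6, a))                  # all close to 0

# Non-uniformity in a: along t = a^(-3/4) the scaled value tends to 1.
for a in (1e-2, 1e-4, 1e-6):
    print(a, a**0.5 * r(a**-0.75, a))    # approaches 1
```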
Now define $s : \RR^3 \ra \RR^2$ by
$$s(t,x) = \left(\frac{1}{r} \frac{\partial r}{\partial t}\right)
\left(t,\frac{x_2^2}{x_1^2 + x_2^2}\right) \cdot x \quad\mbox{if } x = (x_1, x_2) \neq 0,$$
whereas $s(t,(0,0))=0$.
$s$ is continuous and satisfies a local Lipschitz condition with respect to $x$
(in a neighborhood of $x=(0,0)$ this is a bit tricky to verify; for the uniqueness of solutions to the initial value problems
it is not needed anyway). Then the initial value problem
$$z' = s(t,z),\ \ z(t_0) = b\mbox{ with }t_0\in\RR, b \in \RR^2$$
has a unique solution and for $b= (b_1,b_2)\neq 0$ this solution is given by
$$z(t) = \frac{r(t,a_0)}{r(t_0, a_0)}\, b \ \ {\rm with} \ \
a_0 = \frac{b_2^2}{b_1^2 + b_2^2}.$$
Therefore all solutions of the differential equation tend to $0$ as $t\to+\infty$, but given $\varepsilon,M>0$, $t_0\in\RR$
there exists $b\in\RR^2$, $|b|<\varepsilon$ such that
the solution of $z' = s(t,z),\ \ z(t_0) = b$ satisfies $||z||_\infty>M$. It is sufficient to choose
first $a_0>0$ small enough such that $r(a_0^{-3/4},a_0)>2M\,r(t_0,a_0)/\varepsilon$, then $B$ with $|B|=1$ such that
$a_0=\frac{B_2^2}{B_1^2 + B_2^2}$, and finally $b=\frac\varepsilon2 B$.
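This choice of $a_0$ and $b$ can be illustrated numerically (a sketch; $\varepsilon$, $M$, $t_0$ are arbitrary sample values, and the peak of the explicit solution $z(t)=r(t,a_0)b/r(t_0,a_0)$ is evaluated at $t=a_0^{-3/4}$):

```python
# Numerical illustration of the choice of a0 and b (eps, M, t0 are
# arbitrary sample values; the peak of the explicit solution
# z(t) = r(t, a0) / r(t0, a0) * b is evaluated at t = a0^(-3/4)):

def r(t, a):
    return (1 + 2 * a * t**4) / (1 + t**2 + a**3 * t**6)

eps, M, t0 = 0.1, 100.0, 0.0

a0 = 1.0
while r(a0 ** -0.75, a0) <= 2 * M * r(t0, a0) / eps:
    a0 /= 10.0   # shrink a0 until the sufficient condition holds

peak = r(a0 ** -0.75, a0) / r(t0, a0) * eps / 2   # |z| at t = a0^(-3/4)
print(a0, peak)   # peak exceeds M although |b| = eps/2 < eps
```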
Now we adapt this example to the given $x,y$-system; more precisely, to one independent of $u$.
With the function $s$ defined above, we put essentially $t=x_3$:
$$f:\RR^3\to\RR^3,\ f(x_1,x_2,x_3)=(s(x_3,(x_1,x_2)),1)\mbox{ and }h:\RR^3\to\RR,\ h(x_1,x_2,x_3)=x_1^2+x_2^2.$$
The third component of the solution of $\dot x=f(x)$, $x(0)=x_0=(x_{01},x_{02},x_{03})$ is then $x_3(t)=x_{03}+t$, and $z(u)=(x_1(u-x_{03}),x_2(u-x_{03}))$
satisfies
$$\frac{dz}{du}=s(u,z),\ \ z(x_{03})=(x_{01},x_{02}).$$
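The chain-rule computation behind this substitution, written out (with $u=x_3(t)=x_{03}+t$):

```latex
\frac{dz}{du}(u)
  = \bigl(\dot x_1,\dot x_2\bigr)(u-x_{03})
  = s\bigl(x_3(u-x_{03}),\,(x_1,x_2)(u-x_{03})\bigr)
  = s(u,z(u)),
\qquad\text{since } x_3(u-x_{03}) = x_{03} + (u-x_{03}) = u.
```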
As seen above, for all solutions $x(t)$, the output $y(t)=x_1^2(t)+x_2^2(t)=||z(t+x_{03})||_2^2$ is bounded, but given $\varepsilon,M>0$, $x_{03}\in\RR$
there exists $(x_{01},x_{02})\in\RR^2$, of norm smaller than $\varepsilon$ such that
the solution of $\dot x=f(x),\ x(0)=x_0, y=h(x)$ satisfies $||y||_\infty>M$.
Therefore a function $g$ as desired in the question cannot exist.
Of course, the function $f$ in this example is not very smooth, but it seems to me that smoother examples could also be constructed.
The computation of the equilibrium points is not correct. Since you do not give any assumptions on the parameters, let us suppose that $\alpha,\beta,\gamma$ are nonzero. You need to solve simultaneously
\begin{equation}
\alpha x-\beta xy=0; \qquad \beta xy-\gamma y=0.
\end{equation}
The point $(0,0)$ is clearly an equilibrium point. Observe that $x=0$ implies $\gamma y=0$ and that $y=0$ implies $\alpha x=0$. Thus, your last two equilibrium points are not correct.
Now, suppose $(x,y)$ is not the origin. Then $(x,y)=(\gamma/\beta,\alpha/\beta)$ is the other equilibrium point.
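A quick substitution check of both equilibria (a sketch; the parameter values $\alpha=1$, $\beta=2$, $\gamma=3$ are assumptions chosen only for illustration):

```python
# Substituting both equilibrium points into the right-hand side of the
# system; sample parameter values alpha=1, beta=2, gamma=3 are assumed.

def rhs(x, y, alpha=1.0, beta=2.0, gamma=3.0):
    return (alpha * x - beta * x * y, beta * x * y - gamma * y)

print(rhs(0.0, 0.0))              # the origin: (0.0, 0.0)
print(rhs(3.0 / 2.0, 1.0 / 2.0))  # (gamma/beta, alpha/beta): (0.0, 0.0)
```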
The Jacobian is the matrix
\begin{equation}
J=\begin{bmatrix}\alpha-\beta y & -\beta x\\
\beta y & \beta x-\gamma
\end{bmatrix},
\end{equation}
then
\begin{equation}
J|_{(0,0)}=\begin{bmatrix}\alpha& 0\\
0 & -\gamma
\end{bmatrix}; \qquad J|_{(\gamma/\beta,\alpha/\beta)}=\begin{bmatrix}0 & -\gamma\\
\alpha & 0
\end{bmatrix}
\end{equation}
You could construct the following table, depending on the eigenvalues of $J$ at each point. Let $p_1=(0,0)$, $p_2=(\gamma/\beta,\alpha/\beta)$
\begin{array}{|c|c|c|c|c|}
\hline
\alpha & \beta & \gamma & p_1 & p_2 \\ \hline
+ & + & + & \text{saddle} & \text{center} \\ \hline
+ & + & - & \text{source} & \text{saddle} \\ \hline
+ & - & + & \text{saddle} & \text{center} \\ \hline
+ & - & - & \text{source} & \text{saddle} \\ \hline
- & + & + & \text{sink} & \text{saddle} \\ \hline
- & + & - & \text{saddle} & \text{center} \\ \hline
- & - & + & \text{sink} & \text{saddle} \\ \hline
- & - & - & \text{saddle} & \text{center} \\ \hline
\end{array}
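The classification in the first row can be checked numerically from the Jacobians computed above (a sketch; the positive parameter values are arbitrary samples):

```python
# Numerical check of the table row with alpha, beta, gamma all positive;
# the specific values are assumptions chosen only for illustration.
import numpy as np

alpha, gamma = 1.0, 3.0   # beta does not enter either evaluated Jacobian

J_p1 = np.array([[alpha, 0.0], [0.0, -gamma]])   # J at p1 = (0, 0)
J_p2 = np.array([[0.0, -gamma], [alpha, 0.0]])   # J at p2 = (gamma/beta, alpha/beta)

ev1 = np.linalg.eigvals(J_p1)   # real, opposite signs -> saddle
ev2 = np.linalg.eigvals(J_p2)   # purely imaginary pair -> center
print(ev1, ev2)
```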
And then, for example, sketch the corresponding phase portraits (at least for the first two cases; you could try to do the rest).
Consider the Lorenz equations:
$$x' = f(x,y,z) = \sigma(y-x)$$
$$y' = g(x,y,z) = \rho x - y -xz$$
$$z' = h(x,y,z)= -\beta z + xy$$
where $\sigma$ and $\beta$ are viewed as fixed positive constants and $\rho$ is a parameter.
Here is a guide to work through, given that the Wikipedia article describes what happens at each of the critical points. The outline of how to derive those results is as follows:
Clearly, a critical point is given by $(x, y, z) = (0,0,0)$.
The Jacobian matrix (using partial derivatives) is given by:
$$\tag 2 \displaystyle J = \begin{bmatrix} \frac{\partial f}{\partial x} & \frac{\partial f}{\partial y}& \frac{\partial f}{\partial z} \\\frac{\partial g}{\partial x} & \frac{\partial g}{\partial y}& \frac{\partial g}{\partial z}\\ \frac{\partial h}{\partial x} & \frac{\partial h}{\partial y}& \frac{\partial h}{\partial z}\end{bmatrix} = \begin{bmatrix}-\sigma & \sigma & 0\\\rho - z & -1 & -x\\ y & x & -\beta\end{bmatrix}$$
$$\displaystyle J_{0,0,0} = \begin{bmatrix}-\sigma & \sigma & 0\\\rho & -1 & 0\\ 0 & 0 & -\beta\end{bmatrix}$$
To find the eigenvalues, we set up and solve the characteristic polynomial:
$$|J_{0,0,0} - \lambda I| = -\lambda^3-(\beta+\sigma+1)\lambda^2-\bigl(\beta(\sigma+1)+\sigma(1-\rho)\bigr)\lambda-\beta\sigma(1-\rho) = -(\lambda+\beta)\bigl(\lambda^2+(\sigma+1)\lambda+\sigma(1-\rho)\bigr) = 0$$
This leads to the three eigenvalues (note, $\lambda_1 = -\beta$ can be read off directly from the block structure of the matrix above) and their corresponding eigenvectors:
$\displaystyle \lambda_1 = -\beta, v_1 = (0, 0, 1)$
$\displaystyle \lambda_2 = \frac{1}{2} \left(-\sqrt{4 \rho \sigma+\sigma^2-2 \sigma+1}-\sigma-1\right), \quad v_2 = \left(-\frac{-1+\sigma+\sqrt{1-2 \sigma+4 \rho \sigma+\sigma^2}}{2 \rho},\ 1,\ 0\right)$
$\displaystyle \lambda_3 = \frac{1}{2}\left(\sqrt{4 \rho \sigma+\sigma^2-2 \sigma+1}-\sigma-1\right), \quad v_3 = \left(-\frac{-1+\sigma-\sqrt{1-2 \sigma+4 \rho \sigma+\sigma^2}}{2 \rho},\ 1,\ 0\right)$
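The closed-form eigenvalues can be cross-checked against a numerical eigensolver (a sketch; $\sigma=10$, $\beta=8/3$, $\rho=1/2$ are sample values assumed only for illustration):

```python
# Cross-check of the closed-form eigenvalues against numpy for sample
# parameter values (sigma=10, beta=8/3, rho=0.5 are assumptions chosen
# only for illustration).
import numpy as np

sigma, beta, rho = 10.0, 8.0 / 3.0, 0.5

J0 = np.array([[-sigma, sigma, 0.0],
               [rho,    -1.0,  0.0],
               [0.0,    0.0,   -beta]])

disc = np.sqrt(4 * rho * sigma + sigma**2 - 2 * sigma + 1)
closed_form = sorted([-beta, (-disc - sigma - 1) / 2, (disc - sigma - 1) / 2])
numeric = sorted(np.linalg.eigvals(J0).real)
print(closed_form, numeric)   # the two lists agree
```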
Of course those $\lambda$ are the eigenvalues we use to do a stability analysis.
To study the stability there we would need some rather involved center manifold theory, so I am going to stay away from that. It leads to a pitchfork bifurcation at the origin at $\rho=1$.
It is worth noting that, for $\rho>1$, there is a pair of further fixed points at $\left(\pm\sqrt{\beta(\rho-1)},\ \pm\sqrt{\beta(\rho-1)},\ \rho-1\right)$.
To find these, notice that the first equation gives us $y=x$; substituting this into the second equation gives $x(\rho - 1) - xz = 0 \rightarrow z = \rho-1$ (assuming $x \neq 0$), and substituting these two values into the third gives $x^2 = \beta z \rightarrow x = \pm \sqrt{\beta(\rho - 1)}$.
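Substituting this pair of fixed points back into the right-hand side confirms them numerically (a sketch; the parameter values, with $\rho>1$ so that the pair exists, are assumptions chosen only for illustration):

```python
# Substituting the pair of nonzero fixed points back into the right-hand
# side; sample values sigma=10, beta=8/3, rho=28 (rho > 1) are assumed.
import numpy as np

sigma, beta, rho = 10.0, 8.0 / 3.0, 28.0

residuals = []
for sgn in (+1.0, -1.0):
    x = sgn * np.sqrt(beta * (rho - 1))
    y, z = x, rho - 1
    rhs = (sigma * (y - x), rho * x - y - x * z, -beta * z + x * y)
    residuals.append(max(abs(c) for c in rhs))
print(residuals)   # both residuals are ~0 (up to rounding)
```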
To simplify matters, let's linearize the system about the origin by removing the nonlinear terms and analyze the simpler system.
The linearization (eliminate the $xy$ and $xz$ nonlinear terms from the system) is given by:
$$x' = \sigma(y-x)$$
$$y' = \rho x - y$$
$$z' = -\beta z$$
If you look at the equation for $z'$, you notice that it is decoupled from the other two equations and it can be solved outright as $z(t) = c_1 e^{-\beta t}$ and $z(t) \rightarrow 0$ exponentially fast. The other two directions are governed by the system:
$$\begin{bmatrix}x'\\y'\end{bmatrix} = \begin{bmatrix}-\sigma & \sigma\\\rho & -1\end{bmatrix}\begin{bmatrix}x\\y\end{bmatrix}$$
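For this $2\times 2$ system the trace is $-(\sigma+1)<0$ and the determinant is $\sigma(1-\rho)$, so the origin loses stability exactly at $\rho=1$. A small numerical check ($\sigma=10$ is an assumed sample value):

```python
# Largest real part of the eigenvalues of the 2x2 linearization, as a
# function of rho (sigma = 10 is an assumed sample value).
import numpy as np

sigma = 10.0

def max_real_eig(rho):
    A = np.array([[-sigma, sigma], [rho, -1.0]])
    return np.linalg.eigvals(A).real.max()

print(max_real_eig(0.5))   # negative: origin stable for rho < 1
print(max_real_eig(2.0))   # positive: origin unstable for rho > 1
```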
You should be able to analyze this system from all the data provided now.
As you can see, these systems are very complex and include a rich set of mathematics to deal with, so it takes time to get there.
Here is another guide with some of the details for you to work out.