Stability of closed-loop nonlinear system

control theory, nonlinear system, stability-theory

I would like to understand whether it is possible to demonstrate the stability of this closed-loop nonlinear system:

$$a b -K_1 u = a K_2 \dot{a}$$

where $a$ is the variable I am trying to control, $u$ is the control law, and the nonlinearity comes from the fact that $b=f(a)$ and from the product of $a$ and $\dot{a}$ on the right-hand side of the equation.

The control law I am using is

$$ u = \frac{\left[ k_p(a^*-a) + k_i\psi + b \right] a}{K_1}, \qquad \dot{\psi} = a^*-a, $$

where $k_p$ and $k_i$ are the gains of a proportional-integral (PI) controller, $a^*$ is the constant setpoint, and $\psi$ is the integrator state.
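For concreteness, here is a minimal simulation sketch of this closed loop. The nonlinearity $f(a)$, the plant constants, the setpoint, and the gain values are all placeholder assumptions, with the gain signs chosen to satisfy the stability condition derived in the answer below.

```python
# Minimal sketch of the closed loop a*b - K1*u = a*K2*da/dt with b = f(a).
# f(a), K1, K2, the gains and the setpoint are assumed values for illustration.

K1, K2 = 2.0, 1.0            # assumed plant constants
kp, ki = -2.0, -1.0          # signs chosen so that kp/K2 < 0 and ki/K2 < 0
a_star = 1.5                 # constant setpoint a*
f = lambda a: a**2           # assumed nonlinearity b = f(a)

dt, steps = 1e-3, 10_000
a, psi = 0.5, 0.0            # initial state (a != 0, since the plant divides by a)
for _ in range(steps):
    b = f(a)
    u = (kp * (a_star - a) + ki * psi + b) * a / K1   # proposed control law
    a_dot = (a * b - K1 * u) / (a * K2)               # plant solved for da/dt
    a += dt * a_dot                                   # forward-Euler integration
    psi += dt * (a_star - a)                          # integrator state

print(f"a after {steps * dt:.0f} s: {a:.4f} (setpoint {a_star})")
```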

I am not a control expert, so I apologise in advance for any imprecision or mistake I might have made in formulating the question.

Best Answer

The proposed control law makes the closed-loop dynamics linear. Substituting the control law into the plant equation, the $ab$ terms cancel, and dividing both sides by $a K_2$ (assuming $a \neq 0$) gives

$$ \dot{a} = -\frac{k_p}{K_2}(a^*-a) - \frac{k_i}{K_2} \psi. $$
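This cancellation can also be checked symbolically; a small sketch assuming SymPy (symbol names are illustrative):

```python
# Symbolic check that substituting u cancels the a*b term and yields the
# linear expression for da/dt shown above.
import sympy as sp

a, b, psi, a_star, K1, K2, kp, ki = sp.symbols('a b psi a_star K_1 K_2 k_p k_i')
u = (kp * (a_star - a) + ki * psi + b) * a / K1   # proposed control law
a_dot = (a * b - K1 * u) / (a * K2)               # plant solved for da/dt
# Expected to simplify to -(k_p*(a_star - a) + k_i*psi)/K_2 (printed form may differ).
print(sp.simplify(a_dot))
```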

Since $\dot{\psi}=a^*-a$ and $a^*$ is constant, the second time derivative of $\psi$ is $\ddot{\psi}=-\dot{a}$. Substituting the expression for $\dot{a}$ and using $a^*-a=\dot{\psi}$ yields

$$ \ddot{\psi} = \frac{k_p}{K_2} \dot{\psi} + \frac{k_i}{K_2} \psi, $$

which is a linear second-order ODE with characteristic polynomial $s^2 - \frac{k_p}{K_2}s - \frac{k_i}{K_2}$. It is asymptotically stable if and only if both coefficients are positive, i.e. $\frac{k_p}{K_2}<0$ and $\frac{k_i}{K_2}<0$. In that case $\psi$ and $\dot{\psi}=a^*-a$ converge to zero, so $a \to a^*$.
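As a quick numerical sanity check of this condition, the roots of the characteristic polynomial can be inspected for assumed gain values:

```python
# Roots of the characteristic polynomial of the psi dynamics above
# (parameter values are assumptions for illustration).
import numpy as np

K2, kp, ki = 1.0, -2.0, -1.0                 # kp/K2 < 0 and ki/K2 < 0
roots = np.roots([1.0, -kp / K2, -ki / K2])  # s^2 - (kp/K2) s - (ki/K2) = 0
print(roots)                                 # both roots have negative real part
```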
