Nonlinear control
Nonlinear control theory is the area of control theory which deals with systems that are nonlinear, time-variant, or both. Control theory is an interdisciplinary branch of engineering and mathematics that is concerned with the behavior of dynamical systems with inputs, and how to modify the output by changes in the input using feedback, feedforward, or signal filtering. The system to be controlled is called the "plant". One way to make the output of a system follow a desired reference signal is to compare the output of the plant to the desired output, and provide feedback to the plant to modify the output to bring it closer to the desired output.
Control theory is divided into two branches. Linear control theory applies to systems made of devices which obey the superposition principle; such systems are governed by linear differential equations.
Nonlinear control theory covers a wider class of systems that do not obey the superposition principle. It applies to more real-world systems, because all real control systems are nonlinear. These systems are often governed by nonlinear differential equations.
An example of a nonlinear control system is a thermostat-controlled heating system. A building heating system such as a furnace has a nonlinear response to changes in temperature; it is either "on" or "off", lacking the fine control in response to temperature differences that a proportional (linear) device would have. Therefore, the furnace is off until the temperature falls below the "turn on" setpoint of the thermostat, at which point it turns on. Due to the heat added by the furnace, the temperature increases until it reaches the "turn off" setpoint of the thermostat, which turns the furnace off, and the cycle repeats. This cycling of the temperature about the desired temperature is called a limit cycle, and is characteristic of nonlinear control systems.
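The cycling described above can be illustrated with a short simulation. The building model, setpoints, and heating and cooling rates below are illustrative assumptions, not values from any particular system:

```python
def simulate_thermostat(t_end=5.0, dt=1e-3, on_below=19.0, off_above=21.0,
                        heat_rate=30.0, loss_rate=2.0, outside=10.0, temp0=15.0):
    """Bang-bang thermostat: the furnace is either fully on or fully off."""
    temp, heating, toggles, trace = temp0, True, 0, []
    for _ in range(int(t_end / dt)):
        # Newton cooling toward the outside temperature, plus heater input.
        temp += ((heat_rate if heating else 0.0)
                 - loss_rate * (temp - outside)) * dt
        if heating and temp >= off_above:       # "turn off" setpoint reached
            heating, toggles = False, toggles + 1
        elif not heating and temp <= on_below:  # "turn on" setpoint reached
            heating, toggles = True, toggles + 1
        trace.append(temp)
    return trace, toggles

trace, toggles = simulate_thermostat()
# After the initial warm-up, the temperature oscillates between the two
# setpoints indefinitely instead of settling: a limit cycle.
```

No matter how long the simulation runs, the temperature never converges to a single value; it keeps cycling between the two setpoints.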
Properties of nonlinear systems
Some properties of nonlinear dynamic systems are:
- They do not follow the principle of superposition (linearity and homogeneity).
- They may have multiple isolated equilibrium points.
- They may exhibit properties such as limit cycles, bifurcations, and chaos.
- Finite escape time: Solutions of nonlinear systems may not exist for all times.
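Finite escape time can be seen in the scalar system ẋ = x², whose exact solution x(t) = x₀/(1 − x₀t) blows up at t = 1/x₀. The sketch below (plain Euler integration with an illustrative step size) shows the state exceeding any bound shortly after t = 1 for x₀ = 1, whereas solutions of a linear system exist for all time:

```python
def finite_escape(x0=1.0, dt=1e-4, t_max=2.0, bound=1e6):
    # Euler-integrate xdot = x**2.  The exact solution x(t) = x0/(1 - x0*t)
    # escapes to infinity at t = 1/x0, so the numerical state crosses any
    # fixed bound shortly after t = 1 when x0 = 1.
    x, t = x0, 0.0
    while t < t_max and x < bound:
        x += x * x * dt
        t += dt
    return x, t

x, t_escape = finite_escape()  # x exceeds the bound well before t_max
```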
Analysis and control of nonlinear systems
There are several well-developed techniques for analyzing nonlinear feedback systems:
- Describing function method
- Phase plane method
- Lyapunov stability analysis
- Singular perturbation method
- The Popov criterion and the circle criterion for absolute stability
- Center manifold theorem
- Small-gain theorem
- Passivity analysis
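As a small illustration of Lyapunov stability analysis, consider the scalar system ẋ = −x³. Its linearization at the origin (ẋ = 0) is inconclusive, but the candidate function V(x) = x² gives V̇ = 2x·(−x³) = −2x⁴ < 0 for x ≠ 0, which proves asymptotic stability. The numeric spot-check below only verifies that sign condition on a sample grid; it is a sanity check, not a proof:

```python
f = lambda x: -x ** 3           # system dynamics: xdot = -x**3
V = lambda x: x ** 2            # Lyapunov candidate function
V_dot = lambda x: 2 * x * f(x)  # derivative along trajectories: -2*x**4

# Check V positive definite and V_dot negative definite away from the origin.
samples = [i / 10 for i in range(-50, 51) if i != 0]
positive_definite = all(V(x) > 0 for x in samples)
decreasing = all(V_dot(x) < 0 for x in samples)
```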
Control design techniques for nonlinear systems also exist. These can be subdivided into three broad groups. The first treats the system as linear within a limited range of operation and applies well-known linear design techniques to each region, as in gain scheduling. The second introduces auxiliary nonlinear feedback in such a way that the system can be treated as linear for purposes of control design, as in feedback linearization. The third comprises Lyapunov-based methods such as Lyapunov redesign, backstepping, and sliding mode control.
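As a sketch of the second approach, take a pendulum θ̈ = −(g/ℓ)sin θ + u. The feedback-linearizing control u = (g/ℓ)sin θ − k₁θ − k₂θ̇ cancels the gravity nonlinearity exactly, leaving the linear closed loop θ̈ = −k₁θ − k₂θ̇ whose gains can be chosen with linear methods. The gains, initial condition, and integration scheme below are illustrative assumptions:

```python
import math

def pendulum_fbl(theta0=2.0, omega0=0.0, g_over_l=9.81,
                 k1=4.0, k2=4.0, dt=1e-3, t_end=10.0):
    theta, omega = theta0, omega0
    for _ in range(int(t_end / dt)):
        # Cancel the gravity term, then apply linear state feedback.
        u = g_over_l * math.sin(theta) - k1 * theta - k2 * omega
        # Plant: theta_ddot = -(g/l)*sin(theta) + u, which reduces to the
        # linear closed loop -k1*theta - k2*omega after cancellation.
        alpha = -g_over_l * math.sin(theta) + u
        omega += alpha * dt
        theta += omega * dt
    return theta, omega

theta, omega = pendulum_fbl()  # both states decay toward zero
```

Note that exact cancellation relies on knowing g/ℓ precisely; robustness to model error is one motivation for the Lyapunov-based methods in the third group.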
Nonlinear feedback analysis – The Lur'e problem
An early nonlinear feedback system analysis problem was formulated by A. I. Lur'e. Control systems described by the Lur'e problem have a forward path that is linear and time-invariant, and a feedback path that contains a memory-less, possibly time-varying, static nonlinearity.
The linear part can be characterized by four matrices (A, B, C, D), while the nonlinear part is Φ(y) with Φ(y)/y ∈ [a, b] for all y ≠ 0 (a sector nonlinearity).
Absolute stability problem
Consider a Lur'e system as above, where:
- (A, B) is controllable and (C, A) is observable
- a and b are two real numbers with a < b, defining a sector for the function Φ
The Lur'e problem (also known as the absolute stability problem) is to derive conditions involving only the transfer matrix H(s) and {a,b} such that x = 0 is a globally uniformly asymptotically stable equilibrium of the system.
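The sector condition Φ(y)/y ∈ [a, b] is equivalent to (Φ(y) − ay)(Φ(y) − by) ≤ 0 for all y, which is convenient to check numerically. The sketch below verifies that tanh lies in the sector [0, 1] while y³ does not (it leaves the sector for |y| > 1); the sample grid is an illustrative assumption:

```python
import math

def in_sector(phi, a, b, ys):
    # Sector condition in product form: (phi(y) - a*y)*(phi(y) - b*y) <= 0.
    return all((phi(y) - a * y) * (phi(y) - b * y) <= 0 for y in ys)

ys = [i / 100 for i in range(-500, 501)]
tanh_in_sector = in_sector(math.tanh, 0.0, 1.0, ys)          # True
cubic_in_sector = in_sector(lambda y: y ** 3, 0.0, 1.0, ys)  # False
```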
There are two well-known conjectures on the absolute stability problem that turned out to be false:
- Aizerman's conjecture
- Kalman's conjecture
Graphically, these conjectures can be interpreted as restrictions on the graph of Φ(y) versus y, or on the graph of dΦ/dy versus Φ(y)/y.
There are two main theorems concerning the Lur'e problem which give sufficient conditions for absolute stability:
- The circle criterion (an extension of the Nyquist stability criterion for linear systems)
- The Popov criterion.
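For the sector [0, k], the critical disk of the circle criterion degenerates to the half-plane Re s ≤ −1/k, so for a stable linear part it suffices that Re G(jω) > −1/k for all ω. The sketch below applies this to an illustrative plant G(s) = 1/(s+1)², whose minimum real part on the imaginary axis is analytically −1/8 (attained at ω = √3), giving absolute stability for any sector [0, k] with k < 8. Both the plant and the frequency grid are assumptions for the example:

```python
def re_G(omega):
    # G(s) = 1/(s + 1)**2 evaluated on the imaginary axis s = j*omega.
    return (1 / (1j * omega + 1) ** 2).real

# Minimum of Re G(jw) over a frequency grid; analytically it is -1/8.
omegas = [i / 1000 for i in range(0, 100_001)]
m = min(re_G(w) for w in omegas)
k_max = -1 / m  # any sector [0, k] with k < k_max satisfies the criterion
```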
Theoretical results in nonlinear control
Frobenius theorem
The Frobenius theorem is a deep result in differential geometry. When applied to nonlinear control, it says the following: given a system of the form

ẋ = f₁(x)u₁(t) + ⋯ + fₖ(x)uₖ(t)

where f₁, …, fₖ are vector fields belonging to a distribution Δ and uᵢ(t) are control functions, the integral curves of x are restricted to a manifold of dimension m if span(Δ) = m and Δ is an involutive distribution.
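Involutivity means that the Lie bracket [fᵢ, fⱼ] = (∂fⱼ/∂x)fᵢ − (∂fᵢ/∂x)fⱼ of any two fields in the distribution stays inside the distribution. The sketch below checks this numerically for the classic nonholonomic example f₁ = (1, 0, 0), f₂ = (0, 1, x₁): their bracket is (0, 0, 1), which does not lie in span{f₁, f₂} at the origin, so that distribution is not involutive. The finite-difference step size is an illustrative assumption:

```python
def lie_bracket(f, g, x, h=1e-5):
    """[f, g](x) = Dg(x) f(x) - Df(x) g(x), Jacobians by central differences."""
    n = len(x)
    def jac_times(F, v):
        # Compute the Jacobian of F at x (column by column) applied to v.
        out = [0.0] * n
        for j in range(n):
            xp, xm = list(x), list(x)
            xp[j] += h
            xm[j] -= h
            Fp, Fm = F(xp), F(xm)
            for i in range(n):
                out[i] += (Fp[i] - Fm[i]) / (2 * h) * v[j]
        return out
    dg_f = jac_times(g, f(x))
    df_g = jac_times(f, g(x))
    return [a - b for a, b in zip(dg_f, df_g)]

f1 = lambda x: [1.0, 0.0, 0.0]
f2 = lambda x: [0.0, 1.0, x[0]]
bracket = lie_bracket(f1, f2, [0.0, 0.0, 0.0])
# bracket is approximately [0, 0, 1]: it points out of span{f1, f2},
# so the distribution spanned by f1 and f2 is not involutive.
```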
Further reading
- Lur'e, A. I.; Postnikov, V. N. (1944). "К теории устойчивости регулируемых систем" [On the Theory of Stability of Control Systems]. Prikladnaya Matematika I Mekhanika (in Russian). 8 (3): 246–248.
- Vidyasagar, M. (1993). Nonlinear Systems Analysis (2nd ed.). Englewood Cliffs: Prentice Hall. ISBN 978-0-13-623463-0.
- Isidori, A. (1995). Nonlinear Control Systems (3rd ed.). Berlin: Springer. ISBN 978-3-540-19916-8.
- Khalil, H. K. (2002). Nonlinear Systems (3rd ed.). Upper Saddle River: Prentice Hall. ISBN 978-0-13-067389-3.
- Brogliato, B.; Lozano, R.; Maschke, B.; Egeland, O. (2020). Dissipative Systems Analysis and Control (3rd ed.). London: Springer.
- Leonov G.A.; Kuznetsov N.V. (2011). "Algorithms for Searching for Hidden Oscillations in the Aizerman and Kalman Problems" (PDF). Doklady Mathematics. 84 (1): 475–481. S2CID 120692391.
- Bragin V.O.; Vagaitsev V.I.; Kuznetsov N.V.; Leonov G.A. (2011). "Algorithms for Finding Hidden Oscillations in Nonlinear Systems. The Aizerman and Kalman Conjectures and Chua's Circuits" (PDF). Journal of Computer and Systems Sciences International. 50 (5): 511–543. S2CID 21657305.
- Leonov G.A.; Kuznetsov N.V. (2011). Bittanti, Sergio (ed.). "Analytical-numerical methods for investigation of hidden oscillations in nonlinear control systems" (PDF). IFAC Proceedings Volumes (IFAC-PapersOnline). Proceedings of the 18th IFAC World Congress. 18 (1): 2494–2505. ISBN 9783902661937.