Applications of Lyapunov Type Functions for Optimization Problems in Impulsive Control Systems
This paper deals with applications of Lyapunov type functions to optimality conditions for impulsive processes. An impulsive optimal control problem with trajectories of bounded variation and impulsive controls (regular vector measures) is considered. The problem under consideration has two main features. First, the dynamical control system is linear with respect to the impulsive control and may lack the so-called well-posedness property of Frobenius type. Second, there are intermediate state constraints on the one-sided limits of the trajectory at fixed instants of time. Sufficient optimality conditions in the spirit of the canonical Hamilton–Jacobi theory of optimality are presented. These conditions involve sets of Lyapunov type functions, which are strongly monotone solutions of the corresponding proximal Hamilton–Jacobi inequalities. Moreover, we introduce compound Lyapunov type functions (defined piecewise in the variable t), which are better suited to dynamical systems with discontinuous trajectories and intermediate state constraints. Examples illustrating the optimality conditions are discussed.
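To fix notation, the measure-driven dynamics and the strong-monotonicity condition mentioned above can be sketched as follows. This is a schematic illustration under simplifying assumptions: the symbols f, G, mu, K, and the function phi are generic placeholders, not the paper's exact notation, and the paper's precise proximal inequalities may differ.

```latex
% Impulsive control system, linear in the impulsive control:
% the trajectory x(.) has bounded variation and \mu is a regular
% vector measure taking values in a closed convex cone K.
\[
  dx(t) = f\bigl(t, x(t)\bigr)\,dt + G\bigl(t, x(t)\bigr)\,d\mu(t),
  \qquad x(t_0^-) = x_0 .
\]
% A Lyapunov type function \varphi is strongly decreasing if
% t \mapsto \varphi(t, x(t)) is nonincreasing along every admissible
% process. A proximal sufficient condition: for every proximal
% subgradient (\theta, \zeta) \in \partial_P \varphi(t, x),
\[
  \theta + \langle \zeta, f(t,x) \rangle \le 0,
  \qquad
  \langle \zeta, G(t,x)\,v \rangle \le 0 \quad \forall\, v \in K .
\]
```

When the well-posedness property of Frobenius type fails, jumps of the trajectory are typically described by an auxiliary limit system, and the second inequality is then imposed along its states; this is also what motivates the compound (piecewise-in-t) construction at the instants carrying intermediate state constraints.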