Abstract. This paper concerns optimal impulsive control problems with trajectories of bounded variation. Necessary optimality conditions based on weakly monotone solutions of the Hamilton-Jacobi inequality and on feedback controls are discussed. Particular attention is paid to necessary optimality conditions with feedback controls, called the feedback minimum principle. The latter generalizes the corresponding principle for classical optimal control problems and is formulated in terms of the Pontryagin Maximum Principle. An example illustrating these results is considered.