Abstract. In this paper, we study two kinds of singular optimal control (SOC for short) problems in which the systems are governed by forward-backward stochastic differential equations (FBSDEs for short) and the control has two components: the regular control and the singular one. Both the drift and diffusion terms may involve the regular control variable, and the regular control domain is postulated to be convex. Under certain assumptions, in the framework of Malliavin calculus, we derive pointwise second-order necessary conditions for stochastic SOC in the classical sense. These conditions are described by two adjoint processes and a maximum condition on the Hamiltonian, and are supported by an illustrative example. A new necessary condition for optimal singular control is obtained as well. Moreover, as a by-product, a verification theorem for SOCs is derived via viscosity solutions without involving any derivatives of the value function. It is worth pointing out that this theorem has wider applicability than the restrictive classical verification theorems. Finally, we establish the connection between the maximum principle and the dynamic programming principle for such SOC problems without assuming that the value function is smooth.