
Singular optimal controls of stochastic recursive systems and Hamilton–Jacobi–Bellman inequality

In this paper, we study optimal singular control problems for stochastic recursive systems, in which the control has two components: a regular control and a singular control. Under certain assumptions, we establish the dynamic programming principle for this class of optimal singular control problems and prove that the value function is the unique viscosity solution of the corresponding Hamilton–Jacobi–Bellman inequality in a given class of bounded continuous functions. Finally, an example is given for illustration.
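For orientation, the following is a minimal sketch of the generic structure of such a problem; the coefficients b, σ, f, Φ, the matrices G and K, the generator \mathcal{L}^{u}, and the sign conventions are illustrative assumptions and are not taken from the paper itself. The state follows a controlled SDE driven by a regular control u and a nondecreasing singular control ξ, the recursive cost is generated by a backward SDE, and the value function is expected to solve a variational inequality of Hamilton–Jacobi–Bellman type in the viscosity sense:

\[
dX_s = b(s, X_s, u_s)\,ds + \sigma(s, X_s, u_s)\,dW_s + G\,d\xi_s, \qquad X_t = x,
\]
\[
-dY_s = f(s, X_s, Y_s, Z_s, u_s)\,ds + K\,d\xi_s - Z_s\,dW_s, \qquad Y_T = \Phi(X_T),
\]
\[
V(t,x) = \sup_{(u,\,\xi)} Y_t^{\,t,x;u,\xi},
\]
\[
\min\Big\{ -\partial_t V - \sup_{u}\big[\mathcal{L}^{u}V + f\big(t,x,V,\sigma^{\top}D_xV,u\big)\big],\ \min_{1\le i\le d}\big[-G_i^{\top}D_xV - K_i\big] \Big\} = 0,
\]

where \mathcal{L}^{u} denotes the second-order generator of the controlled diffusion and G_i, K_i are the columns of G and the entries of K. The min-structure is why the dynamic programming equation takes the form of an inequality (variational inequality) rather than an equality: at each point either the HJB part or the gradient constraint induced by the singular control is active.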

Keywords: singular controls; stochastic recursive systems; Hamilton–Jacobi–Bellman inequality

Journal Title: Journal of Differential Equations
Year Published: 2019
