Numerical Methods for Stochastic Control Problems in Continuous Time (Stochastic Modelling and Applied Probability)

ISBN-10: 1468404431
ISBN-13: 9781468404432
Edition: Reprint
Released: Jan 01, 2012
Publisher: Springer
Format: Paperback, 449 pages

Description:

The book presents a comprehensive development of effective numerical methods for stochastic control problems in continuous time. The process models are diffusions, jump diffusions, or reflected diffusions of the type that occur in the majority of current applications. All the usual problem formulations are included, as well as those of more recent interest such as ergodic control, singular control, and the types of reflected diffusions used as models of queueing networks. Convergence of the numerical approximations is proved via the efficient probabilistic methods of weak convergence theory. The methods also apply to the calculation of functionals of uncontrolled processes and to optimal nonlinear filtering. Applications to complex deterministic problems are illustrated via a large class of problems from the calculus of variations.

The general approach is known as the Markov Chain Approximation Method. Essentially all that is required of the approximations are some natural local consistency conditions, and the approximations are consistent with standard methods of numerical analysis. The required background in stochastic processes is surveyed, there is an extensive development of methods of approximation, and a chapter is devoted to computational techniques. The book is written on two levels, that of practice (algorithms and applications) and that of the mathematical development, so the methods and their use should be broadly accessible.
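As a rough illustration of the local consistency idea behind the Markov chain approximation method, the sketch below discretizes a one-dimensional controlled diffusion on a grid and solves the resulting chain's dynamic programming equation by value iteration. The drift, diffusion coefficient, running cost, grid, discount rate, and boundary treatment are illustrative assumptions for this sketch, not examples taken from the book.

```python
# Minimal sketch: Markov chain approximation for dx = b(x,u) dt + sigma(x) dW
# with discounted running cost k(x,u). Transition probabilities are chosen so
# the chain's local mean and variance match the diffusion's (local consistency).
import numpy as np

h = 0.05                                # grid spacing (assumed)
grid = np.arange(-2.0, 2.0 + h, h)      # state grid (assumed bounds)
controls = np.linspace(-1.0, 1.0, 21)   # discretized control set (assumed)
beta = 0.5                              # discount rate (assumed)

def b(x, u):     return u - x           # example controlled drift
def sigma(x):    return 0.5             # example diffusion coefficient
def k(x, u):     return x**2 + 0.1 * u**2  # example running cost

V = np.zeros_like(grid)
for _ in range(2000):                   # value iteration on the approximating chain
    V_new = np.empty_like(V)
    for i, x in enumerate(grid):
        best = np.inf
        for u in controls:
            drift = b(x, u)
            Q = sigma(x)**2 + h * abs(drift)          # normalizer
            dt = h**2 / Q                             # interpolation interval
            p_up = (0.5 * sigma(x)**2 + h * max(drift, 0.0)) / Q
            p_dn = (0.5 * sigma(x)**2 + h * max(-drift, 0.0)) / Q
            iu = min(i + 1, len(grid) - 1)            # reflect at the grid edges
            idn = max(i - 1, 0)
            val = k(x, u) * dt + np.exp(-beta * dt) * (p_up * V[iu] + p_dn * V[idn])
            best = min(best, val)
        V_new[i] = best
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new
```

With this construction the one-step mean increment is b(x,u)·dt and the variance is approximately sigma(x)²·dt, which is the local consistency requirement; the value iteration then approximates the optimal cost of the original control problem as h shrinks.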