
Controlled Markov Processes and Viscosity Solutions

Authors: Wendell H. Fleming, Halil Mete Soner
Language: English | Hardback – 17 Nov 2005
Presents an introduction to optimal stochastic control for continuous-time Markov processes and the theory of viscosity solutions. The book covers dynamic programming for deterministic optimal control problems, as well as the corresponding theory of viscosity solutions.

All formats and editions:

Paperback (1): Springer, 19 Nov 2010 – 1071.62 lei, delivery 6-8 weeks
Hardback (1): Springer, 17 Nov 2005 – 1078.74 lei, delivery 6-8 weeks

Price: 1078.74 lei

Old price: 1315.54 lei
-18% New

Express points: 1618

Estimated price in other currencies:
190.86€ / 222.69$ / 166.86£

Book printed on demand

Economy delivery: 17-31 January 26

Order line: 021 569.72.76

Specifications

ISBN-13: 9780387260457
ISBN-10: 0387260455
Pages: 448
Illustrations: XVII, 429 p.
Dimensions: 160 x 241 x 30 mm
Weight: 0.83 kg
Edition: Second Edition 2006
Publisher: Springer
Place of publication: New York, NY, United States

Target audience

Research

Contents

Deterministic Optimal Control
Viscosity Solutions
Optimal Control of Markov Processes: Classical Solutions
Controlled Markov Diffusions in ℝⁿ
Viscosity Solutions: Second-Order Case
Logarithmic Transformations and Risk Sensitivity
Singular Perturbations
Singular Stochastic Control
Finite Difference Numerical Approximations
Applications to Finance
Differential Games

Back cover text

This book is intended as an introduction to optimal stochastic control for continuous-time Markov processes and to the theory of viscosity solutions. The authors approach stochastic control problems by the method of dynamic programming. The fundamental equation of dynamic programming is a nonlinear evolution equation for the value function. For controlled Markov diffusion processes, this becomes a nonlinear partial differential equation of second order, called a Hamilton-Jacobi-Bellman (HJB) equation. Typically, the value function is not smooth enough to satisfy the HJB equation in a classical sense. Viscosity solutions provide a framework in which to study HJB equations, and to prove continuous dependence of solutions on problem data. The theory is illustrated by applications from engineering, management science, and financial economics.
In this second edition, new material on applications to mathematical finance has been added. Concise introductions to risk-sensitive control theory, nonlinear H-infinity control and differential games are also included.
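The HJB equation described above can be sketched in one standard form (the notation here is illustrative, not the book's exact conventions): for a controlled diffusion dX_s = b(X_s, u_s) ds + σ(X_s, u_s) dW_s with running cost L and terminal cost ψ, the value function V(t, x) formally satisfies

```latex
% Hamilton-Jacobi-Bellman equation for a controlled diffusion (sketch;
% b = drift, sigma = diffusion coefficient, L = running cost, psi = terminal cost)
\begin{aligned}
-\frac{\partial V}{\partial t}(t,x)
  &= \min_{u \in U} \Big\{ b(x,u) \cdot D_x V(t,x)
     + \tfrac{1}{2}\,\mathrm{tr}\!\big(\sigma(x,u)\sigma(x,u)^{\top} D_x^2 V(t,x)\big)
     + L(x,u) \Big\}, \\
V(T,x) &= \psi(x).
\end{aligned}
```

The minimization over the control set U makes the equation fully nonlinear, and the second-order term coming from the diffusion is why the value function generally fails to be smooth; this is the gap the viscosity-solution framework is designed to close.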
Review of the earlier edition:
"This book is highly recommended to anyone who wishes to learn the dynamic programming principle applied to optimal stochastic control for diffusion processes. Without any doubt, this is a fine book and most likely it is going to become a classic in the area..."
SIAM Review, 1994

Features

Provides a lucid introduction to optimal stochastic control for continuous-time Markov processes and to the theory of viscosity solutions
Also offers a concise introduction to risk-sensitive control theory, nonlinear H-infinity control, and differential games
Several all-new chapters have been added, and others completely rewritten
For the second edition, new material has been added on applications to mathematical finance