Approximate Dynamic Programming: Health Care Applications

Abstract

This dissertation considers approximate solutions to Markov decision problems formulated within the dynamic programming framework, in two health care applications. Dynamic formulations are appropriate for problems that require optimization over time under a variety of scenarios and policies. Many health care applications fit this description, yet because of the curses of dimensionality, exact solutions are often unavailable, which motivates approximate analysis to find near-optimal solutions. To assess the quality of an approximation, additional evidence such as bounds, consistency analysis, or evaluation of asymptotic behavior is required. Emergency vehicle management and dose-finding clinical trials are the two health care applications considered here to investigate dynamic formulations, approximate solutions, and solution quality assessments. Dynamic programming formulations are presented for real-time ambulance dispatching and relocation policies, response-adaptive dose-finding clinical trials, and optimal stopping of adaptive clinical trials. Approximate solutions are derived by several methods, including basis-function regression, a one-step look-ahead policy, a simulation-based gridding algorithm, and diffusion approximation. Finally, bounds on the optimality gap and a proof of consistency for the approximate solutions are presented to ensure the quality of approximation.
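
The abstract names basis-function regression and a one-step look-ahead policy among the approximation methods. The sketch below is illustrative only and is not drawn from the dissertation: it shows how those two ideas combine on a small hypothetical MDP, where the transition probabilities, rewards, and polynomial basis features are all assumptions made for the example.

```python
import numpy as np

# Minimal sketch: approximate value iteration with a linear basis-function
# value approximation, followed by a one-step look-ahead (greedy) policy.
# The 5-state, 2-action MDP below is a hypothetical stand-in.

rng = np.random.default_rng(0)
n_states, n_actions, gamma = 5, 2, 0.95

# Random transition probabilities P[a, s, s'] and rewards R[a, s] (assumed data).
P = rng.random((n_actions, n_states, n_states))
P /= P.sum(axis=2, keepdims=True)
R = rng.random((n_actions, n_states))

def features(s):
    """Simple polynomial basis phi(s) = (1, x, x^2) with x scaled to [0, 1]."""
    x = s / (n_states - 1)
    return np.array([1.0, x, x**2])

Phi = np.vstack([features(s) for s in range(n_states)])  # (n_states, n_features)
w = np.zeros(Phi.shape[1])                               # basis-function weights

def bellman_backup(v):
    """One-step Bellman backup max_a [R(a, s) + gamma * sum_s' P(s'|s,a) v(s')]."""
    return np.max(R + gamma * P @ v, axis=0)

# Approximate value iteration: project each Bellman backup of the current
# approximation back onto the span of the basis functions via least squares.
for _ in range(200):
    v_hat = Phi @ w
    targets = bellman_backup(v_hat)
    w, *_ = np.linalg.lstsq(Phi, targets, rcond=None)

def one_step_lookahead_policy(s):
    """Greedy action with respect to the approximate value function Phi @ w."""
    q = R[:, s] + gamma * P[:, s, :] @ (Phi @ w)
    return int(np.argmax(q))

policy = [one_step_lookahead_policy(s) for s in range(n_states)]
print("approximate values:", np.round(Phi @ w, 3))
print("look-ahead policy :", policy)
```

In this sketch the least-squares fit plays the role of basis-function regression, and the greedy step over the approximate value function is the one-step look-ahead policy; the dissertation's actual models, bases, and bounds are more elaborate than this toy setup.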
