Journal
HEALTH CARE MANAGEMENT SCIENCE
Volume 21, Issue 4, Pages 604-631
Publisher
SPRINGER
DOI: 10.1007/s10729-017-9415-5
Keywords
Medical decision making; Markov decision processes; Hepatitis C virus; Optimal stopping; Dynamic programming
We develop a general framework for optimal health policy design in a dynamic setting. We consider a hypothetical medical intervention for a cohort of patients in which one parameter varies across cohorts with imperfectly observable linear dynamics. We seek to identify the optimal time to change the current health intervention policy and the optimal time to collect decision-relevant information. We formulate this problem as a discrete-time, infinite-horizon Markov decision process and establish structural properties in terms of first- and second-order monotonicity. We demonstrate that it is generally optimal to delay information acquisition until an effect on decisions is sufficiently likely. We apply this framework to the evaluation of hepatitis C virus (HCV) screening in the general population, determining which birth cohorts to screen for HCV and when to collect information about HCV prevalence.
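The abstract describes a discrete-time, infinite-horizon Markov decision process with an optimal-stopping structure and monotone (threshold) optimal policies. The following is a minimal illustrative sketch of that general technique, not the paper's model: the state space, transition dynamics, costs, and discount factor below are all hypothetical numbers chosen only to exhibit a threshold stopping region over discretized prevalence levels.

```python
import numpy as np

# Illustrative value iteration for a discrete-time, infinite-horizon
# optimal-stopping MDP (hypothetical numbers, not the paper's model).
# States discretize a prevalence parameter; each period we either
# CONTINUE the current policy or STOP and switch interventions,
# incurring a one-time switching cost.

n_states = 11                      # discretized prevalence levels 0.0 .. 1.0
prevalence = np.linspace(0, 1, n_states)
gamma = 0.95                       # discount factor

# Hypothetical linear drift toward higher prevalence, with noise:
P = np.zeros((n_states, n_states))
for s in range(n_states):
    P[s, s] = 0.6
    P[s, min(s + 1, n_states - 1)] += 0.3
    P[s, max(s - 1, 0)] += 0.1

r_continue = -prevalence            # per-period cost grows with prevalence
r_stop = -0.5 + 0.0 * prevalence    # one-time switching cost, then 0 forever

V = np.zeros(n_states)
for _ in range(1000):               # value iteration to a fixed point
    V_new = np.maximum(r_stop, r_continue + gamma * P @ V)
    if np.max(np.abs(V_new - V)) < 1e-10:
        break
    V = V_new

stop_region = r_stop >= r_continue + gamma * P @ V
# With these numbers the stopping region is an upper set of prevalence
# levels, mirroring the threshold (monotone) structure in the abstract.
```

Under these assumed dynamics and costs, stopping is optimal exactly above some prevalence threshold; the paper's contribution is establishing when such monotone structure holds in general, including the timing of costly information acquisition, which this sketch omits.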