Journal
APPLIED MATHEMATICAL MODELLING
Volume 97, Pages 226-243
Publisher
ELSEVIER SCIENCE INC
DOI: 10.1016/j.apm.2021.03.048
Keywords
Stochastic resource-constrained project scheduling; Uncertain resource availability; Markov decision process; Approximate dynamic programming; Rollout policy
Funding
- Humanities and Social Sciences Foundation of the Ministry of Education of China [17YJC630177]
- National Science Foundation of China [71571005, 71602106]
- Natural Science Foundation of Shandong Province of China [ZR2018BG003]
- Science and Technology Research Program for Higher Education of Shandong Province of China [2019KJI006]
The study addresses the stochastic resource-constrained project scheduling problem with uncertain resource availability, proposing a new MDP model and an ADP algorithm that handle insufficient resource capacity, with a proven sequential-improvement property.
We study the stochastic resource-constrained project scheduling problem with uncertain resource availability, called the SRCPSP-URA, and model it as a sequential decision problem. A new Markov decision process (MDP) model is developed for the SRCPSP-URA. It dynamically and adaptively determines not only which activity to start at a stage, but also which activities to interrupt and delay when resource capacity is insufficient. To tackle the curse of dimensionality of an exact solution approach, we devise and implement a rollout-based approximate dynamic programming (ADP) algorithm with a priority-rule heuristic as the base policy, for which a theoretical sequential-improvement property is proved. Computational results show that, with moderately more computational time, our ADP algorithm significantly outperforms the priority-rule heuristics on test instances with up to 120 activities. (C) 2021 Elsevier Inc. All rights reserved.
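The rollout idea described in the abstract can be illustrated with a minimal sketch: at each decision stage, every feasible action is scored by simulating the base policy to completion and the best-scoring action is taken. The sketch below is a generic, hypothetical toy (distinct activity durations, a shortest-processing-time base rule, flow-time cost, and a small random delay standing in for uncertain resource availability); it is not the authors' model, whose states, actions, and interruption logic are far richer.

```python
import random

def rollout_action(state, feasible_actions, base_policy, simulate,
                   n_samples=20, rng=None):
    # One-step rollout: score each feasible action by the average simulated
    # cost of completing the schedule with the base policy, pick the best.
    rng = rng or random.Random(0)
    best_action, best_cost = None, float("inf")
    for a in feasible_actions:
        costs = [simulate(state, a, base_policy, rng)
                 for _ in range(n_samples)]
        mean_cost = sum(costs) / len(costs)
        if mean_cost < best_cost:
            best_action, best_cost = a, mean_cost
    return best_action, best_cost

# Toy base policy: shortest processing time (a simple priority rule).
def spt_policy(remaining):
    return sorted(remaining)

# Toy simulator: start `action` first, finish the rest in base-policy order;
# cost is total completion time plus a small random delay per activity
# (a stand-in for uncertain resource availability).
def simulate(state, action, base_policy, rng):
    rest = list(state)
    rest.remove(action)
    order = [action] + base_policy(rest)
    t, total = 0.0, 0.0
    for d in order:
        t += d + rng.random() * 0.1  # random resource-induced delay
        total += t
    return total
```

Under these toy assumptions, `rollout_action([5, 2, 8], [5, 2, 8], spt_policy, simulate)` selects the shortest activity first, matching the SPT optimum for total completion time; the paper's sequential-improvement result guarantees that, under suitable conditions, such a rollout policy performs at least as well as its base policy.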