Article

Markov decision processes with burstiness constraints

Journal

European Journal of Operational Research
Volume 312, Issue 3, Pages 877-889

Publisher

Elsevier
DOI: 10.1016/j.ejor.2023.07.045

Keywords

Dynamic programming; Constrained Markov decision processes; Burstiness constraints


This paper studies a Markov Decision Process (MDP) model subject to (σ, ρ)-burstiness constraints over a finite or infinite horizon, and formulates the corresponding constrained optimization problems. By writing the constraints in recursive form, an augmented-state model is obtained in which Markov or stationary policies are again sufficient, so that standard constrained-MDP theory applies.
We consider a Markov Decision Process (MDP), over a finite or infinite horizon, augmented by so-called (σ, ρ)-burstiness constraints. Such constraints, originally introduced within the framework of network calculus, limit some additive quantity to a given rate over any time interval, plus a term that allows for occasional and limited bursts. We introduce this class of constraints for MDP models and formulate the corresponding constrained optimization problems. Due to the burstiness constraints, constrained optimal policies are generally history-dependent. We use a recursive form of the constraints to define an augmented-state model, for which sufficiency of Markov or stationary policies is recovered and the standard theory may be applied, albeit over a larger state space. The analysis is mainly devoted to a characterization of feasible policies, followed by application to the constrained MDP optimization problem. A simple queuing example serves to illustrate some of the concepts and calculations involved. (c) 2023 Elsevier B.V. All rights reserved.
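To make the constraint concrete, the sketch below contrasts the interval form of a (σ, ρ)-burstiness constraint with a recursive form of the kind the abstract alludes to. This is an illustration only, not the paper's code or notation: the function names, the discrete-time convention (windows of length t - s + 1), and the nonnegative clamp are assumptions of this sketch.

```python
import random

def check_interval_form(c, sigma, rho):
    """Interval form: sum(c[s..t]) <= sigma + rho*(t - s + 1) for every window [s, t].

    Directly checks all O(n^2) time windows of the additive quantity c.
    """
    n = len(c)
    for s in range(n):
        total = 0.0
        for t in range(s, n):
            total += c[t]
            # Small tolerance guards against float accumulation differences.
            if total > sigma + rho * (t - s + 1) + 1e-12:
                return False
    return True

def check_recursive_form(c, sigma, rho):
    """Recursive form: b_t = max(b_{t-1} + c_t - rho, 0) must stay <= sigma.

    b_t is the largest windowed excess over the rate rho ending at time t,
    so carrying b_t as an extra state component makes the constraint
    depend only on the current (state, b_t) pair rather than on history.
    """
    b = 0.0
    for x in c:
        b = max(b + x - rho, 0.0)
        if b > sigma + 1e-12:
            return False
    return True

if __name__ == "__main__":
    # Sanity check: the two formulations agree on random sequences.
    random.seed(0)
    for _ in range(1000):
        c = [random.uniform(0.0, 2.0) for _ in range(20)]
        assert check_interval_form(c, sigma=1.5, rho=1.0) == \
               check_recursive_form(c, sigma=1.5, rho=1.0)
    print("interval and recursive checks agree on all sampled sequences")
```

The one-step recursion is the point: because the running excess b_t summarizes everything the constraint needs from the past, appending it to the MDP state yields a model in which feasibility is a function of the current augmented state, consistent with the abstract's claim that sufficiency of Markov or stationary policies is recovered over a larger state space.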


