Article

Tutorial on Stochastic Optimization in Energy-Part I: Modeling and Policies

Journal

IEEE TRANSACTIONS ON POWER SYSTEMS
Volume 31, Issue 2, Pages 1459-1467

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TPWRS.2015.2424974

Keywords

Approximate dynamic programming; dynamic programming; energy systems; optimal control; reinforcement learning; robust optimization; stochastic optimization; stochastic programming

Funding

  1. National Science Foundation [ECCS-1127975]
  2. SAP initiative for energy systems research
  3. German Research Foundation

Abstract

There is a wide range of problems in energy systems that require making decisions in the presence of different forms of uncertainty. The fields that address sequential, stochastic decision problems lack a standard canonical modeling framework, with fragmented, competing solution strategies. Recognizing that we will never agree on a single notational system, this two-part tutorial proposes a simple, straightforward canonical model (that is most familiar to people with a control theory background), and introduces four fundamental classes of policies which integrate the competing strategies that have been proposed under names such as control theory, dynamic programming, stochastic programming and robust optimization. Part II of the tutorial illustrates the modeling framework using a simple energy storage problem, where we show that, depending on the problem characteristics, each of the four classes of policies may be best.
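To make the canonical model described in the abstract concrete, the sketch below sets up a toy sequential decision problem for energy storage: a state (storage level and price), a decision (buy or sell), exogenous price information, a transition function, a one-period contribution, and a single simple threshold-style policy evaluated on a simulated price path. This is a minimal illustration only; all names, parameters, and the price process are assumptions for this sketch, not the authors' formulation, notation, or code.

```python
import numpy as np

# Illustrative sketch of a canonical sequential decision model for a toy
# energy storage problem. All quantities and names here are assumptions.

rng = np.random.default_rng(0)

def transition(state, decision, exog_price):
    """Transition function: update storage level with the decision and
    replace the price with the newly observed exogenous price."""
    energy, _ = state
    energy = min(max(energy + decision, 0.0), 1.0)   # keep storage in [0, 1] MWh
    return (energy, exog_price)

def contribution(state, decision):
    """One-period contribution: revenue from selling, cost of buying."""
    _, price = state
    return -price * decision            # decision > 0 buys energy, < 0 sells

def threshold_policy(state, buy_below=25.0, sell_above=40.0, rate=0.2):
    """A simple rule-based policy: buy when the price is low, sell when high."""
    energy, price = state
    if price <= buy_below and energy < 1.0:
        return min(rate, 1.0 - energy)
    if price >= sell_above and energy > 0.0:
        return -min(rate, energy)
    return 0.0

def simulate(policy, T=100):
    """Evaluate a policy on one sample path of exogenous prices."""
    state, total = (0.5, 30.0), 0.0
    for _ in range(T):
        x = policy(state)
        total += contribution(state, x)
        price = max(5.0, state[1] + rng.normal(0.0, 5.0))   # toy random-walk price
        state = transition(state, x, price)
    return total

print("sample-path contribution:", simulate(threshold_policy))
```

In this framing, comparing policies amounts to swapping the policy function passed to the simulator, which is the kind of like-for-like evaluation the tutorial's Part II performs on its energy storage example.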
