E3S Web Conf., Volume 229, 2021
The 3rd International Conference of Computer Science and Renewable Energies (ICCSRE'2020)
Article Number: 01047
Number of pages: 11
DOI: https://doi.org/10.1051/e3sconf/202122901047
Published online: 25 January 2021
Discounted Markov Decision Processes with Constrained Costs: the decomposition approach
1 Laboratory TIAD, Faculty of Sciences and Techniques, Sultan Moulay Slimane University, Campus Mghilla, Beni Mellal, Morocco
2 Laboratory of Computer Sciences, Faculty of Sciences Kenitra, Ibn Tofail University, Morocco
* Corresponding author: abd_semmouri@yahoo.fr
In this paper we consider constrained optimization of discrete-time Markov Decision Processes (MDPs) with finite state and action spaces, which accumulate both a reward and costs at each decision epoch. We study the problem of finding a policy that maximizes the expected total discounted reward subject to the constraint that the expected total discounted costs do not exceed given values. To this end, we investigate a decomposition of the state space into strongly communicating classes for computing an optimal or a nearly optimal stationary policy. The discounted criterion has applications in many areas, such as forest management, energy consumption management, finance, communication systems (mobile networks), and artificial intelligence.
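The constrained problem described in the abstract is commonly cast as a linear program over discounted occupation measures: variables x(s, a) encode the expected discounted time spent taking action a in state s, flow constraints tie them to the dynamics and initial distribution, and the cost bound becomes a linear inequality. The sketch below illustrates this standard formulation on a hypothetical toy instance (all numbers are illustrative, not from the paper), and it does not implement the paper's decomposition into strongly communicating classes.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy instance: 2 states, 2 actions (numbers are illustrative).
S, A, gamma = 2, 2, 0.9
# P[s, a, s'] : transition probabilities
P = np.array([[[0.8, 0.2], [0.3, 0.7]],
              [[0.5, 0.5], [0.1, 0.9]]])
r = np.array([[1.0, 0.5], [0.0, 2.0]])   # rewards r(s, a)
c = np.array([[0.0, 1.0], [0.5, 2.0]])   # costs   c(s, a)
d = 5.0                                  # discounted-cost budget
alpha = np.array([0.5, 0.5])             # initial state distribution

# Occupation-measure LP: variables x(s, a) >= 0, flattened to length S*A.
# Flow constraints: sum_a x(s', a) - gamma * sum_{s,a} P(s'|s,a) x(s,a) = alpha(s')
A_eq = np.zeros((S, S * A))
for sp in range(S):
    for s in range(S):
        for a in range(A):
            A_eq[sp, s * A + a] = (1.0 if s == sp else 0.0) - gamma * P[s, a, sp]
b_eq = alpha

# Cost constraint: sum_{s,a} c(s,a) x(s,a) <= d
A_ub = c.reshape(1, -1)
b_ub = np.array([d])

# linprog minimizes, so negate the reward objective to maximize it.
res = linprog(-r.reshape(-1), A_ub=A_ub, b_ub=b_ub,
              A_eq=A_eq, b_eq=b_eq, bounds=(0, None))

# Recover a (possibly randomized) stationary policy from the occupation measure.
x = res.x.reshape(S, A)
policy = x / x.sum(axis=1, keepdims=True)
print("optimal discounted reward:", -res.fun)
print("stationary policy:\n", policy)
```

Summing the flow constraints shows the total occupation mass equals 1/(1 - gamma), and an optimal basic solution of this LP induces a stationary policy that randomizes in at most as many states as there are cost constraints.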
© The Authors, published by EDP Sciences, 2021
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.