Journal
CMC-COMPUTERS MATERIALS & CONTINUA
Volume 71, Issue 2, Pages 2225-2247
Publisher
TECH SCIENCE PRESS
DOI: 10.32604/cmc.2022.022952
Keywords
Artificial intelligence; traffic light control; traffic disruptions; multi-agent deep Q-network; deep reinforcement learning
Funding
- Research Creativity and Management Office, Universiti Sains Malaysia
This paper investigates the use of a multi-agent deep Q-network (MADQN) to address the curse of dimensionality that arises in the traditional multi-agent reinforcement learning (MARL) approach. The proposed MADQN is applied to traffic light controllers at multiple intersections with busy traffic and traffic disruptions, particularly rainfall. MADQN is based on the deep Q-network (DQN), which integrates traditional reinforcement learning (RL) with the newly emerging deep learning (DL) approach. MADQN enables traffic light controllers to learn, exchange knowledge with neighboring agents, and select optimal joint actions in a collaborative manner. A case study based on a real traffic network is conducted as part of a sustainable urban city project in Sunway City, Kuala Lumpur, Malaysia. An investigation is also performed on a grid traffic network (GTN) to confirm that the proposed scheme is effective in a traditional traffic network. Our proposed scheme is evaluated using two simulation tools, namely Matlab and Simulation of Urban Mobility (SUMO). In the simulations, our proposed scheme reduces the cumulative delay of vehicles by up to 30%.
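To make the DQN building block concrete, the sketch below shows a single-intersection Q-learning agent with a small neural Q-function, epsilon-greedy action selection, and one-step temporal-difference updates. This is a minimal illustration, not the authors' implementation: the state (queue lengths per approach), action set (two signal phases), reward (negative total queue), network sizes, and the random stand-in for the simulator are all assumptions; the paper's full scheme is multi-agent, exchanges knowledge between neighboring controllers, and is driven by SUMO traffic feedback.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a 4-approach intersection observed as queue
# lengths per approach (state), choosing between 2 signal phases (actions).
STATE_DIM, N_ACTIONS, HIDDEN = 4, 2, 16
GAMMA, LR, EPSILON = 0.95, 0.01, 0.1

# One-hidden-layer Q-network: Q(s) = W2 @ relu(W1 @ s + b1) + b2
W1 = rng.normal(0, 0.1, (HIDDEN, STATE_DIM)); b1 = np.zeros(HIDDEN)
W2 = rng.normal(0, 0.1, (N_ACTIONS, HIDDEN)); b2 = np.zeros(N_ACTIONS)

def q_values(s):
    h = np.maximum(W1 @ s + b1, 0.0)          # ReLU hidden layer
    return W2 @ h + b2, h

def select_action(s):
    # Epsilon-greedy exploration over the phase set
    if rng.random() < EPSILON:
        return int(rng.integers(N_ACTIONS))
    q, _ = q_values(s)
    return int(np.argmax(q))

def td_update(s, a, r, s_next):
    # One gradient step on the squared TD error for (s, a, r, s')
    global W1, b1, W2, b2
    q, h = q_values(s)
    q_next, _ = q_values(s_next)
    target = r + GAMMA * np.max(q_next)
    err = q[a] - target                        # TD error
    # Backpropagate only through the selected action's output
    gW2 = np.zeros_like(W2); gW2[a] = err * h
    gb2 = np.zeros_like(b2); gb2[a] = err
    dh = err * W2[a] * (h > 0)                 # ReLU gradient
    W2 -= LR * gW2; b2 -= LR * gb2
    W1 -= LR * np.outer(dh, s); b1 -= LR * dh

# Toy interaction loop: reward is the negative total queue length, so the
# agent is pushed toward phase choices that reduce waiting vehicles.
s = rng.random(STATE_DIM)
for _ in range(100):
    a = select_action(s)
    s_next = rng.random(STATE_DIM)            # stand-in for SUMO feedback
    r = -float(s_next.sum())
    td_update(s, a, r, s_next)
    s = s_next
```

Extending this toward the paper's MADQN would mean running one such agent per intersection and sharing Q-value or state information with neighboring agents so that joint actions are selected collaboratively rather than independently.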