Semimonthly

ISSN 1000-1026

CN 32-1180/TP

Charging Path Planning for Electric Vehicles Based on Reinforcement Learning Environment Design Strategy
Author: SONG Yuhang, CHEN Yufan, WEI Yanling, et al.
Affiliation:

1.School of Automation, Southeast University, Nanjing 210096, China;2.Key Laboratory of Measurement and Control of Complex System of Engineering,Southeast University, Nanjing 210096, China;3.School of Electrical Engineering, Southeast University, Nanjing 210096, China

Abstract:

An environment modeling method suited to reinforcement learning is proposed for the charging path planning problem of electric vehicles. Based on the layout of actual urban road networks and the geographical distribution of charging stations, the method represents the basic driving path of an electric vehicle as three segments. On top of this three-segment representation, design schemes for the state space, action space, state transition, and reward function are proposed. Charging path planning is then modeled as a Markov decision process and solved with the Q-learning method and the deep Q-network (DQN) method. Experimental results show that the reinforcement learning environment designed on the three-segment representation is both solvable and portable. It accounts for realistic scenarios such as the deceleration and turning of an electric vehicle when driving from the road into a charging station, and it simplifies the charging action into a choice of driving direction, which improves the efficiency of the reinforcement learning algorithms based on Q-learning and DQN.
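To illustrate the general solution pattern the abstract describes (road-network states, driving-direction actions, reward shaping, tabular Q-learning), the following is a minimal, self-contained sketch on a hypothetical grid road network with one charging station. The grid size, station location, and reward values are illustrative assumptions, not the paper's three-segment environment or its actual reward design.

```python
import random

# Hypothetical 5x5 road grid; one charging station at (4, 4).
# Generic tabular Q-learning sketch, NOT the paper's exact environment.
GRID = 5
STATIONS = {(4, 4)}
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # four driving directions

def step(state, a):
    """State transition: move one cell in the chosen direction, clipped to the grid."""
    r = min(max(state[0] + ACTIONS[a][0], 0), GRID - 1)
    c = min(max(state[1] + ACTIONS[a][1], 0), GRID - 1)
    nxt = (r, c)
    if nxt in STATIONS:
        return nxt, 10.0, True   # reached a charging station (terminal reward)
    return nxt, -1.0, False      # per-step driving cost

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning with an epsilon-greedy behavior policy."""
    rng = random.Random(seed)
    Q = {(r, c): [0.0] * len(ACTIONS)
         for r in range(GRID) for c in range(GRID)}
    for _ in range(episodes):
        s, done = (0, 0), False
        while not done:
            if rng.random() < eps:
                a = rng.randrange(len(ACTIONS))
            else:
                a = max(range(len(ACTIONS)), key=lambda i: Q[s][i])
            s2, rwd, done = step(s, a)
            target = rwd + (0.0 if done else gamma * max(Q[s2]))
            Q[s][a] += alpha * (target - Q[s][a])  # Q-learning update
            s = s2
    return Q

def greedy_path(Q, start=(0, 0), limit=50):
    """Roll out the learned greedy policy to extract a charging path."""
    s, path = start, [start]
    while s not in STATIONS and len(path) < limit:
        a = max(range(len(ACTIONS)), key=lambda i: Q[s][i])
        s, _, _ = step(s, a)
        path.append(s)
    return path
```

In the paper's setting, DQN would replace the Q table with a neural network over the same state and action design; the environment interface (`step`-style transition and reward) stays the same.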

Keywords:

Foundation:

This work is supported by the National Key R&D Program of China (No. 2021YFB2501600).

Get Citation
[1]SONG Yuhang, CHEN Yufan, WEI Yanling, et al. Charging Path Planning for Electric Vehicles Based on Reinforcement Learning Environment Design Strategy[J]. Automation of Electric Power Systems,2024,48(11):184-196. DOI:10.7500/AEPS20230621004
History
  • Received:June 21,2023
  • Revised:November 16,2023
  • Accepted:November 20,2023
  • Online: May 31,2024
  • Published: