Session
Control and Testing
Abstract
This paper studies 'model-free' supervisory engine control for a heavy-duty plug-in hybrid electric vehicle (PHEV), in which an artificial intelligence (AI) agent determines the engine power demand from observations of the PHEV's power demand and the battery state of charge (SoC). A new predictive double Q-learning (PDQL) scheme is proposed to train the AI agent to continuously improve the vehicle's energy efficiency. The PDQL differs from standard double Q-learning (SDQL) in that it adopts a new mechanism that uses a backup model (Q-table) for robust reward prediction instead of relying directly on real-world measurements. At every time step, the agent randomly selects one Q-table (A or B) for action execution (engine control) and uses the other as the backup model. Both Q-tables are then updated using the quantified PHEV performance (i.e., a cost function of energy consumption and remaining battery SoC) predicted by the backup model. This mechanism prevents bias from real-world measurements and speeds up learning convergence. Using the SDQL as the baseline, experimental evaluations are conducted on software-in-the-loop (SiL) and hardware-in-the-loop (HiL) platforms. The results show that the PDQL converges faster, achieving 3.34% higher energy efficiency than the SDQL after seven learning rounds, and it still delivers more than 1.67% higher energy efficiency than the SDQL after 35 learning rounds.
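To make the two-Q-table mechanism concrete, the following is a minimal double-Q-style sketch in Python. It is purely illustrative: the state/action encoding, hyperparameter values, and class/method names are all assumptions, and the sketch uses the measured reward directly rather than the paper's backup-model reward prediction, which would require the authors' PHEV performance model.

```python
import random
from collections import defaultdict

class DoubleQAgent:
    """Illustrative double Q-learning agent with two Q-tables (A and B).

    At each update, one table is randomly chosen for the update while the
    other serves as the 'backup' that evaluates the bootstrapped target,
    loosely mirroring the A/B table roles described in the abstract.
    """

    def __init__(self, actions, alpha=0.1, gamma=0.95, epsilon=0.1):
        self.QA = defaultdict(float)  # Q-table A: (state, action) -> value
        self.QB = defaultdict(float)  # Q-table B: (state, action) -> value
        self.actions = list(actions)
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def act(self, state):
        # Epsilon-greedy action selection over the sum of both tables.
        if random.random() < self.epsilon:
            return random.choice(self.actions)
        return max(self.actions,
                   key=lambda a: self.QA[(state, a)] + self.QB[(state, a)])

    def update(self, s, a, r, s_next):
        # Randomly pick the table to update; the other one acts as the
        # backup model that scores the greedy next action.
        if random.random() < 0.5:
            q, backup = self.QA, self.QB
        else:
            q, backup = self.QB, self.QA
        a_star = max(self.actions, key=lambda x: q[(s_next, x)])
        target = r + self.gamma * backup[(s_next, a_star)]
        q[(s, a)] += self.alpha * (target - q[(s, a)])
```

In a supervisory-control setting, the state would encode the observed power demand and battery SoC, and the actions would be discretized engine power levels; the paper's PDQL additionally replaces the measured reward `r` with a value predicted by the backup model.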
Keywords
Hybrid electric vehicle; Supervisory engine control; Predictive double Q-learning; Backup model.
The 2nd World Congress on Internal Combustion Engines