
    Cost-minimization predictive energy management of a postal-delivery fuel cell electric vehicle with intelligent battery State-of-Charge Planner

    Fuel cell electric vehicles have attracted substantial attention in recent decades due to their high efficiency and zero-emission features, but high operating costs remain the major barrier to their large-scale commercialization. In this context, this paper devises an energy management strategy for an urban postal-delivery fuel cell electric vehicle to mitigate operating cost. First, a data-driven dual-loop spatial-domain battery state-of-charge reference estimator is designed to guide battery energy depletion; it is trained on real-world driving data collected during postal delivery missions. Then, a fuzzy C-means clustering-enhanced Markov speed predictor is constructed to project the upcoming velocity. Lastly, combining the state-of-charge reference and the forecasted speed, a model predictive control-based cost-optimization energy management strategy is established to mitigate the vehicle operating costs imposed by energy consumption and power-source degradation. Validation results show that 1) the proposed strategy mitigates the operating cost by 4.43% and 7.30% on average versus benchmark strategies, demonstrating its superiority in terms of cost reduction, and 2) the computational burden per step of the proposed strategy averages 0.123 ms, well below the 1 s sampling interval, proving its potential for real-time application.
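The Markov speed predictor described above can be illustrated with a minimal first-order sketch: discretise speed into bins, count observed bin-to-bin transitions from driving data, and predict the next speed as the expectation under the learned transition probabilities. This is a simplified stand-in for the paper's fuzzy C-means clustering-enhanced version; the bin count, speed range, and trace are illustrative assumptions.

```python
import numpy as np

def build_transition_matrix(speeds, n_bins=10, v_max=20.0):
    """Count bin-to-bin transitions in a speed trace (m/s)."""
    bins = np.minimum((np.asarray(speeds) / v_max * n_bins).astype(int),
                      n_bins - 1)
    T = np.ones((n_bins, n_bins))          # Laplace smoothing
    for a, b in zip(bins[:-1], bins[1:]):
        T[a, b] += 1.0
    return T / T.sum(axis=1, keepdims=True)  # rows are probability vectors

def predict_next_speed(T, v_now, n_bins=10, v_max=20.0):
    """Expected next-step speed, using bin centres as representatives."""
    i = min(int(v_now / v_max * n_bins), n_bins - 1)
    centres = (np.arange(n_bins) + 0.5) * v_max / n_bins
    return float(T[i] @ centres)

# Illustrative low-speed delivery trace (m/s).
trace = [5.0, 5.5, 6.0, 6.5, 6.0, 5.5, 5.0, 4.5, 5.0, 5.5]
T = build_transition_matrix(trace)
v_pred = predict_next_speed(T, 5.0)
```

In the paper's multi-step setting this one-step prediction would be rolled forward over the MPC horizon; clustering the data by driving pattern before counting transitions is what the fuzzy C-means enhancement adds.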

    Stochastic model predictive control for energy management of power-split plug-in hybrid electric vehicles based on reinforcement learning

    In this paper, a stochastic model predictive control (MPC) method based on reinforcement learning is proposed for energy management of plug-in hybrid electric vehicles (PHEVs). First, the power transfer of each component in a power-split PHEV is described in detail. Then an effective, convergent reinforcement learning controller is trained with the Q-learning algorithm on the driving power distribution over multiple driving cycles. By constructing a multi-step Markov velocity prediction model, the reinforcement learning controller is embedded into the stochastic MPC controller to determine the optimal battery power over the prediction horizon. Numerical simulation results verify that the proposed method achieves fuel economy close to that of the stochastic dynamic programming method. In addition, effective state-of-charge tracking against different reference trajectories highlights that the proposed method is suitable for online applications requiring fast computation.
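The Q-learning component can be sketched in its simplest tabular form. The states and actions below are illustrative placeholders (a discretised state and three battery-power levels with a toy cost that favours the middle level), not the paper's power-split formulation; the point is the standard temporal-difference update that such a controller is trained with.

```python
import numpy as np

n_states, n_actions = 5, 3
rng = np.random.default_rng(0)
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.95, 0.1      # learning rate, discount, exploration

def step(s, a):
    """Toy environment: reward penalises deviation from the middle power level."""
    r = -abs(a - 1)                     # stand-in for a fuel-cost penalty
    s_next = (s + a) % n_states
    return s_next, r

s = 0
for _ in range(2000):
    # epsilon-greedy action selection
    a = int(rng.integers(n_actions)) if rng.random() < eps else int(np.argmax(Q[s]))
    s_next, r = step(s, a)
    # standard Q-learning temporal-difference update
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
    s = s_next

a_greedy = int(np.argmax(Q[0]))         # learned greedy action in state 0
```

In the paper, the converged Q-table plays the role of a fast pre-trained controller that the stochastic MPC queries online, rather than being used directly as the policy.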

    Integrated Thermal and Energy Management of Connected Hybrid Electric Vehicles Using Deep Reinforcement Learning

    The climate-adaptive energy management system holds promising potential for harnessing the concealed energy-saving capability of connected plug-in hybrid electric vehicles (PHEVs). This research explores the synergistic effects of artificial-intelligence control and traffic preview to enhance the performance of the energy management system (EMS). A high-fidelity model of a multi-mode connected PHEV is calibrated with experimental data as a foundation. Subsequently, a model-free multistate deep reinforcement learning (DRL) algorithm is proposed to develop the integrated thermal and energy management (ITEM) system, incorporating engine smart warm-up and engine-assisted heating for cold climate conditions. The optimality and adaptability of the proposed system are evaluated through both offline tests and online hardware-in-the-loop tests, covering a homologation driving cycle and a real-world driving cycle in China with real-time traffic data. The results demonstrate that ITEM achieves fuel-economy performance close to dynamic programming, reaching 93.7% of the DP benchmark, while reducing fuel consumption by 2.2% to 9.6% as ambient temperature decreases from 15°C to -15°C in comparison to state-of-the-art DRL-based EMS solutions.
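One way such an integrated thermal and energy objective is typically encoded for a DRL agent is a scalar reward that jointly penalises fuel use, coolant-temperature deviation, and battery state-of-charge drift, so cold-start heating and propulsion are optimised together. The function below is a hypothetical illustration of that idea; the weights, the temperature target, and the signal names are assumptions, not the paper's reward.

```python
def item_reward(fuel_g_per_s, coolant_temp_c, soc,
                temp_target_c=80.0, soc_ref=0.5,
                w_fuel=1.0, w_temp=0.01, w_soc=0.5):
    """Negative cost: fuel + thermal deviation + SoC-reference deviation."""
    return -(w_fuel * fuel_g_per_s
             + w_temp * abs(coolant_temp_c - temp_target_c)
             + w_soc * (soc - soc_ref) ** 2)

# A warm engine at the SoC reference is penalised only for fuel use;
# the same operating point at -15 degC ambient scores strictly worse,
# pushing the agent toward smart warm-up behaviour.
r_warm = item_reward(fuel_g_per_s=1.2, coolant_temp_c=80.0, soc=0.5)
r_cold = item_reward(fuel_g_per_s=1.2, coolant_temp_c=-15.0, soc=0.5)
```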

    Bi-directional coordination of plug-in electric vehicles with economic model predictive control

    © 2017 by the authors. Licensee MDPI, Basel, Switzerland. The emergence of plug-in electric vehicles (PEVs) is unveiling new opportunities to decarbonise the vehicle parc and promote sustainability in different parts of the globe. As battery technologies and PEV efficiency continue to improve, the use of electric cars as distributed energy resources is fast becoming a reality. While the distribution network operators (DNOs) strive to ensure grid balancing and reliability, the PEV owners primarily aim to maximise their economic benefits. However, given that PEV batteries have limited capacity and the distribution network is constrained, smart techniques are required to coordinate the charging/discharging of the PEVs. Using the economic model predictive control (EMPC) technique, this paper proposes a decentralised optimisation algorithm for PEVs during grid-to-vehicle (G2V) and vehicle-to-grid (V2G) operations. To capture the operational dynamics of the batteries, it treats the state-of-charge (SoC) at a given time as a discrete state space and investigates PEV performance in V2G and G2V operations. In particular, this study exploits the variability of the energy tariff across different periods of the day to schedule V2G/G2V cycles, using real data from the university's PEV infrastructure. The results show that by charging/discharging the vehicles during optimal time partitions, prosumers can take advantage of the price elasticity of supply to achieve net savings of about 63%.
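The tariff-exploiting idea can be sketched with a toy open-loop schedule: over a price horizon, charge in the cheapest periods and discharge in the dearest ones, subject to SoC bounds. A real EMPC re-solves such a problem at every step over a receding horizon with battery dynamics and network constraints; the prices, SoC limits, and per-period energy step here are illustrative assumptions.

```python
def schedule_v2g(prices, soc0=0.5, soc_min=0.2, soc_max=0.9, step=0.1):
    """Return a per-period plan: +step (G2V charge), -step (V2G discharge), or 0."""
    order = sorted(range(len(prices)), key=lambda i: prices[i])
    n = len(prices) // 3
    cheap, dear = set(order[:n]), set(order[-n:])
    plan, soc = [], soc0
    for i in range(len(prices)):
        if i in cheap and soc + step <= soc_max:
            soc += step
            plan.append(step)           # charge while energy is cheap
        elif i in dear and soc - step >= soc_min:
            soc -= step
            plan.append(-step)          # sell back while energy is dear
        else:
            plan.append(0.0)
    return plan, soc

# Illustrative day-part tariff (currency units per kWh).
prices = [0.08, 0.07, 0.09, 0.20, 0.25, 0.22, 0.10, 0.12]
plan, soc_end = schedule_v2g(prices)
```

The vehicle buys in the two cheapest periods and sells in the two most expensive ones, returning to its initial SoC; the prosumer's saving is the price spread times the energy shifted.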

    A Novel Learning Based Model Predictive Control Strategy for Plug-in Hybrid Electric Vehicle

    The multi-source electromechanical coupling renders energy management of plug-in hybrid electric vehicles (PHEVs) highly nonlinear and complex. Furthermore, the complicated nonlinear management process depends heavily on knowledge of the driving conditions, which hinders the instantaneous, efficient application of control strategies and poses major challenges for improving the energy savings of PHEVs. To address these issues, a novel learning-based model predictive control (LMPC) strategy with enhanced real-time optimal control performance is developed for a series-parallel PHEV. Rather than the velocity-prediction-based MPC methods favored in the literature, an original reference-tracking-based MPC solution with strong real-time applicability is proposed. To guarantee optimal control performance, an online learning process is implemented in the MPC via a Gaussian process (GP) model to address uncertainties in state estimation. The tracking reference for the LMPC-based control problem is obtained by a microscopic traffic flow analysis (MTFA) method. Simulation results validate that the proposed method can optimally manage the energy flow among the vehicle's power sources in real time, highlighting its preferable performance.
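The reference-tracking MPC idea above, stripped of the Gaussian-process learning layer, reduces to a receding-horizon search: at each step, choose the battery power that minimises a quadratic penalty on predicted SoC deviation from the reference trajectory. The linear SoC-depletion model, candidate power set, and gains below are illustrative assumptions, not the paper's plant model.

```python
def soc_tracking_mpc(soc, ref, horizon=5, powers=(-10.0, 0.0, 10.0),
                     k=0.001):
    """Pick the constant battery power (kW) minimising tracking cost
    over the horizon, by brute-force enumeration of candidates."""
    best_p, best_cost = 0.0, float("inf")
    for p in powers:
        s, cost = soc, 0.0
        for t in range(horizon):
            s = s - k * p               # toy linear SoC depletion model
            cost += (s - ref[min(t, len(ref) - 1)]) ** 2
        if cost < best_cost:
            best_p, best_cost = p, cost
    return best_p

# SoC above the reference -> discharge; below it -> charge.
p_high = soc_tracking_mpc(soc=0.60, ref=[0.55] * 5)
p_low = soc_tracking_mpc(soc=0.50, ref=[0.55] * 5)
```

In the full LMPC scheme, the GP model would correct this nominal prediction online from observed state-estimation errors, and the reference itself would come from the microscopic traffic flow analysis rather than being fixed.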