
    Neural Network-Based Li-Ion Battery Aging Model at Accelerated C-Rate

    Lithium-ion (Li-ion) batteries are widely used in electric vehicles (EVs) because of their high energy density, low self-discharge, and superior performance. Even so, the performance and reliability of Li-ion batteries become critical as they lose capacity with increasing charge and discharge cycles. Moreover, Li-ion batteries in EVs are subject to aging due to load variations during discharge. Monitoring battery cycle life at various discharge rates would enable the battery management system (BMS) to adjust control parameters and mitigate the aging issue. In this paper, a battery lifetime degradation model is proposed at an accelerated current rate (C-rate), and an ideal lifetime discharge rate within and beyond the standard C-rate is identified. The effect of discharging at an accelerated C-rate on battery cycle life is thoroughly investigated. The degradation model is built with two deep learning approaches: a feed-forward neural network (FNN) and a recurrent neural network (RNN) with a long short-term memory (LSTM) layer. A comparative assessment of the developed models shows that the LSTM-RNN battery aging model outperforms the traditional FNN at accelerated C-rates.
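    As a rough intuition for the trend this abstract describes, capacity fade versus cycle count at different C-rates can be sketched with a simple empirical power-law model; the constants below are illustrative assumptions, not parameters fitted to the paper's data or its neural networks.

    ```python
    def remaining_capacity(cycle: int, c_rate: float,
                           k: float = 4e-4, beta: float = 1.2,
                           alpha: float = 0.5) -> float:
        """Illustrative empirical fade model: fraction of nominal capacity
        left after `cycle` cycles at discharge rate `c_rate`.
        All constants (k, beta, alpha) are assumed for the sketch."""
        fade = k * (c_rate ** beta) * (cycle ** alpha)
        return max(0.0, 1.0 - fade)

    # A higher C-rate accelerates capacity loss over the same cycle count.
    print(remaining_capacity(500, 1.0))  # ~0.991
    print(remaining_capacity(500, 3.0))  # ~0.967
    ```

    A data-driven model such as the paper's LSTM-RNN would learn this cycle/C-rate-to-capacity mapping from measurements instead of assuming a closed form.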

    A critical review of improved deep learning methods for the remaining useful life prediction of lithium-ion batteries

    As a widely used form of secondary energy storage, lithium-ion batteries have become the core component of many power supply systems, and accurate remaining useful life prediction is key to ensuring their reliability. Because of the complex operating characteristics of lithium-ion batteries, and because model parameters change as the cells age, the accuracy of online remaining useful life prediction is difficult, yet urgent, to improve for reliable power supply applications. Deep learning algorithms improve prediction accuracy while reducing the characteristic-testing time required, offering the possibility of improving the power profitability of predictive energy management. This article analyzes, reviews, classifies, and compares adaptive mathematical models based on deep learning algorithms for remaining useful life prediction. Features are identified for their modeling ability, according to which the adaptive prediction methods are classified. Specific criteria are defined to evaluate the modeling accuracy of the different deep learning calculation procedures. The key features of effective life prediction are used to draw conclusions and offer suggestions, among which the high-accuracy deep convolutional neural network with extreme learning machine algorithm is identified as suitable for stable remaining useful life prediction of lithium-ion batteries.

    Data-Driven Methods for the State of Charge Estimation of Lithium-Ion Batteries: An Overview

    In recent years, there has been a noticeable shift towards electric mobility and an increasing emphasis on integrating renewable energy sources. Consequently, batteries and their management have become prominent in this context. A vital aspect of the battery management system (BMS) is accurately determining the battery pack's state of charge (SOC). Notably, the advent of advanced microcontrollers and the availability of extensive datasets have contributed to the growing popularity and practicality of data-driven methodologies. This study examines developments in SOC estimation over the past half-decade, focusing explicitly on data-driven estimation techniques. It comprehensively assesses the performance of each algorithm, considering the type of battery and various operational conditions. Additionally, details of the models' hyperparameters, including the number of layers, type of optimiser, and number of neurons, are provided for thorough examination. Most of the models analysed demonstrate strong performance, with both the mean absolute error (MAE) and root-mean-square error (RMSE) of the SOC estimate hovering around 2% or lower.
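    The MAE and RMSE figures used to compare SOC estimators in the overview are standard error metrics and can be computed directly; the SOC values below are hypothetical, purely to show the calculation.

    ```python
    import math

    def mae(y_true, y_pred):
        """Mean absolute error between true and estimated SOC (both in %)."""
        return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

    def rmse(y_true, y_pred):
        """Root-mean-square error between true and estimated SOC (both in %)."""
        return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

    true_soc = [80.0, 60.0, 40.0, 20.0]
    est_soc = [81.5, 58.0, 41.0, 21.0]  # hypothetical model output
    print(mae(true_soc, est_soc))   # 1.375
    print(rmse(true_soc, est_soc))  # ~1.436
    ```

    Both values fall under the roughly 2% SOC error that the surveyed models achieve; RMSE penalises large individual deviations more heavily than MAE.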

    Machine learning approach to investigate EV battery characteristics

    The main factor influencing an electric vehicle's range is its battery. Battery electric vehicles experience a reduction in driving range at low temperatures. This reduction results from the cabin heating demand and from recuperation limits in the braking system. Lacking an internal combustion engine as a heat source, an electric vehicle's heating system demands a significant amount of energy; this energy is supplied by the battery and reduces driving range. Moreover, because the battery is cold in winter conditions, charging through recuperation is limited by the low reaction rate at low temperatures. Technology development for battery electric vehicles is mostly focused on maintaining the battery pack's temperature and state of charge. For battery management systems, state of charge and battery temperature estimates are important because they prevent overcharge, overdischarge, and thermal runaway. Estimating and controlling battery temperature and state of charge guarantees safety and also extends the vehicle's life cycle. This study analyzes the effects of ambient and battery temperature on heating-system energy demand and regenerative braking parameters. In addition, different machine learning methods for estimating battery temperature and state of charge are compared. The analysis is based on the BMW i3 winter trips dataset, which includes data for 38 different drive cycles. Results show that every 3 °C drop in ambient temperature produces a 1% increase in the heating energy share. Furthermore, machine learning methods such as LSTM and GRU are shown to successfully forecast battery temperature and state of charge.
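    The reported trend (roughly one percentage point more heating energy share per 3 °C drop in ambient temperature) can be sketched as a linear rule of thumb; the baseline temperature and baseline share below are assumed for illustration and are not taken from the paper.

    ```python
    def heating_energy_share(ambient_c: float,
                             baseline_c: float = 20.0,
                             baseline_share: float = 0.05) -> float:
        """Estimated fraction of battery energy spent on cabin heating.
        Applies the abstract's trend (+1 percentage point per 3 degC of
        ambient drop) below an assumed baseline temperature and share."""
        drop = max(0.0, baseline_c - ambient_c)
        return baseline_share + 0.01 * (drop / 3.0)

    print(heating_energy_share(20.0))  # 0.05 (assumed baseline)
    print(heating_energy_share(-1.0))  # 0.12: 21 degC colder -> +7 pp
    ```

    A learned model such as the LSTM/GRU estimators compared in the study would replace this fixed linear rule with a mapping fitted to the drive-cycle data.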

    Reducing the computational cost for artificial intelligence-based battery state-of-health estimation in charging events

    Powertrain electrification is bound to pave the way for decarbonization and pollutant emission reduction in the automotive sector, so strong attention should be devoted to the electrical energy storage system. Within this framework, the lithium-ion battery plays a key role in the energy scenario, and the loss of lifetime due to cell degradation during use is a topical challenge. The aim of this work is to estimate the state of health (SOH) of lithium-ion battery cells with satisfactory accuracy and low computational cost, allowing the battery management system (BMS) to guarantee optimal operation and extended cell lifetime. Artificial intelligence (AI) algorithms have proved to be a promising data-driven modelling technique for cell SOH prediction owing to their suitability and low computational demand. An accurate on-board SOH estimate is achieved by identifying an optimal SOC window within the cell charging process. Several Bi-LSTM networks were trained through a random-search algorithm exploiting constant current-constant voltage (CCCV) test protocol data. Different analyses were performed and evaluated as a trade-off between prediction performance (in terms of RMSE and a customized accuracy metric) and computational burden (in terms of memory usage and elapsed time). Results reveal that the battery state of health can be predicted by a single-layer Bi-LSTM network with an error of 0.4% while monitoring just 40% of the entire charging process, the 60–100% SOC window corresponding to the constant-voltage (CV) phase. Finally, results show that the memory used for data logging and the processing time are cut by a factor of approximately 2.3.
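    The idea of restricting SOH estimation to a charging sub-window can be sketched by filtering logged charging samples to the 60–100% SOC range before they are fed to a model; the sample format and the synthetic charge log below are assumptions for illustration only.

    ```python
    def soc_window(samples, lo=0.60, hi=1.00):
        """Keep only the charging samples whose SOC falls in [lo, hi],
        e.g. the 60-100% window corresponding to the CV phase."""
        return [s for s in samples if lo <= s["soc"] <= hi]

    # Hypothetical charge log: SOC/voltage samples at equal intervals.
    log = [{"soc": s / 100, "voltage": 3.0 + s / 100} for s in range(0, 101, 5)]
    window = soc_window(log)
    # ~43% of samples remain, close to the ~40% of the charge the paper monitors.
    print(len(window), len(log))
    ```

    Logging and processing only this window is what yields the memory and runtime savings the abstract reports, since the Bi-LSTM never sees the earlier constant-current portion of the charge.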

    A critical review of online battery remaining useful lifetime prediction methods

    Lithium-ion batteries play an important role in our daily lives, and predicting their remaining service life has become an important issue. This article reviews methods for predicting the remaining service life of lithium-ion batteries from three perspectives: machine learning, adaptive filtering, and stochastic processes. The purpose of this study is to review, classify, and compare the different methods proposed in the literature. The article first summarizes and classifies the various prediction methods proposed in recent years; on this basis, specific criteria are selected to evaluate and compare the accuracy of the different models and to identify the most suitable method. Finally, the development of the various methods is summarized. According to this research, the average accuracy of machine learning methods is 32.02% higher than the average of the other two categories, and their prediction cycle is 9.87% shorter.

    Artificial Intelligence Opportunities to Diagnose Degradation Modes for Safety Operation in Lithium Batteries

    The degradation and safety of lithium-ion batteries are becoming increasingly important given that these batteries are widely used not only in electronic devices but also in automotive vehicles. Consequently, detecting degradation modes that could lead to safety alerts is essential. Existing methodologies are diverse: experiment-based, model-based, and, as a new trend, based on artificial intelligence (AI). This review analyzes and compares the existing methodologies, opening the spectrum to AI-based approaches. AI-based studies are increasing in number and cover a wide variety of applications, but no classification, in-depth analysis, or comparison with existing methodologies is yet available.