
    Model migration neural network for predicting battery aging trajectories

    Accurate prediction of batteries’ future degradation is key to relieving users’ anxiety about battery lifespan and electric vehicle driving range. Technical challenges arise from the highly nonlinear dynamics of battery aging. In this paper, a feed-forward migration neural network is proposed to predict battery aging trajectories. Specifically, a base model that describes the capacity decay over time is first established from an existing battery aging dataset. This base model is then transformed by an input-output slope-and-bias-correction (SBC) structure to capture the degradation of the target cell. To enhance the model’s nonlinear transfer capability, the SBC model is further integrated into a four-layer neural network and easily trained via the gradient correlation algorithm. The proposed migration neural network is experimentally verified with four different commercial batteries. The predicted RMSEs are all lower than 2.5% when using only the first 30% of the aging trajectories for neural network training. In addition, illustrative results demonstrate that a small feed-forward neural network (down to 1-5-5-1) is sufficient for battery aging trajectory prediction.
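
    The sketch below is a rough, hedged illustration of the slope-and-bias-correction idea from this abstract: a fixed base capacity-fade model is wrapped with learnable input/output slope and bias terms, and a small 1-5-5-1 feed-forward net adds a nonlinear correction, trained on the first 30% of a target trajectory. The base model, the synthetic target data, and all training settings are assumptions for illustration, not the authors' implementation.

```python
# Illustrative SBC-style migration sketch (not the authors' code).
import torch
import torch.nn as nn

def base_model(t: torch.Tensor) -> torch.Tensor:
    """Hypothetical base capacity-fade model fitted on an existing dataset."""
    return 1.0 - 0.15 * t                      # normalised capacity vs. normalised cycles

class SBCMigrationNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Learnable input/output slope-and-bias correction terms.
        self.k_in = nn.Parameter(torch.tensor(1.0))
        self.b_in = nn.Parameter(torch.tensor(0.0))
        self.k_out = nn.Parameter(torch.tensor(1.0))
        self.b_out = nn.Parameter(torch.tensor(0.0))
        # Small 1-5-5-1 network providing nonlinear transfer capability.
        self.corrector = nn.Sequential(
            nn.Linear(1, 5), nn.Tanh(),
            nn.Linear(5, 5), nn.Tanh(),
            nn.Linear(5, 1),
        )

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        migrated = self.k_out * base_model(self.k_in * t + self.b_in) + self.b_out
        return migrated + self.corrector(t)

# Train on the first 30% of a synthetic target-cell aging trajectory.
t_full = torch.linspace(0, 1, 200).unsqueeze(1)
q_full = 1.0 - 0.10 * t_full - 0.12 * t_full**2   # fictitious target trajectory
n_train = int(0.3 * len(t_full))

model = SBCMigrationNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(t_full[:n_train]), q_full[:n_train])
    loss.backward()
    opt.step()

rmse = torch.sqrt(nn.functional.mse_loss(model(t_full), q_full))
print(f"full-trajectory RMSE: {rmse.item():.4f}")
```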

    Hybrid Neural Networks for Enhanced Predictions of Remaining Useful Life in Lithium-Ion Batteries

    With the proliferation of electric vehicles (EVs) and the consequential increase in EV battery circulation, the need for accurate assessments of battery health and remaining useful life (RUL) is paramount, driven by environmentally friendly and sustainable goals. This study addresses this pressing concern by employing data-driven methods, specifically harnessing deep learning techniques to enhance RUL estimation for lithium-ion batteries (LIB). Leveraging the Toyota Research Institute Dataset, consisting of 124 lithium-ion batteries cycled to failure and encompassing key metrics such as capacity, temperature, resistance, and discharge time, our analysis substantially improves RUL prediction accuracy. Notably, the convolutional long short-term memory deep neural network (CLDNN) model and the transformer LSTM (temporal transformer) model have emerged as standout RUL predictors. The CLDNN model, in particular, achieved a remarkable mean absolute error (MAE) of 84.012 and a mean absolute percentage error (MAPE) of 25.676. Similarly, the temporal transformer model exhibited notable performance, with an MAE of 85.134 and a MAPE of 28.7932. These results were achieved by applying Bayesian hyperparameter optimization, further enhancing the accuracy of the predictive methods. The models were benchmarked against existing approaches, demonstrating superior results with an improvement in MAPE ranging from 4.01% to 7.12%.
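
    As a hedged illustration only, the following sketch shows one plausible CLDNN-style architecture (1-D convolutions feeding an LSTM and a dense regression head) for RUL prediction from per-cycle features such as capacity, temperature, resistance, and discharge time. The layer sizes, window length, and feature count are assumptions, not the configuration used in the study.

```python
# Plausible CLDNN-style RUL regressor (illustrative architecture only).
import torch
import torch.nn as nn

class CLDNN(nn.Module):
    def __init__(self, n_features: int = 4, hidden: int = 64):
        super().__init__()
        # 1-D convolutions over the cycle axis of per-cycle features.
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.lstm = nn.LSTM(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)     # predicted RUL in cycles

    def forward(self, x):                    # x: (batch, seq_len, n_features)
        z = self.conv(x.transpose(1, 2))     # (batch, 32, seq_len)
        out, _ = self.lstm(z.transpose(1, 2))
        return self.head(out[:, -1])         # regression from the last time step

model = CLDNN()
window = torch.randn(8, 100, 4)              # 8 cells, 100-cycle windows, 4 features
print(model(window).shape)                   # torch.Size([8, 1])
```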

    A critical review of online battery remaining useful lifetime prediction methods.

    Lithium-ion batteries play an important role in our daily lives, and predicting their remaining service life has become an important issue. This article reviews methods for predicting the remaining service life of lithium-ion batteries from three aspects: machine learning, adaptive filtering, and random processes. The purpose of this study is to review, classify, and compare the different methods proposed in the literature for predicting the remaining service life of lithium-ion batteries. The article first summarizes and classifies the various prediction methods proposed in recent years. On this basis, specific criteria are selected to evaluate and compare the accuracy of the different models and to identify the most suitable method. Finally, the development of the various methods is summarized. According to this research, the average accuracy of machine learning methods is 32.02% higher than the average of the other two categories, and their prediction cycle is 9.87% shorter than the average of the other two categories.

    Online Lifetime Prediction for Lithium-Ion Batteries with Cycle-by-Cycle Updates, Variance Reduction, and Model Ensembling

    This project was funded by an industry-academia grant EPSRC EP/R511687/1 awarded by the EPSRC & University of Edinburgh Impact Acceleration Account (IAA) programme. R. Ibraheem is a Ph.D. student in EPSRC’s MAC-MIGS Centre for Doctoral Training; MAC-MIGS is supported by the UK’s Engineering and Physical Sciences Research Council (grant number EP/S023291/1). G. dos Reis acknowledges support from the Faraday Institution [grant number FIRG049]. Publisher Copyright: © 2023 by the authors.
    Lithium-ion batteries have found applications in many parts of our daily lives. Predicting their remaining useful life (RUL) is thus essential for management and prognostics. Most approaches look at early-life prediction of RUL in the context of designing charging profiles or optimising cell design. While critical, said approaches are not directly applicable to the regular testing of cells used in applications. This article focuses on a class of models called ‘one-cycle’ models which are suitable for this task and characterized by versatility (in terms of online prediction frameworks and model combinations), prediction from limited input, and independence from the cells’ history. Our contribution is fourfold. First, we show the wider deployability of the so-called one-cycle model for a different type of battery data, thus confirming its wider scope of use. Second, reflecting on how prediction models can be leveraged within battery management cloud solutions, we propose a universal exponential-smoothing (e-forgetting) mechanism that leverages cycle-to-cycle prediction updates to reduce prediction variance. Third, we use this new model as a second-life assessment tool by proposing a knee region classifier. Last, using model ensembling, we build a “model of models”. We show that it outperforms each underpinning model (from in-cycle variability, cycle-to-cycle variability, and empirical models). This ensembling strategy allows coupling explainable and black-box methods, thus giving the user extra control over the final model.
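
    A minimal sketch of the exponential-smoothing (“e-forgetting”) idea described above, assuming a simple fixed smoothing factor: each new per-cycle RUL prediction is blended with the running estimate to damp cycle-to-cycle variance. The factor and the toy prediction stream are illustrative, not the paper's values.

```python
# Exponential smoothing of cycle-to-cycle RUL predictions (illustrative).
def smooth_rul(predictions, alpha=0.3):
    """Blend each per-cycle RUL prediction with the previous smoothed value."""
    smoothed = []
    current = None
    for p in predictions:
        current = p if current is None else alpha * p + (1 - alpha) * current
        smoothed.append(current)
    return smoothed

raw = [620, 655, 590, 640, 610, 600]   # noisy per-cycle RUL predictions (cycles)
print(smooth_rul(raw))                 # variance-reduced estimates, cycle by cycle
```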

    Model-free non-invasive health assessment for battery energy storage assets

    Increasing penetration of renewable energy generation in the modern power network introduces uncertainty about the energy available to maintain a balance between generation and demand, due to its time-fluctuating output that is strongly dependent on the weather. With the development of energy storage technology, there is the potential for this technology to become a key element to help overcome this intermittency in generation. However, the increasing penetration of battery energy storage within the power network introduces an additional challenge to asset owners on how to monitor and manage battery health. The accurate estimation of the health of these devices is crucial in determining their reliability, power-delivering capability and ability to contribute to the operation of the whole power system. Generally, doing this requires invasive measurements or computationally expensive physics-based models, which do not scale up cost-effectively to a fleet of assets. As storage aggregation becomes more commonplace, there is a need for a health metric that can predict battery health based only on the limited information available, eliminating the need to install extensive telemetry in the system. This work develops a solution to battery health prognostics by providing an alternative, non-invasive approach that estimates the extent to which a battery asset has been maloperated based only on the operating regime imposed on the device. The model introduced in this work is based on the Hidden Markov Model, which stochastically models the battery limitations imposed by its chemistry as a combination of present and previous sequential charging actions, and articulates the preferred operating regime as a measure of health consequence. The resulting methodology is demonstrated on distribution-network-level electrical demand and generation data, accurately predicting maloperation under a number of battery technology scenarios. The effectiveness of the proposed battery maloperation model as a proxy for actual battery degradation for lithium-ion technology was also tested against lab-tested battery degradation data, showing that the proposed health measure in terms of maloperation level reflected that measured in terms of capacity fade. The developed model can support condition monitoring and remaining useful life estimates, but in the wider context could also be used as the policy function in an automated scheduler to utilise assets while optimising their health.
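
    The following is a hedged toy sketch of the general idea of scoring a sequence of charging actions against a “preferred operating regime” Hidden Markov Model with the forward algorithm, flagging unusually low likelihood as possible maloperation. The states, observation alphabet, probabilities, and threshold are invented for illustration and are not the model developed in this work.

```python
# Toy HMM scoring of charging-action sequences (illustrative parameters only).
import numpy as np

# Hidden states: 0 = healthy regime, 1 = stressful regime.
A = np.array([[0.9, 0.1],          # state transition probabilities
              [0.3, 0.7]])
# Observed charging actions: 0 = gentle charge, 1 = fast charge, 2 = deep discharge.
B = np.array([[0.7, 0.2, 0.1],     # emission probabilities per hidden state
              [0.2, 0.4, 0.4]])
pi = np.array([0.8, 0.2])          # initial state distribution

def log_likelihood(obs):
    """Scaled forward algorithm: log P(observation sequence | model)."""
    alpha = pi * B[:, obs[0]]
    log_p = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        log_p += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return log_p

gentle = [0, 0, 1, 0, 0, 0, 1, 0]
harsh = [1, 2, 2, 1, 2, 1, 2, 2]
for name, seq in [("gentle", gentle), ("harsh", harsh)]:
    lp = log_likelihood(seq)
    flag = "possible maloperation" if lp < -8.0 else "within preferred regime"
    print(f"{name}: log-likelihood {lp:.2f} -> {flag}")
```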

    Machine Learning in Lithium-Ion Battery: Applications, Challenges, and Future Trends

    Machine Learning has garnered significant attention in lithium-ion battery research for its potential to revolutionize various aspects of the field. This paper explores the practical applications, challenges, and emerging trends of employing Machine Learning in lithium-ion battery research. It delves into specific Machine Learning techniques and their relevance, offering insights into their transformative potential. The applications of Machine Learning in lithium-ion battery design, manufacturing, service, and end-of-life are discussed. The challenges, including data availability, data preprocessing and cleaning, limited sample size, computational complexity, model generalization, the black-box nature of Machine Learning models, scalability of the algorithms for large datasets, data bias, and the interdisciplinary nature of the field, are also discussed along with their mitigations. By discussing future trends, the paper provides valuable insights for researchers in this field. For example, one future trend is to address the challenge of small datasets with techniques such as Transfer Learning and N-shot Learning. This paper not only contributes to our understanding of Machine Learning applications but also empowers professionals in this field to harness its capabilities effectively.
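
    As a small, hedged illustration of the transfer-learning mitigation for small datasets mentioned above (assumptions only, not a method from the paper): a feature extractor notionally pretrained on a large source battery dataset is frozen, and only a small head is fine-tuned on the limited target data.

```python
# Transfer-learning sketch: freeze transferred layers, fine-tune a small head.
import torch
import torch.nn as nn

source_extractor = nn.Sequential(       # assumed pretrained on a large source dataset
    nn.Linear(10, 64), nn.ReLU(),
    nn.Linear(64, 32), nn.ReLU(),
)
for p in source_extractor.parameters():
    p.requires_grad = False             # freeze the transferred layers

head = nn.Linear(32, 1)                 # fine-tuned on the small target dataset
model = nn.Sequential(source_extractor, head)
opt = torch.optim.Adam(head.parameters(), lr=1e-3)

x_small, y_small = torch.randn(20, 10), torch.randn(20, 1)   # tiny target dataset
for _ in range(100):
    opt.zero_grad()
    nn.functional.mse_loss(model(x_small), y_small).backward()
    opt.step()
```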

    A critical review of improved deep learning methods for the remaining useful life prediction of lithium-ion batteries.

    Widely used for secondary energy storage, lithium-ion batteries have become a core component of power supply systems, and accurate remaining useful life prediction is key to ensuring their reliability. Because of the complex working characteristics of lithium-ion batteries and the change in model parameters over the aging process, the accuracy of online remaining useful life prediction is difficult, yet urgent, to improve for reliable power supply applications. Deep learning algorithms improve the accuracy of remaining useful life prediction and also reduce the characteristic testing time required, offering the possibility of improving the profitability of predictive energy management. This article analyzes, reviews, classifies, and compares different adaptive mathematical models based on deep learning algorithms for remaining useful life prediction. Features are identified according to modeling ability, and the adaptive prediction methods are classified accordingly. Specific criteria are defined to evaluate the modeling accuracy of the different deep learning calculation procedures. The key features of effective life prediction are used to draw relevant conclusions and provide suggestions, among which the high-accuracy deep convolutional neural network combined with the extreme learning machine is selected for stable remaining useful life prediction of lithium-ion batteries.
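
    The sketch below illustrates, under stated assumptions, the deep convolutional neural network plus extreme learning machine combination named above: convolutional features feed an ELM whose hidden weights are random and whose output weights are solved in closed form. All shapes, the random projection size, and the synthetic data are placeholders, not the reviewed algorithm's configuration.

```python
# DCNN + ELM sketch: conv features, random ELM hidden layer, closed-form readout.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
conv = nn.Sequential(                        # stand-in convolutional feature extractor
    nn.Conv1d(1, 8, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(16), nn.Flatten(),
)

def conv_features(x):                        # x: (batch, seq_len) capacity windows
    with torch.no_grad():
        return conv(x.unsqueeze(1)).numpy()  # (batch, 8 * 16)

# Synthetic stand-in data: 64 capacity windows of 50 cycles and their RUL labels.
X = torch.randn(64, 50)
y = np.random.rand(64, 1) * 1000

H_in = conv_features(X)                      # (64, 128) convolutional features
rng = np.random.default_rng(0)
W = rng.standard_normal((H_in.shape[1], 200))  # fixed random ELM input weights
b = rng.standard_normal(200)
H = np.tanh(H_in @ W + b)                    # ELM hidden-layer activations
beta = np.linalg.pinv(H) @ y                 # output weights via least squares
rul_pred = H @ beta                          # predicted RUL for the training windows
```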
