13 research outputs found

    A new consideration for validating battery performance at low ambient temperatures

    Existing validation methods for equivalent circuit models (ECMs) do not capture the effects of operating lithium-ion cells over legislative drive cycles at low ambient temperatures. Unrealistic validation of an ECM often leads to reduced accuracy in electric vehicle range estimation. In this study, current and power are used to illustrate the different approaches for validating ECMs when operating at low ambient temperatures (−15 °C to 25 °C). It was found that employing a current-based approach leads to under-testing of the performance of lithium-ion cells for various legislative drive cycles (NEDC, FTP75, US06, WLTP-3) compared to the actual vehicle. In terms of energy demands, the shortfall can be as much as ~21% for more aggressive drive cycles and ~15% even for more conservative drive cycles. In terms of peak power demands, it ranges from ~27% for more conservative drive cycles to ~35% for more aggressive drive cycles. The research findings reported in this paper suggest that it is better to use a power-based approach (with dynamic voltage) rather than a current-based approach (with fixed voltage) to characterise and model the performance of lithium-ion cells for automotive applications, especially at low ambient temperatures. This evidence should help rationalise the approaches in a model-based design process, leading to potential improvements in real-world applications for lithium-ion cells.
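
    As an editorial illustration of the distinction drawn above, the Python sketch below converts a drive-cycle power demand into a cell current demand using a fixed nominal voltage (current-based) and using the dynamic terminal voltage (power-based). All values and variable names are illustrative placeholders, not data from the paper; the point is only that a sagging terminal voltage at low temperature makes the current-based profile under-test the cell.

        # Minimal sketch: converting a drive-cycle power demand into a cell current demand.
        # All numbers and names are illustrative placeholders, not taken from the paper.
        import numpy as np

        power_demand_w = np.array([50.0, 120.0, 200.0, 150.0, 80.0])  # per-cell power trace (W)
        v_nominal = 3.6                                               # fixed nominal voltage (V)
        v_measured = np.array([3.55, 3.40, 3.20, 3.30, 3.45])         # dynamic terminal voltage (V), sagging at low temperature

        # Current-based approach: current profile fixed in advance using the nominal voltage.
        i_current_based = power_demand_w / v_nominal

        # Power-based approach: current responds to the actual (sagging) terminal voltage,
        # so the cell must supply more current to meet the same power demand.
        i_power_based = power_demand_w / v_measured

        print("current-based (A):", np.round(i_current_based, 1))
        print("power-based   (A):", np.round(i_power_based, 1))
        print("under-test    (%):", np.round(100 * (i_power_based - i_current_based) / i_power_based, 1))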

    Electric vehicle battery performance investigation based on real world current harmonics

    Electric vehicle (EV) powertrains consist of power electronic components as well as electric machines to manage the energy flow between different powertrain subsystems and to deliver the necessary torque and power requirements at the wheels. These power subsystems can generate undesired electrical harmonics on the direct current (DC) bus of the powertrain. This may lead to the on-board battery being subjected to DC current superposed with undesirable high- and low-frequency current oscillations, known as ripple. From real-world measurements, significant current harmonic perturbations within the range of 50 Hz to 4 kHz have been observed on the high voltage DC bus of the EV. A limited number of studies have investigated the impact of these harmonics on the degradation of battery systems. In those studies, the battery systems were supplied with superposed current signals, i.e., DC superposed with a single-frequency alternating current (AC). None of them considered applying the entire spectrum of ripple current measured in the real-world scenario, which is the focus of this research. The preliminary results indicate that there is no difference in capacity fade or impedance rise between the cells subjected to just DC current and those subjected additionally to a superposed AC ripple current.
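
    A minimal sketch, with placeholder amplitudes, of the two loading schemes described above: a DC current superposed with a single-frequency AC ripple (as in earlier studies) versus a DC current superposed with a multi-frequency ripple spectrum spanning 50 Hz to 4 kHz. The sample rate, harmonic frequencies and amplitudes are assumptions for illustration only.

        # Minimal sketch: building a DC test current superposed with AC ripple.
        # All frequencies and amplitudes are illustrative placeholders.
        import numpy as np

        fs = 20_000                        # sample rate (Hz), assumed
        t = np.arange(0.0, 1.0, 1.0 / fs)  # 1 s of signal
        i_dc = 10.0                        # DC load current (A)

        # Earlier studies: DC superposed with a single-frequency AC ripple.
        i_single = i_dc + 1.0 * np.sin(2 * np.pi * 1000.0 * t)

        # This work: DC superposed with a spectrum of harmonics between 50 Hz and 4 kHz,
        # with amplitudes that would come from the real-world measurement (placeholders here).
        harmonics_hz = [50.0, 150.0, 1000.0, 2500.0, 4000.0]
        amplitudes_a = [0.6, 0.4, 0.8, 0.3, 0.2]
        i_spectrum = i_dc + sum(a * np.sin(2 * np.pi * f * t)
                                for f, a in zip(harmonics_hz, amplitudes_a))

        print("single-frequency RMS ripple (A):", round(float(np.std(i_single)), 3))
        print("full-spectrum RMS ripple (A):   ", round(float(np.std(i_spectrum)), 3))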

    Improving accessible capacity tracking at low ambient temperatures for range estimation of battery electric vehicles

    Today’s market-leading electric vehicles, driven on typical UK motorways, have a real-world range estimation inaccuracy of up to 27% at around 10 °C outside temperature. The inaccuracy worsens for city driving or lower outside temperatures. The reliability of range estimation largely depends on the accuracy of the battery’s underlying state estimators, e.g., state-of-charge and state-of-energy. This is affected by the accuracy of the models embedded in the battery management system. The performance of these models fundamentally depends on experimentally obtained parameterisation and validation data. These experiments are mostly performed within thermal chambers, which maintain pre-set temperatures using forced air convection. Although these setups claim to maintain isothermal test conditions, they rarely do so. In this paper, we show that this is potentially the root cause of the deterioration of range estimation at low temperatures. This is because, while such setups produce results comparable to isothermal conditions at higher temperatures (25 °C), they fail to achieve isothermal conditions at sub-zero temperatures. Employing an immersed oil-cooled experimental setup, which can create close-to-isothermal conditions, we show that battery state estimation can be improved, reducing the error from 49.3% to 11.7% at −15 °C. These findings provide a way forward towards improving range estimation in cold weather conditions.

    Low temperature performance of Lithium-ion batteries for different drive cycles

    Lithium-ion batteries, suitable for battery electric vehicles (BEVs) due to their high energy and power densities and long lifetime, demonstrate a deterioration in available energy and power at lower temperatures. This is attributed to a reduction in capacity and an increase in internal resistance. Investigations are carried out to determine the energy and power decline for four drive cycles: FTP, NEDC, UDDS and US06. The minimum temperatures at which the battery meets the drive cycles’ energy and power requirements are determined. The impact of regenerative braking and self-heating on battery performance is discussed. The minimum temperature at which a drive cycle is met by the battery increases with the cycle’s aggressiveness.
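
    The sketch below illustrates the kind of check described above: comparing a drive cycle's energy and peak power requirements against what a cell can deliver at a given temperature, and finding the minimum temperature at which the cycle is met. The derating curves and cycle demands are invented placeholders, not values from the paper.

        # Minimal sketch: minimum temperature at which a drive cycle's energy and power
        # demands are met. Capability curves and demands are illustrative placeholders.
        import numpy as np

        def meets_drive_cycle(cycle_energy_wh, cycle_peak_w, temp_c):
            # Illustrative derating: available energy and power fall as temperature drops.
            energy_available = 75.0 * np.interp(temp_c, [-20, 0, 25], [0.55, 0.80, 1.00])
            power_available = 300.0 * np.interp(temp_c, [-20, 0, 25], [0.40, 0.70, 1.00])
            return energy_available >= cycle_energy_wh and power_available >= cycle_peak_w

        # More aggressive cycles (e.g. US06) demand more energy and power, so the
        # minimum temperature at which they are met is higher.
        for name, e_wh, p_w in [("NEDC", 45, 120), ("US06", 65, 260)]:
            t_min = next(t for t in range(-20, 26) if meets_drive_cycle(e_wh, p_w, t))
            print(name, "met from", t_min, "degC")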

    Distributed thermal monitoring of lithium ion batteries with optical fibre sensors

    Real-time temperature monitoring of Li-ion batteries is widely regarded, within both the academic literature and the industrial community, as a fundamental requirement for the reliable and safe operation of battery systems. This is particularly evident for the larger format pouch cells employed in many automotive or grid storage applications. Traditional methods of temperature measurement, such as individual sensors mounted at discrete locations on the surface of the cell, may yield incomplete information. In this study, a novel Rayleigh-scattering-based optical fibre sensing technology is proposed and demonstrated to deliver a distributed, real-time and accurate measure of temperature that is suitable for use with Li-ion pouch cells. The thermal behaviour of an A5-size pouch cell is experimentally investigated over a wide range of ambient temperatures and electrical load currents, during both charge and discharge. A distributed fibre optical sensor (DFOS) is used to measure both the in-plane temperature difference across the cell surface, taken between different measurement points, and the movement of the hottest region of the cell during operation. Significantly, the DFOS results highlight that the maximum in-plane temperature difference was found to be up to 307% higher than that measured using a traditional thermocouple approach.
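
    A small sketch of the comparison described above, under assumed readings: the in-plane temperature difference and hotspot location recovered from a distributed set of fibre measurement points versus the single value reported by one thermocouple. All temperatures are illustrative only.

        # Minimal sketch: distributed temperature reading vs a single thermocouple.
        # All readings are invented placeholders.
        import numpy as np

        # Distributed fibre: many temperature points along the cell surface (degC).
        fibre_temp = np.array([24.8, 25.3, 26.9, 29.4, 31.2, 30.1, 27.5, 25.9])

        # Traditional approach: one thermocouple at a fixed location on the surface.
        thermocouple_temp = 26.9

        dT_fibre = fibre_temp.max() - fibre_temp.min()   # in-plane temperature difference
        hotspot_index = int(np.argmax(fibre_temp))       # where the hottest region sits

        print(f"distributed dT = {dT_fibre:.1f} degC, hotspot at point {hotspot_index}")
        print(f"single thermocouple reads {thermocouple_temp:.1f} degC and misses the hotspot")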

    Experimental study of sidewall rupture of cylindrical lithium-ion batteries under radial nail penetration

    To understand the relationship between sidewall rupture and state of charge (SOC) for cylindrical cells with high specific energy, this work presents the results of radial nail penetration tests of 21700-format cylindrical cells at different SOCs. The thermal runaway and sidewall rupture behaviours were characterised by key performance indicators such as temperature, mass, fire behaviour, and voltage change. In addition, the gases released in a subset of tests were measured using Fourier transform infrared spectroscopy. The change in the internal structure of another subset of cells after the test was observed by X-ray computed tomography. The results show that sidewall rupture still occurs in tests at low SOC (< 30% SOC), but the outcomes of thermal runaway and sidewall rupture are milder than those at high SOC (≥ 50% SOC). The average mass loss of the cells increases with SOC. The cell casing thickness is reduced by 12.7% ± 0.3% relative to that of a fresh cell, which, in combination with the reduction in the strength of the casing material at high temperatures, could contribute to sidewall rupture.

    A novel methodology to parameterise lithium-ion cell models for low temperature applications

    Electrification of road transportation is widely recognised as a necessary solution to reduce global warming. However, the mass market adoption of electric vehicles (EVs) has been hindered by reduced battery performance in cold weather conditions, leading to warranty limitations coupled with inaccurate range estimation that exacerbates customer range anxiety. Today’s market-leading EVs driven on typical UK motorways have a range estimation error of up to 27% at an ambient temperature of 10 °C, and due to slower battery kinetics it worsens to 45% at −15 °C. The range estimation accuracy depends upon the performance of models embedded in the Battery Management System (BMS), which estimates battery states (viz. State-of-Charge (SOC) and State-of-Energy (SOE)). The performance of the models fundamentally depends upon experimentally obtained parameters at different operating temperatures and currents, and upon validation exercises against legislative drive cycles. The experiments are usually performed in isothermal conditions by using state-of-the-art climatic chambers that maintain a pre-set temperature by forced air convection. Unfortunately, isothermal conditions are not adhered to, as the battery operating temperature deviates significantly from the predefined chamber temperature, especially when battery characterisation is undertaken at low ambient temperatures (≤10 °C). The aim of this thesis is to propose a novel experimental methodology and alternative modelling approaches to improve the range estimation accuracy of EVs at low ambient temperatures by addressing these shortcomings in the existing characterisation methodology. A novel experimental methodology is developed to ensure isothermal conditions using immersed oil baths, which provides more accurate usable capacity and energy characteristics of lithium-ion cells, especially at low temperatures, by eliminating the effect of rapid heat generation during battery operation. For the first time, it is shown that model parameterisation using oil-based rather than air-based experiments leads to more accurate estimation of battery states (SOC and SOE). The findings in this thesis suggest that the absolute SOC error is reduced from 13.5% to 5.1% and the absolute SOE error is reduced from 20.6% to 4.3% at −15 °C ambient temperature. A detailed study of heat generation using a battery model utilising the polarisation currents demonstrated improved modelled surface temperature and terminal voltage estimates. These results, along with accurate parameterisation data, enabled better estimation of the battery states and terminal voltage at low ambient temperatures. A power control approach to battery characterisation ensures that the operating current responds dynamically to the changing cell voltage. A comparison based on energy throughput and peak power demand at low temperatures showed that power control is more representative of real-world applications than current control. Therefore, it is recommended that power control be employed as the primary approach for obtaining validation data for cell models. The work demonstrates and provides insight into new aspects of improving the range estimation accuracy of EVs operating under cold weather conditions. Advances from this work enable increased adherence to the rigid environmental conditions required by global lithium-ion battery testing standards and battery modelling, leading to the increased uptake of EVs.
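
    A minimal sketch, under assumed parameter values, of one element mentioned above: computing irreversible heat generation in a two-RC equivalent circuit model from the polarisation (RC-branch) currents as well as the terminal current. The resistances, capacitances and load are placeholders, not the thesis' parameterisation.

        # Minimal sketch: heat generation in a 2-RC equivalent circuit model using the
        # polarisation (RC-branch) currents rather than the terminal current alone.
        # Parameter values are placeholders.
        import numpy as np

        R0, R1, C1, R2, C2 = 2e-3, 1.5e-3, 2e3, 3e-3, 2e4   # ohm, ohm, F, ohm, F
        dt = 1.0                                            # time step (s)
        current = np.full(600, 50.0)                        # 50 A discharge for 10 min

        v1 = v2 = 0.0                                       # RC branch voltages (V)
        heat_w = []
        for i_cell in current:
            v1 += dt * (i_cell / C1 - v1 / (R1 * C1))       # branch dynamics (explicit Euler)
            v2 += dt * (i_cell / C2 - v2 / (R2 * C2))
            i_p1, i_p2 = v1 / R1, v2 / R2                   # polarisation currents (A)
            q = i_cell**2 * R0 + i_p1**2 * R1 + i_p2**2 * R2  # irreversible heat (W)
            heat_w.append(q)

        print(f"mean heat generation ~ {np.mean(heat_w):.2f} W")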

    State-of-charge estimation algorithms and their implications on cells in parallel

    State-of-Charge (SOC) of a battery is one of the most important parameters used by a Battery Management System (BMS), and it is subsequently very important for the efficient functioning of an electric/hybrid vehicle. Since the SOC is a derived quantity, it is important that the algorithm on which the estimator is based is robust and accurate. The algorithms developed thus far, for different battery chemistries, depend mainly on the current and voltage data received from individual cells or the entire battery. An accurate SOC estimation algorithm could be key to increasing the efficiency of a typical hybrid-electric vehicle and hence its real-world applicability. This accuracy should be relevant to both normal and failure cases associated with any drive cycle. It in turn depends on the current and voltage signals received, the positioning of such sensors, etc. Also, as a battery comprises a combination of series and parallel strings, it is important that the difference between the SOC of an individual cell and that of the entire battery is negligible. The objective of this paper is to compare three SOC estimation algorithms: the current-based Coulomb-counter approach; the voltage-dependent model-based approach; and a mixed algorithm that exploits the complementary behaviour of the other two methods. In doing so, the positioning of current and voltage sensors and the reliability of the data, i.e. cell SOC and pack SOC, have been considered. Models created using MATLAB/Simulink, based on the literature, have been used. It is seen that the SOC estimation algorithm based on both current and voltage data is the most accurate across normal and failure cases, including short-circuit, open-circuit, etc.
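
    A Python sketch (not the paper's MATLAB/Simulink models) of the three estimator families compared above: Coulomb counting from current, a voltage/model-based estimate via an OCV-SOC map, and a mixed estimator that corrects the Coulomb counter with the voltage-based value. The OCV-SOC map, internal resistance and blending gain are illustrative assumptions.

        # Minimal sketch of the three SOC estimation approaches. All values are placeholders.
        import numpy as np

        capacity_ah = 40.0
        ocv_soc = ([3.0, 3.3, 3.5, 3.7, 3.9, 4.2], [0.0, 0.2, 0.4, 0.6, 0.8, 1.0])

        def soc_from_voltage(v_terminal, i_amps, r_internal=2e-3):
            ocv = v_terminal + i_amps * r_internal          # back out open-circuit voltage
            return float(np.interp(ocv, *ocv_soc))          # voltage/model-based SOC

        def mixed_soc(soc_prev, i_amps, v_terminal, dt_s, gain=0.05):
            soc_cc = soc_prev - i_amps * dt_s / (3600 * capacity_ah)   # Coulomb counting
            soc_v = soc_from_voltage(v_terminal, i_amps)               # voltage-based
            return soc_cc + gain * (soc_v - soc_cc)                    # blended estimate

        soc = 0.8
        for v, i in [(3.85, 40.0), (3.82, 40.0), (3.80, 40.0)]:        # 1 s discharge samples
            soc = mixed_soc(soc, i, v, dt_s=1.0)
        print(f"mixed SOC estimate: {soc:.3f}")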

    Internal temperature prediction of Lithium-ion cell using differential voltage technique

    The performance of a Lithium-ion cell is strongly dependent on cell operating temperature. However, the measured temperature is often obtained from thermocouples attached to the surface of the cell. These measurements may not be representative of the internal temperature of the cell, especially at lower ambient temperatures and high C-rates. A novel method utilising differential voltage to predict the internal temperature of a 40 Ah Lithium-ion pouch cell is proposed. The difference between the internal and the measured external temperatures depends upon the C-rate and ambient temperature. For a continuous-rate discharge, the difference between internal and surface temperatures rises at the beginning of discharge, peaks in the middle region, and reduces towards the end of discharge. The outcome of this study could positively support control strategies within a battery management system (BMS).
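
    The paper's method is not detailed here, so the sketch below is only a loose illustration of the general idea: compute the differential voltage dV/dQ from a discharge record and map a feature of it to internal temperature through a pre-calibrated relation. The synthetic voltage curve and calibration table are invented for illustration.

        # Loose illustration only, not the paper's method: differential voltage from a
        # discharge record, mapped to internal temperature via a hypothetical calibration.
        import numpy as np

        charge_ah = np.linspace(0.0, 40.0, 401)                    # discharged capacity (Ah)
        voltage_v = 4.1 - 0.02 * charge_ah - 0.05 * np.exp(0.1 * (charge_ah - 38))  # synthetic V(Q)

        dv_dq = np.gradient(voltage_v, charge_ah)                  # differential voltage (V/Ah)
        feature = dv_dq[np.argmin(np.abs(charge_ah - 20.0))]       # value at mid-discharge

        # Hypothetical calibration: feature value -> internal temperature (degC).
        calib_feature = [-0.030, -0.025, -0.020]
        calib_temp_c = [-10.0, 10.0, 25.0]
        t_internal = float(np.interp(feature, calib_feature, calib_temp_c))
        print(f"estimated internal temperature ~ {t_internal:.1f} degC")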

    Transfer learning LSTM model for battery useful capacity fade prediction

    Lithium-ion (Li-ion) batteries have become increasingly useful within the automotive industry and modern life applications due to their high energy and power densities. However, these batteries suffer capacity loss due to different ageing mechanisms in various applications. Despite several existing models, the lack of accurate predictability of capacity degradation limits the advancement of Li-ion batteries. The present work focuses on the prediction of battery useful capacity degradation using a long short-term memory (LSTM) transfer learning neural network model. At first, a base model was developed and trained using all (100%) of the degradation data available at 0 °C and 10 °C environmental temperatures. Thereafter, the trained base model was fixed, and additional hidden layers were added on top of it to fine-tune the model with only the initial 30% of the degradation data available at 25 °C environmental temperature. The remaining 70% of the 25 °C data was used to test the model prediction. To decide the number of fixed hidden layers to be transferred from the base model to the transfer model and the number of additional hidden layers on top, an optimisation for minimum cross-validation error was performed. It was found that the resulting model was able to forecast the remaining battery degradation with 96% accuracy. The model prediction was also compared with an LSTM deep learning architecture that does not use transfer learning; the LSTM transfer learning model was found to be 17% higher in prediction accuracy than the model without transfer learning.
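
    A minimal sketch of the transfer-learning arrangement described above, written in PyTorch purely for illustration (the paper does not state its framework): a base LSTM trained on the 0 °C and 10 °C data is frozen, new layers are stacked on top, and only those new layers are updated when fine-tuning on the initial 25 °C data. Layer sizes and names are assumptions.

        # Minimal sketch: freeze a pre-trained base LSTM and fine-tune added layers.
        # Layer sizes, names, and data are illustrative placeholders.
        import torch
        import torch.nn as nn

        class BaseModel(nn.Module):                       # trained on 0 degC / 10 degC data
            def __init__(self):
                super().__init__()
                self.lstm = nn.LSTM(input_size=1, hidden_size=32, num_layers=2, batch_first=True)
                self.head = nn.Linear(32, 1)
            def forward(self, x):
                out, _ = self.lstm(x)
                return self.head(out[:, -1, :])

        class TransferModel(nn.Module):                   # base frozen, new layers on top
            def __init__(self, base: BaseModel):
                super().__init__()
                self.base_lstm = base.lstm
                for p in self.base_lstm.parameters():     # "fix" the transferred layers
                    p.requires_grad = False
                self.new_lstm = nn.LSTM(input_size=32, hidden_size=16, batch_first=True)
                self.head = nn.Linear(16, 1)
            def forward(self, x):
                out, _ = self.base_lstm(x)
                out, _ = self.new_lstm(out)
                return self.head(out[:, -1, :])

        base = BaseModel()                                # assume pre-trained weights are loaded here
        model = TransferModel(base)
        optimiser = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=1e-3)
        x = torch.randn(8, 50, 1)                         # batch of capacity-fade sequences (placeholder)
        loss = nn.functional.mse_loss(model(x), torch.randn(8, 1))
        loss.backward()
        optimiser.step()                                  # only the unfrozen layers are updated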