
    Online Lifetime Prediction for Lithium-Ion Batteries with Cycle-by-Cycle Updates, Variance Reduction, and Model Ensembling

    This project was funded by an industry-academia grant EPSRC EP/R511687/1 awarded by the EPSRC & University of Edinburgh Impact Acceleration Account (IAA) programme. R. Ibraheem is a Ph.D. student in EPSRC's MAC-MIGS Centre for Doctoral Training. MAC-MIGS is supported by the UK's Engineering and Physical Sciences Research Council (grant number EP/S023291/1). G. dos Reis acknowledges support from the Faraday Institution [grant number FIRG049]. Publisher Copyright: © 2023 by the authors.

    Lithium-ion batteries have found applications in many parts of our daily lives. Predicting their remaining useful life (RUL) is thus essential for management and prognostics. Most approaches look at early-life prediction of RUL in the context of designing charging profiles or optimising cell design. While critical, such approaches are not directly applicable to the regular testing of cells used in applications. This article focuses on a class of models called 'one-cycle' models, which are suitable for this task and characterized by versatility (in terms of online prediction frameworks and model combinations), prediction from limited input, and independence from the cells' history. Our contribution is fourfold. First, we show the wider deployability of the so-called one-cycle model for a different type of battery data, thus confirming its wider scope of use. Second, reflecting on how prediction models can be leveraged within battery management cloud solutions, we propose a universal exponential-smoothing (e-forgetting) mechanism that leverages cycle-to-cycle prediction updates to reduce prediction variance. Third, we use this new model as a second-life assessment tool by proposing a knee-region classifier. Last, using model ensembling, we build a "model of models". We show that it outperforms each underpinning model (from in-cycle variability, cycle-to-cycle variability, and empirical models). This ensembling strategy allows coupling explainable and black-box methods, thus giving the user extra control over the final model.
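
    The cycle-by-cycle update and ensembling steps can be pictured with a short sketch. The forgetting factor `alpha`, the equal ensemble weights, and the numbers below are illustrative assumptions, not the authors' implementation:

```python
# Minimal sketch, not the authors' code: an exponential-smoothing
# ("e-forgetting") update applied to cycle-by-cycle RUL predictions, plus a
# simple weighted ensemble of the per-cycle predictions of several models.

def e_forgetting(smoothed_prev, new_pred, alpha=0.2):
    """Blend the newest one-cycle prediction with the running estimate.
    A smaller alpha forgets old predictions more slowly, which reduces
    cycle-to-cycle prediction variance."""
    if smoothed_prev is None:            # first cycle: nothing to smooth yet
        return new_pred
    return alpha * new_pred + (1.0 - alpha) * smoothed_prev

def ensemble(preds, weights=None):
    """Weighted average of RUL predictions from the underpinning models."""
    if weights is None:
        weights = [1.0 / len(preds)] * len(preds)
    return sum(w * p for w, p in zip(weights, preds))

# Illustrative numbers only: per-cycle RUL predictions from three hypothetical models.
per_cycle_preds = [(510, 495, 530), (500, 470, 515), (488, 492, 505)]
smoothed = None
for preds in per_cycle_preds:
    smoothed = e_forgetting(smoothed, ensemble(preds))
    print(f"updated RUL estimate: {smoothed:.1f} cycles")
```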

    Situational awareness-based energy management for unmanned electric surveillance platforms

    In the present day, fossil fuel availability, cost, security, and the pollutant emissions resulting from its use have driven industry to look for alternative ways of powering vehicles. The aim of this research is to synthesise, design, and develop a framework of novel control architectures that allow complex powered-vehicle subsystems to perform better with reduced exogenous information. This research addresses energy management by proposing an intelligence-based system that not only follows the beaten path of where energy comes from and how much of it to use, but goes further by taking into consideration the world around the vehicle. Operating without GPS, it relies on data such as usage, average consumption, system loads, and even other surrounding vehicles when making the difficult decisions of where to direct energy, how much of it, and even when to cut systems off for the benefit of others. All this is achieved in an integrated way, working within the limitations of non-fossil-fuelled energy sources such as fuel cells, ultracapacitors and battery banks, using driver-provided information or by crafting an artificial usage profile from historically learnt data. By adopting an organic-computing philosophy based on artificial intelligence, this alternative approach to energy supply systems begins by accepting that, once the hardware is set, energy use can be optimised only so far, and then goes a step further by answering the question of how best to manage it when refuelling might not be an option. The result is a situationally aware system concept that is portable to any type of electrically powered platform, be it ground, aerial or marine, since all operate within three-dimensional space. The system's capabilities are then verified in a virtual-reality environment that can be tailored to meet research needs, including allowing for different altitudes and environmental temperature and humidity profiles. This VR system is coupled with a chassis dynamometer to allow testing of real physical prototype unmanned ground vehicles, where the intelligent system benefits by learning from real platform data. The Thesis contributions and objectives are summarised as follows:
    - The proposed control system includes awareness of the surroundings within which the vehicle operates, without relying on GPS position information.
    - The proposed system is portable and could be used to control other systems.
    - The test platform developed within the Thesis is flexible and could be used for other systems.
    - The control system for the fuel cell system described within the work includes an allowance for altitude and humidity; these factors appear to be significant for such systems.
    - The structure of the control system and its hierarchy is novel.
    - The system can be applied to a UAV, controlling a 'vehicle' in three dimensions, and yet also to a ground vehicle, where roll and pitch are largely a function of the ground over which it travels (so the UGV only uses a subset of the control functionality).
    - The mission awareness of the control structure is the heart of the potential contribution to knowledge, and includes the ability to create an estimated, artificial mission profile should one not be input by the operators. This learnt/adaptive input could be expanded on further.
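
    The kind of decision such a situationally aware energy manager makes can be pictured with a hedged sketch: a priority-ordered power budget whose available supply is derated for altitude and temperature. The load names, priorities, and derating coefficients are illustrative assumptions, not the architecture developed in the Thesis:

```python
# Illustrative sketch only: priority-based power budgeting across subsystems,
# with the available fuel-cell budget crudely derated for altitude and
# ambient temperature (assumed coefficients, not measured values).
from dataclasses import dataclass

@dataclass
class Load:
    name: str
    priority: int      # lower number = more critical
    demand_w: float    # requested power in watts

def available_power(fuel_cell_w, altitude_m, ambient_c):
    """Derate nominal fuel-cell output for altitude and temperature."""
    derate = 1.0 - 0.00005 * altitude_m - 0.002 * max(ambient_c - 25, 0)
    return fuel_cell_w * max(derate, 0.5)

def allocate(loads, budget_w):
    """Serve loads in priority order; shed whatever no longer fits the budget."""
    plan = {}
    for load in sorted(loads, key=lambda l: l.priority):
        grant = min(load.demand_w, budget_w)
        plan[load.name] = grant
        budget_w -= grant
    return plan

loads = [Load("propulsion", 0, 900.0), Load("sensors", 1, 120.0), Load("comms", 2, 80.0)]
print(allocate(loads, available_power(1000.0, altitude_m=1500, ambient_c=35)))
```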

    Explainable machine learning for project management control

    Project control is a crucial phase within project management aimed at ensuring —in an integrated manner— that the project objectives are met according to plan. Earned Value Management —along with its various refinements— is the most popular and widespread method for top-down project control. For project control under uncertainty, Monte Carlo simulation and statistical/machine learning models extend the earned value framework by allowing the analysis of deviations, expected times and costs during project progress. Recent advances in explainable machine learning, in particular attribution methods based on Shapley values, can be used to link project control to activity properties, facilitating the interpretation of the interrelations between activity characteristics and control objectives. This work proposes a new methodology that adds an explainability layer based on SHAP —Shapley Additive exPlanations— to different machine learning models fitted to Monte Carlo simulations of the project network at tracking control points. Specifically, our method allows for both prospective and retrospective analyses, which have different utilities: forward analysis helps to identify key relationships between the different tasks and the desired outcomes, and is thus useful for making execution/replanning decisions, while backward analysis serves to identify the causes of the project status observed during project progress. Furthermore, the method is general and model-agnostic, and provides quantifiable and easily interpretable information, hence constituting a valuable tool for project control in uncertain environments.
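
    The core pattern (Monte Carlo simulation of the project network, a surrogate machine-learning model, and per-activity SHAP attributions) can be sketched on a toy three-activity network. The network, the random forest surrogate, and all numbers below are illustrative assumptions rather than the paper's case study:

```python
# Illustrative sketch (assumed toy network): Monte Carlo samples of activity
# durations, a surrogate model for total project duration, and SHAP values
# linking each activity to the control objective.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
import shap  # pip install shap

rng = np.random.default_rng(0)
n = 5_000

# Three activities: A and B run in parallel, C follows the later of the two.
dur = {
    "A": rng.triangular(2, 4, 9, n),
    "B": rng.triangular(3, 5, 7, n),
    "C": rng.triangular(1, 2, 6, n),
}
X = np.column_stack([dur["A"], dur["B"], dur["C"]])
y = np.maximum(dur["A"], dur["B"]) + dur["C"]   # total project duration

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
shap_values = shap.TreeExplainer(model).shap_values(X[:200])

# Mean absolute SHAP value per activity ~ its influence on project duration.
for name, imp in zip(dur, np.abs(shap_values).mean(axis=0)):
    print(f"activity {name}: mean |SHAP| = {imp:.2f}")
```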

    Energy planning and forecasting approaches for supporting physical improvement strategies in the building sector: a review

    The strict CO2 emission targets set to tackle global climate change associated with greenhouse gas emissions exert considerable pressure on our cities, which contribute up to 75% of the global carbon dioxide emission level, with buildings being the largest contributor (UNEP, 2015). Premised on this fact, urban planners are required to implement proactive energy planning strategies, not only to meet these targets but also to ensure that future city development promotes energy efficiency. This article gives an overview of the state of the art of energy planning and forecasting approaches for aiding physical improvement strategies in the building sector. Unlike previous reviews, which have only addressed the strengths and weaknesses of some of the approaches while referring to relevant examples from the literature, this article focuses on critically analysing a broader set of approaches, namely 2D GIS and 3D GIS (CityGML) based energy prediction approaches, based on their most frequent intervention scale, their applicability in the building life cycle, and their conventional prediction process. This is followed by unravelling the gaps and issues pertaining to the reviewed approaches. Finally, based on the identified problems, future research prospects are recommended.

    Data center's telemetry reduction and prediction through modeling techniques

    Nowadays, Cloud Computing is widely used to host and deliver services over the Internet. The architecture of clouds is complex due to the heterogeneous nature of the hardware, and clouds are hosted in large-scale data centers. To manage such complex infrastructure effectively and efficiently, constant monitoring is needed. This monitoring generates large amounts of telemetry data streams (e.g. hardware utilization metrics), which are used for multiple purposes including problem detection, resource management, workload characterization, resource utilization prediction, capacity planning, and job scheduling. These telemetry streams require costly bandwidth and storage space, particularly over the medium to long term for large data centers. Moreover, accurate future estimation of these telemetry streams is a challenging task due to multi-tenant co-hosted applications and dynamic workloads. Inaccurate estimation leads to either under- or over-provisioning of data center resources. In this Ph.D. thesis, we propose to improve prediction accuracy and reduce the bandwidth and storage space requirements with the help of modeling and prediction methods from machine learning. Most of the existing methods are based on a single model, which often does not appropriately estimate different workload scenarios. Moreover, these prediction methods use a fixed size of observation window, which cannot produce accurate results because it is not adaptively adjusted to capture the local trends in the recent data. Therefore, an estimation method trained on fixed sliding windows uses an irrelevantly large number of observations, which yields inaccurate estimations. In summary, we: C1) efficiently reduce bandwidth and storage for telemetry data through real-time modeling using a Markov chain model; C2) propose a novel method to adaptively and automatically identify the most appropriate model to accurately estimate data center resource utilization; and C3) propose a deep learning-based adaptive window size selection method which dynamically limits the sliding window size to capture the local trend in the latest resource utilization for building the estimation model.
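
    Contribution C1 (telemetry reduction through real-time Markov chain modeling) can be illustrated with a small sketch: quantise a utilization stream into states, keep only the state changes, and learn a first-order transition matrix for short-term prediction. The bin edges and the synthetic CPU trace are assumptions for illustration only, not the thesis' implementation:

```python
# Illustrative sketch (synthetic trace, assumed bins): quantise a CPU-utilization
# telemetry stream into Markov states, keep only state changes instead of every
# sample, and learn a first-order transition matrix for short-term prediction.
import numpy as np

rng = np.random.default_rng(1)
cpu = np.clip(50 + np.cumsum(rng.normal(0, 3, 1_000)), 0, 100)   # fake stream (%)

bins = [0, 25, 50, 75, 100]                   # four utilization states
states = np.digitize(cpu, bins[1:-1])         # values in 0..3

# "Reduced" stream: keep only (index, state) pairs where the state changes.
changes = [(0, int(states[0]))] + [
    (i, int(s)) for i, s in enumerate(states[1:], start=1) if s != states[i - 1]
]
print(f"{len(cpu)} raw samples -> {len(changes)} retained state changes")

# First-order transition matrix, usable to predict the next state.
counts = np.zeros((4, 4))
for a, b in zip(states[:-1], states[1:]):
    counts[a, b] += 1
P = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)
print("most likely next state from the current one:", int(P[states[-1]].argmax()))
```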

    Buy on Intraday Market or not: A Deep Learning Approach: A decision tool for buyers in the Norwegian electricity markets to decide the optimal market to purchase electricity

    As the share of variable renewable energy sources increases, so does the need for near-delivery offloading of surplus electricity. The availability of potentially cheap energy in intraday markets warrants the reconsideration of a potentially overlooked market. From a power-buying perspective, this thesis applies promising deep neural network techniques to produce accurate electricity price forecasts before day-ahead market closure. Architectures tested in this thesis include long short-term memory (LSTM), gated recurrent units (GRU), deep autoregressive models (DeepAR) and temporal fusion transformers (TFT). Using a nested cross-validation scheme, we seek to better approximate the generalization error of our models. LSTM and GRU models are found to be the best performing in the day-ahead and intraday markets, beating the benchmark measured in MAE by 30.6% and 29%, respectively. The increase in performance achieved by deep neural architectures is found to be particularly prominent in periods of high price volatility. Our overall goal has been the creation of a decision tool, to be used by an electricity buyer to determine the optimal electricity market for a given set of delivery hours. The results presented in this thesis are based on the NO2 power region (South Norway) because of its relative intraday liquidity. We implement the decision tool by means of a probabilistic classifier trained specifically on the forecasts of the optimal deep neural architectures. We find that the use of a probabilistic classifier increases classification performance compared to using the sign difference of the forecasts directly. Despite numerous potential error sources, our decision tool is shown to increase expected marginal profits compared to a day-ahead-only trading strategy when tested in an out-of-sample simulated "production" environment. We model the decision tool to fit the needs of various risk profiles, and find that higher risk tolerance warrants higher profits. Though beyond the scope of this thesis, the general outline of this decision tool can be modified and extended to fit the needs of power producers.
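
    The final decision step can be sketched as follows: a probabilistic classifier is trained on price forecasts (here synthetic stand-ins), and the buyer purchases on the intraday market only when the predicted probability that intraday is cheaper exceeds a risk-profile-dependent threshold. The logistic-regression classifier, the features, and all numbers are illustrative assumptions, not the thesis' fitted models:

```python
# Illustrative sketch (synthetic data, assumed features): a probabilistic
# classifier that recommends "buy intraday" only when the predicted probability
# that intraday is cheaper than day-ahead exceeds a risk-dependent threshold.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 2_000

# Stand-ins for price forecasts made before day-ahead gate closure.
da_forecast = 40 + 10 * rng.normal(size=n)               # EUR/MWh, day-ahead
id_forecast = da_forecast + rng.normal(0, 5, size=n)     # EUR/MWh, intraday
spread_forecast = da_forecast - id_forecast              # forecast spread feature

# Realised outcome: 1 if intraday actually turned out cheaper than day-ahead.
realised_spread = spread_forecast + rng.normal(0, 4, size=n)
y = (realised_spread > 0).astype(int)

X = np.column_stack([da_forecast, id_forecast, spread_forecast])
clf = LogisticRegression().fit(X[:1500], y[:1500])

p_cheaper = clf.predict_proba(X[1500:])[:, 1]
threshold = 0.6                      # higher threshold = more risk-averse buyer
decision = np.where(p_cheaper > threshold, "buy intraday", "buy day-ahead")
print(decision[:10])
print(f"out-of-sample accuracy: {(clf.predict(X[1500:]) == y[1500:]).mean():.2f}")
```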