26 research outputs found

    Evolino for recurrent support vector machines

    Traditional Support Vector Machines (SVMs) need pre-wired finite time windows to predict and classify time series. They lack the internal state necessary to deal with sequences involving arbitrarily long-term dependencies. Here we introduce a new class of recurrent, truly sequential SVM-like devices with internal adaptive states, trained by a novel method called EVOlution of systems with KErnel-based outputs (Evoke), an instance of the recent Evolino class of methods. Evoke evolves recurrent neural networks to detect and represent temporal dependencies while using quadratic programming/support vector regression to produce precise outputs. Evoke is the first SVM-based mechanism that learns to classify a context-sensitive language. It also outperforms recent state-of-the-art gradient-based recurrent neural networks (RNNs) on various time series prediction tasks. Comment: 10 pages, 2 figures.
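    The Evolino/Evoke recipe can be sketched minimally: recurrent weights are searched stochastically while an optimal readout is fit on the network's states. This is a hypothetical toy, not the paper's method: a least-squares readout stands in for support vector regression, and plain random restarts stand in for the evolution strategy.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_rnn(W_in, W_rec, xs):
    # roll a simple tanh RNN over the input sequence, collecting its states
    h = np.zeros(W_rec.shape[0])
    states = []
    for x in xs:
        h = np.tanh(W_in * x + W_rec @ h)
        states.append(h.copy())
    return np.array(states)

def fitness(W_in, W_rec, xs, ys):
    # fit a linear readout on the states (stand-in for the SVR output layer)
    S = run_rnn(W_in, W_rec, xs)
    w, *_ = np.linalg.lstsq(S, ys, rcond=None)
    return np.mean((S @ w - ys) ** 2), w

# toy task: predict the next value of a sine wave
t = np.linspace(0, 8 * np.pi, 400)
xs, ys = np.sin(t[:-1]), np.sin(t[1:])

n = 10   # number of recurrent units
best = None
for _ in range(30):   # crude random-restart search, standing in for evolution
    W_in = rng.normal(0, 0.5, n)
    W_rec = rng.normal(0, 0.3, (n, n))
    mse, w = fitness(W_in, W_rec, xs, ys)
    if best is None or mse < best[0]:
        best = (mse, W_in, W_rec, w)
print(f"best MSE: {best[0]:.4f}")
```

    The division of labour mirrors the abstract: the recurrent part only has to represent temporal context, while the convex readout fit produces precise outputs.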

    Development of a neural network-based energy management system for a plug-in hybrid electric vehicle

    The high potential of Artificial Intelligence (AI) techniques for effectively solving complex parameterization tasks also makes them extremely attractive for the design of the Energy Management Systems (EMS) of Hybrid Electric Vehicles (HEVs). In this framework, this paper aims to design an EMS through the exploitation of deep learning techniques, which can capture the highly non-linear relationships among the data characterizing the problem. In particular, the deep learning model was designed employing two different Recurrent Neural Networks (RNNs). First, a previously developed digital twin of a state-of-the-art plug-in HEV was used to generate a wide portfolio of Real Driving Emissions (RDE) compliant vehicle missions and traffic scenarios. Then, the AI models were trained off-line to minimize CO2 emissions, using as targets the optimal solutions given by a global optimization control algorithm, namely Dynamic Programming (DP). The proposed methodology has been tested on a virtual test rig and has proven capable of achieving significant improvements in fuel economy for both charge-sustaining and charge-depleting strategies, with reductions of about 4% and 5%, respectively, compared to the baseline Rule-Based (RB) strategy.
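    The role of Dynamic Programming as the off-line label generator can be illustrated with a toy backward-induction power split over a short demand profile. All quantities here (demand, fuel rate, battery size, SoC penalty) are hypothetical, chosen only to make the sketch self-contained.

```python
import numpy as np

# toy DP for a hybrid power split: minimize fuel use over a demand profile
# while ending near the initial SoC (a charge-sustaining-style constraint)
demand = np.array([8.0, 12.0, 5.0, 15.0, 10.0, 6.0])  # kW, hypothetical
soc_grid = np.linspace(0.3, 0.7, 41)                  # discretized SoC levels
splits = np.linspace(0.0, 1.0, 11)                    # electric fraction
dt, cap = 1.0, 360.0                                  # step [s], battery [kJ]

T = len(demand)
V = 1000.0 * np.abs(soc_grid - 0.5)       # terminal penalty toward SoC 0.5
policy = np.zeros((T, len(soc_grid)), dtype=int)

for t in range(T - 1, -1, -1):            # backward induction
    V_next = V.copy()
    for i, soc in enumerate(soc_grid):
        costs = []
        for u in splits:
            p_batt = u * demand[t]                    # battery share [kW]
            p_eng = (1 - u) * demand[t]               # engine share [kW]
            soc_new = soc - p_batt * dt / cap
            if not (soc_grid[0] <= soc_new <= soc_grid[-1]):
                costs.append(np.inf)                  # infeasible SoC
                continue
            fuel = 0.08 * p_eng * dt                  # hypothetical fuel rate
            j = np.argmin(np.abs(soc_grid - soc_new)) # snap to nearest level
            costs.append(fuel + V_next[j])
        policy[t, i] = int(np.argmin(costs))
        V[i] = min(costs)

start = np.argmin(np.abs(soc_grid - 0.5))
print("optimal cost from SoC 0.5:", round(V[start], 2))
```

    The resulting state-dependent `policy` table is the kind of globally optimal target an RNN could then be trained to imitate from mission data.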

    On the importance of sluggish state memory for learning long term dependency

    The vanishing gradients problem inherent in Simple Recurrent Networks (SRNs) trained with back-propagation has led to a significant shift towards the use of Long Short-Term Memory (LSTM) and Echo State Networks (ESNs), which overcome this problem through second-order error-carousel schemes and different learning algorithms, respectively. This paper re-opens the case for SRN-based approaches by considering a variant, the Multi-recurrent Network (MRN). We show that memory units embedded within its architecture can mitigate the vanishing gradient problem by providing variable sensitivity to recent and more historic information through layer- and self-recurrent links with varied weights, forming a so-called sluggish state-based memory. We demonstrate that an MRN, optimised with noise injection, is able to learn the long-term dependency within a complex grammar induction task, significantly outperforming the SRN, NARX and ESN. Analysis of the internal representations of the networks reveals that the sluggish state-based representations of the MRN are best able to latch on to critical temporal dependencies spanning variable time delays, maintaining distinct and stable representations of all underlying grammar states. Surprisingly, the ESN was unable to fully learn the dependency problem, suggesting the major shift towards this class of models may be premature.
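    The sluggish-state idea can be illustrated with a minimal sketch: memory banks with different fixed self-recurrent weights retain an input impulse over different timescales. This is a hypothetical simplification of the MRN's layer- and self-recurrent memory, not the paper's exact model.

```python
import numpy as np

# each "sluggish" memory bank blends the current input with its previous
# value via a fixed self-recurrent weight; larger weights change more
# slowly and therefore retain older context
decays = [0.0, 0.5, 0.9]   # one bank per timescale (hypothetical values)

def run_memory(signal, decays):
    banks = np.zeros(len(decays))
    trace = []
    for x in signal:
        banks = np.array([d * b + (1 - d) * x for d, b in zip(decays, banks)])
        trace.append(banks.copy())
    return np.array(trace)

# impulse at t=0, then silence: fast banks forget, sluggish banks persist
signal = np.zeros(20)
signal[0] = 1.0
trace = run_memory(signal, decays)
print(trace[10])   # bank activations 10 steps after the impulse
```

    A bank with self-recurrent weight d holds a fraction (1-d)*d**k of the impulse after k silent steps, so a spread of weights gives the network simultaneous access to recent and more historic information.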

    Energy management system optimization based on an LSTM deep learning model using vehicle speed prediction

    The energy management of a Hybrid Electric Vehicle (HEV) is a global optimization problem, and its optimal solution inevitably entails knowing the entire mission profile. The exploitation of Vehicle-to-Everything (V2X) connectivity can pave the way for reliable short-term vehicle speed predictions. As a result, the capabilities of conventional energy management strategies can be enhanced by integrating the predicted vehicle speed into the powertrain control strategy. Therefore, in this paper, an innovative adaptation algorithm that exploits the predicted speed profile within an Equivalent Consumption Minimization Strategy (A-V2X-ECMS) is proposed. Driving pattern identification is employed to adapt the equivalence factor of the ECMS when a change in the driving patterns occurs, or when the State of Charge (SoC) deviates significantly from the target value. A Principal Component Analysis (PCA) was performed on several energetic indices to select the ones that predominate in characterizing the different driving patterns. Long Short-Term Memory (LSTM) deep neural networks were trained to choose the optimal value of the equivalence factor for a specific sequence of data (i.e., speed, acceleration, power, and initial SoC). The potentialities of the innovative A-V2X-ECMS were assessed, through numerical simulation, on a diesel Plug-in Hybrid Electric Vehicle (PHEV) available on the European market. A virtual test rig of the investigated vehicle was built in the GT-SUITE software environment and validated against a wide database of experimental data. The simulations proved that the proposed approach achieves results much closer to the optimum than the conventional energy management strategies taken as a reference.
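    The ECMS core that the equivalence factor controls can be sketched as an instantaneous minimization: at each step, pick the battery/engine share that minimizes fuel plus s-weighted electrical energy. Here s is a fixed scalar with hypothetical values, whereas the paper's A-V2X-ECMS adapts it on-line from LSTM speed predictions.

```python
import numpy as np

def ecms_step(p_demand, s, splits=np.linspace(0.0, 1.0, 21)):
    # choose the electric fraction u minimizing the fuel-equivalent cost:
    # fuel for the engine share plus s-weighted cost of the battery share
    best_u, best_cost = 0.0, np.inf
    for u in splits:
        p_batt = u * p_demand            # battery share [kW]
        p_eng = (1 - u) * p_demand       # engine share [kW]
        cost = 0.08 * p_eng + s * 0.08 * p_batt   # hypothetical fuel rate
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

# a low equivalence factor makes electricity "cheap" -> more battery use;
# a high one makes it "expensive" -> the engine covers the demand
print(ecms_step(10.0, s=0.5), ecms_step(10.0, s=1.5))
```

    This makes the control lever explicit: choosing s well is the whole game, which is why the paper dedicates driving-pattern identification and LSTM prediction to adapting it.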