    Model migration neural network for predicting battery aging trajectories

    Accurate prediction of batteries’ future degradation is a key solution to relieve users’ anxiety about battery lifespan and electric vehicles’ driving range. Technical challenges arise from the highly nonlinear dynamics of battery aging. In this paper, a feed-forward migration neural network is proposed to predict batteries’ aging trajectories. Specifically, a base model that describes the capacity decay over time is first established from an existing battery aging dataset. This base model is then transformed by an input-output slope-and-bias-correction (SBC) structure to capture the degradation of the target cell. To enhance the model’s nonlinear transfer capability, the SBC model is further integrated into a four-layer neural network and easily trained via the gradient correlation algorithm. The proposed migration neural network is experimentally verified with four different commercial batteries. The predicted RMSEs are all lower than 2.5% when using only the first 30% of aging trajectories for neural network training. In addition, illustrative results demonstrate that a small feed-forward neural network (down to 1-5-5-1) is sufficient for battery aging trajectory prediction.
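The core idea of the SBC transform can be sketched in a few lines. This is a minimal illustration, not the paper's method: the exponential capacity-fade base model, the synthetic "target cell" data, and the restriction to an output-side slope-and-bias correction (y ≈ a·base(x) + b, which is linear in a and b and so fits by ordinary least squares) are all assumptions made for the sketch.

```python
import numpy as np

# Hypothetical base model: capacity fade of a reference cell vs. cycle count.
def base_model(cycles):
    return 1.0 - 0.2 * (1.0 - np.exp(-cycles / 500.0))

# Synthetic target cell that degrades faster than the reference cell
# (stands in for the measured early portion of the aging trajectory).
cycles = np.arange(0, 300, 10, dtype=float)
target = 1.02 - 0.3 * (1.0 - np.exp(-cycles / 500.0))

# Output slope-and-bias correction: target ≈ a * base_model(cycles) + b.
# Because this is linear in (a, b), least squares suffices for the sketch.
A = np.column_stack([base_model(cycles), np.ones_like(cycles)])
(a, b), *_ = np.linalg.lstsq(A, target, rcond=None)

corrected = a * base_model(cycles) + b
rmse = np.sqrt(np.mean((corrected - target) ** 2))
```

The full method additionally corrects the model input and embeds the SBC parameters in a small feed-forward network, which is what gives it the nonlinear transfer capability described above.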

    A Comparative Study of Reservoir Computing for Temporal Signal Processing

    Reservoir computing (RC) is a novel approach to time series prediction using recurrent neural networks. In RC, an input signal perturbs the intrinsic dynamics of a medium called a reservoir. A readout layer is then trained to reconstruct a target output from the reservoir's state. The multitude of RC architectures and evaluation metrics poses a challenge to both practitioners and theorists who study the task-solving performance and computational power of RC. In addition, in contrast to traditional computation models, the reservoir is a dynamical system in which computation and memory are inseparable, and therefore hard to analyze. Here, we compare echo state networks (ESN), a popular RC architecture, with tapped-delay lines (DL) and nonlinear autoregressive exogenous (NARX) networks, which we use to model systems with limited computation and limited memory respectively. We compare the performance of the three systems while computing three common benchmark time series: the Hénon map, NARMA10, and NARMA20. We find that the role of the reservoir in the reservoir computing paradigm goes beyond providing a memory of the past inputs. The DL and the NARX network have higher memorization capability, but fall short of the generalization power of the ESN.
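The ESN pipeline described above (fixed random reservoir, trained linear readout) can be sketched as follows. This is a minimal illustration under assumed settings: a 100-unit tanh reservoir, spectral radius scaled to 0.9 for the echo state property, a toy one-step-ahead sine-prediction task in place of the Hénon/NARMA benchmarks, and a least-squares readout with a short washout.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 100, 1

# Fixed random input and recurrent weights; only the readout is trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u; return the state trajectory."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy benchmark: predict the next value of a sine wave.
t = np.arange(600)
u = np.sin(0.2 * t)
X = run_reservoir(u[:-1])   # reservoir states
y = u[1:]                   # one-step-ahead targets

washout = 50  # discard initial transient before fitting the readout
W_out = np.linalg.lstsq(X[washout:], y[washout:], rcond=None)[0]
pred = X @ W_out
nrmse = np.sqrt(np.mean((pred[washout:] - y[washout:]) ** 2)) / np.std(y[washout:])
```

The contrast drawn in the abstract is that a tapped-delay line stores past inputs exactly but computes little, while the reservoir mixes memory and nonlinear computation in one dynamical system, with only the linear readout being trained.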

    A hierarchical anti-Hebbian network model for the formation of spatial cells in three-dimensional space.

    Three-dimensional (3D) spatial cells in the mammalian hippocampal formation are believed to support the existence of 3D cognitive maps. Modeling studies are crucial to comprehend the neural principles governing the formation of these maps, yet to date very few have addressed this topic in 3D space. Here we present a hierarchical network model for the formation of 3D spatial cells using an anti-Hebbian network. Built on empirical data, the model accounts for the natural emergence of 3D place, border, and grid cells, as well as a previously undescribed spatial cell type, which we call plane cells. It further explains the plausible reason behind the place- and grid-cell anisotropic coding that has been observed in rodents and the potential discrepancy with the predicted periodic coding during 3D volumetric navigation. Lastly, it provides evidence for the importance of unsupervised learning rules in guiding the formation of higher-dimensional cognitive maps.
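The anti-Hebbian principle the model relies on can be shown in miniature. In an anti-Hebbian rule, a lateral weight is *decreased* in proportion to the correlation of the two units it connects, so that the units' outputs decorrelate. The sketch below is a deliberately tiny two-unit example with a single lateral weight and synthetic correlated inputs; the paper's hierarchical network and its spatial-cell inputs are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two strongly correlated input channels (shared signal plus small noise).
n = 2000
s = rng.standard_normal(n)
X = np.column_stack([s + 0.1 * rng.standard_normal(n),
                     s + 0.1 * rng.standard_normal(n)])

# Linear units with one lateral connection w from unit 1 to unit 2:
#   y1 = x1,  y2 = x2 + w * y1
# Anti-Hebbian update: dw = -eta * y1 * y2 (weight shrinks when outputs co-fire).
w, eta = 0.0, 0.01
for x1, x2 in X:
    y1 = x1
    y2 = x2 + w * y1
    w -= eta * y1 * y2

# After learning, the outputs should be nearly decorrelated.
Y1 = X[:, 0]
Y2 = X[:, 1] + w * Y1
corr_out = np.corrcoef(Y1, Y2)[0, 1]
corr_in = np.corrcoef(X[:, 0], X[:, 1])[0, 1]
```

At the fixed point, w converges to -E[x1·x2]/E[x1²], which removes the shared component from y2; stacking many such decorrelating units hierarchically is what lets networks of this family extract distinct spatial representations from overlapping sensory input.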