4,371 research outputs found

    An agent-based model to explore scenarios of adaptation to climate change in an alpine tourism destination

    The European Alpine region is one of the most sensitive to climate change impacts. ClimAlpTour is a European research project of the Alpine Space Programme dealing with the expected decrease in snow and ice cover. The research reported herein analyses the municipality of Auronzo di Cadore (22,000 ha) in the Dolomites. The local economy depends on tourism, which is currently focused on the summer season. The Community Council has recently been considering options for stimulating further development of winter tourism. This paper presents a prototype agent-based model, called AuronzoWinSim, for the assessment of alternative scenarios of future local development, taking into account complex spatial and social dynamics and interactions. Different typologies of winter tourists compose the set of human agents. Climate change scenarios are used to produce snow cover projections. AuronzoWinSim is planned for use in a participatory context with groups of local stakeholders. (Author's abstract)

    A statistical model for in vivo neuronal dynamics

    Single neuron models have a long tradition in computational neuroscience. Detailed biophysical models such as the Hodgkin-Huxley model, as well as simplified neuron models such as the class of integrate-and-fire models, relate the input current to the membrane potential of the neuron. Such models have been extensively fitted to in vitro data, where the input current is controlled. They are, however, of little use when it comes to characterizing intracellular in vivo recordings, since the input to the neuron is not known. Here we propose a novel single neuron model that characterizes the statistical properties of in vivo recordings. More specifically, we propose a stochastic process in which the subthreshold membrane potential follows a Gaussian process and the spike emission intensity depends nonlinearly on the membrane potential as well as on the spiking history. We first show that the model has a rich dynamical repertoire, since it can capture arbitrary subthreshold autocovariance functions, firing-rate adaptation, as well as arbitrary shapes of the action potential. We then show that this model can be efficiently fitted to data without overfitting. Finally, we show that this model can be used to characterize and therefore precisely compare various intracellular in vivo recordings from different animals and experimental conditions. Comment: 31 pages, 10 figures
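The two ingredients of such a model (a Gaussian-process subthreshold potential and a nonlinear, history-dependent spike intensity) can be illustrated with a minimal simulation. The sketch below uses an Ornstein-Uhlenbeck process as the Gaussian process and an exponential intensity with a simple refractory history term; all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the paper)
dt = 1e-3          # time step [s]
tau = 20e-3        # membrane time constant [s]
sigma = 40.0       # noise amplitude [mV / sqrt(s)]
v_rest = -58.0     # resting potential [mV]
v_thresh = -55.0   # soft threshold of the spike nonlinearity [mV]
beta = 1.0         # sharpness of the spike nonlinearity [1/mV]
tau_ref = 5e-3     # decay time of the spike-history suppression [s]

T = int(5.0 / dt)
v = np.full(T, v_rest)
spikes = np.zeros(T, dtype=bool)
h = 0.0            # spike-history variable (suppresses firing after a spike)

for t in range(1, T):
    # Ornstein-Uhlenbeck subthreshold dynamics: a simple Gaussian process
    v[t] = v[t-1] + dt * (v_rest - v[t-1]) / tau + sigma * np.sqrt(dt) * rng.standard_normal()
    # Conditional intensity: nonlinear in voltage, suppressed by recent spiking
    lam = np.exp(beta * (v[t] - v_thresh) - h)
    spikes[t] = rng.random() < 1.0 - np.exp(-lam * dt)
    h = h * np.exp(-dt / tau_ref) + (10.0 if spikes[t] else 0.0)

print(spikes.sum(), "spikes in 5 s")
```

Fitting such a model to data, as the paper does, would additionally require maximizing the likelihood of the observed voltage trace and spike times; the snippet only shows the generative side.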

    A Spatial Agent-Based Model to Explore Scenarios of Adaptation to Climate Change in an Alpine Tourism Destination

    A vast body of literature suggests that the European Alpine region may be one of the most sensitive to climate change impacts. Adaptation of Alpine socio-ecosystems to climate change is increasingly becoming an issue of interest for the scientific community, while the people of the Alps are often unaware of, or simply ignore, the problem. ClimAlpTour is a European research project of the Alpine Space Programme, bringing together institutions and scholars from all countries of the Alpine arch to deal with the expected decrease in snow and ice cover, which may lead to a rethinking of tourism development beyond the traditional vision of winter sports. The research reported herein analyses the municipality of Auronzo di Cadore (22,000 ha) in the Dolomites, under the famous peaks of the “Tre Cime di Lavaredo”. The local economy depends on tourism, which is currently focused on the summer season, while the winter season is weak. As a whole, the destination receives approximately 65,000 guests per year, with a resident population of 3,600 inhabitants. The Community Council has recently been considering options for stimulating further development of winter tourism. This paper presents a prototype agent-based model, called AuronzoWinSim, for the assessment of alternative scenarios of future local development strategies, taking into account complex spatial and social dynamics and interactions. Different typologies of winter tourists compose the set of human agents. Climate change scenarios are used to produce temperature and snow cover projections. The model is mainly informed by secondary sources, including demographic and economic time series, as well as biophysical data which feed into its spatial dimension. Primary data from field surveys are used to calibrate the main parameters. AuronzoWinSim is planned for use in a participatory context with groups of local stakeholders. Keywords: Alpine Winter Tourism, Spatial Agent-Based Model, Climate Change Adaptation
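The core loop of such an agent-based model can be sketched in a few lines: agents of different tourist typologies decide whether to visit on a given day depending on simulated snow cover under different climate scenarios. Typology names, group sizes, and snow thresholds below are purely illustrative assumptions, not values from AuronzoWinSim.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical tourist typologies (names and thresholds are illustrative)
typologies = {
    "alpine_skier":   {"n": 500, "snow_need": 30.0},  # cm of snow required
    "winter_hiker":   {"n": 300, "snow_need": 5.0},
    "scenery_seeker": {"n": 200, "snow_need": 0.0},
}

def season_visits(mean_snow_cm: float) -> int:
    """Count visits over a 90-day winter season for a given mean snow cover."""
    visits = 0
    for t in typologies.values():
        for _ in range(t["n"]):
            # Each agent experiences its own noisy snow-cover trajectory
            daily_snow = rng.normal(mean_snow_cm, 10.0, size=90).clip(min=0.0)
            # An agent visits on days when snow cover meets its requirement
            visits += int((daily_snow >= t["snow_need"]).sum())
    return visits

baseline = season_visits(mean_snow_cm=40.0)  # current-climate scenario
warmed = season_visits(mean_snow_cm=15.0)    # reduced-snow scenario
print(baseline, warmed)
```

A full model would add spatial structure, economic feedbacks, and calibrated parameters; the sketch only shows how heterogeneous agent preferences translate climate scenarios into aggregate visitation.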

    The Neural Particle Filter

    The robust estimation of dynamically changing features, such as the position of prey, is one of the hallmarks of perception. On an abstract, algorithmic level, nonlinear Bayesian filtering, i.e. the estimation of temporally changing signals based on the history of observations, provides a mathematical framework for dynamic perception in real time. Since the general nonlinear filtering problem is analytically intractable, particle filters are considered among the most powerful approaches to approximating the solution numerically. Yet these algorithms prevalently rely on importance weights, and it thus remains an unresolved question how the brain could implement such an inference strategy with a neuronal population. Here, we propose the Neural Particle Filter (NPF), a weightless particle filter that can be interpreted as the neuronal dynamics of a recurrently connected neural network that receives feed-forward input from sensory neurons and represents the posterior probability distribution in terms of samples. Specifically, this algorithm bridges the gap between the computational task of online state estimation and an implementation that allows networks of neurons in the brain to perform nonlinear Bayesian filtering. The model captures not only the properties of temporal and multisensory integration according to Bayesian statistics, but also allows online learning with a maximum likelihood approach. With an example from multisensory integration, we demonstrate that the numerical performance of the model is adequate to account for both filtering and identification problems. Due to the weightless approach, our algorithm alleviates the 'curse of dimensionality' and thus outperforms conventional, weighted particle filters in higher dimensions for a limited number of particles.
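The flavor of such weightless, feedback-driven particle dynamics can be sketched in a linear-Gaussian toy setting: each particle follows the prior dynamics plus a gain-weighted innovation term, with no importance weights anywhere. The gain ansatz below (ensemble variance over observation noise) is a simplification chosen for illustration, not the paper's learning rule.

```python
import numpy as np

rng = np.random.default_rng(2)

# Linear toy model (assumed, for illustration): hidden OU process, noisy observations
dt, T, N = 0.01, 2000, 100
a, sx, sy = 1.0, 1.0, 0.5     # drift rate, state noise, observation noise

x = 0.0
particles = rng.standard_normal(N)
est = np.zeros(T)
truth = np.zeros(T)

for t in range(T):
    # True hidden state and observation increment
    x += -a * x * dt + sx * np.sqrt(dt) * rng.standard_normal()
    dy = x * dt + sy * np.sqrt(dt) * rng.standard_normal()
    # Weightless update: prior dynamics plus innovation feedback (no weights)
    gain = np.var(particles) / sy**2          # heuristic Kalman-like gain
    innovation = dy - particles * dt
    particles += (-a * particles * dt
                  + sx * np.sqrt(dt) * rng.standard_normal(N)
                  + gain * innovation)
    truth[t] = x
    est[t] = particles.mean()

# Tracking error after a warm-up period
rmse = np.sqrt(np.mean((est[500:] - truth[500:])**2))
print(f"tracking RMSE: {rmse:.3f}")
```

In this linear case the ensemble mean and variance dynamics mirror the Kalman-Bucy filter, so the particle cloud tracks the hidden state well below the prior's stationary spread; the NPF's contribution is to extend this weightless idea to nonlinear models with learned gains.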

    The Hitchhiker's Guide to Nonlinear Filtering

    Nonlinear filtering is the problem of online estimation of a dynamic hidden variable from incoming data, with vast applications in fields ranging from engineering and machine learning to economics and the natural sciences. We start our review of the theory of nonlinear filtering from the simplest `filtering' task we can think of, namely static Bayesian inference. From there we continue our journey through discrete-time models, which are usually encountered in machine learning, and then generalize to and further emphasize continuous-time filtering theory. The idea of changing the probability measure connects and elucidates several aspects of the theory, such as the parallels between the discrete- and continuous-time problems and between different observation models. Furthermore, it gives insight into the construction of particle filtering algorithms. This tutorial is targeted at scientists and engineers and should serve as an introduction to the main ideas of nonlinear filtering, and as a segue to more advanced and specialized literature. Comment: 64 pages
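As a concrete companion to the discrete-time part of the theory, here is a minimal bootstrap particle filter on a standard nonlinear benchmark (the growth model often used in the filtering literature). This is a generic textbook illustration, not code from the tutorial.

```python
import numpy as np

rng = np.random.default_rng(3)

# Benchmark state dynamics: x_t = 0.5 x + 25 x/(1+x^2) + 8 cos(1.2 t) + noise,
# observed through y_t = x_t^2 / 20 + noise
def f(x, t):
    return 0.5 * x + 25.0 * x / (1.0 + x**2) + 8.0 * np.cos(1.2 * t)

T, N = 50, 1000
q, r = np.sqrt(10.0), 1.0     # process / observation noise std

x, xs, ys = 0.0, [], []
for t in range(T):
    x = f(x, t) + q * rng.standard_normal()
    xs.append(x)
    ys.append(x**2 / 20.0 + r * rng.standard_normal())

# Bootstrap particle filter: propagate with the prior, weight by the likelihood,
# then resample (multinomial) to counteract weight degeneracy
particles = rng.standard_normal(N)
est = []
for t in range(T):
    particles = f(particles, t) + q * rng.standard_normal(N)
    logw = -0.5 * (ys[t] - particles**2 / 20.0)**2 / r**2
    w = np.exp(logw - logw.max())      # subtract max for numerical stability
    w /= w.sum()
    est.append(np.sum(w * particles))  # posterior-mean estimate
    particles = rng.choice(particles, size=N, p=w)  # resampling step

rmse = np.sqrt(np.mean((np.array(est) - np.array(xs))**2))
print(f"RMSE: {rmse:.2f}")
```

The quadratic observation makes the posterior bimodal at times, which is exactly the kind of problem where the weighted-sample representation earns its keep over Gaussian approximations.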

    Online Maximum Likelihood Estimation of the Parameters of Partially Observed Diffusion Processes

    We revisit the problem of estimating the parameters of a partially observed diffusion process, consisting of a hidden state process and an observed process, with a continuous time parameter. The estimation is to be done online, i.e. the parameter estimate should be updated recursively based on the observation filtration. Here, we use an old but under-exploited representation of the incomplete-data log-likelihood function in terms of the filter of the hidden state given the observations. By performing stochastic gradient ascent, we obtain a fully recursive algorithm for the time evolution of the parameter estimate. We prove the convergence of the algorithm under suitable conditions regarding the ergodicity of the process consisting of state, filter, and tangent filter. Additionally, our parameter estimation is shown numerically to have the potential of improving suboptimal filters, and it can be applied even when the system is not identifiable due to parameter redundancies. Online parameter estimation is a challenging problem that is ubiquitous in fields such as robotics, neuroscience, and finance, where adaptive filters and optimal controllers must be designed for unknown or changing systems.
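The recursion described (filter plus tangent filter plus stochastic gradient ascent on the incremental log-likelihood) can be sketched in a discrete-time linear-Gaussian toy model, where the filter is a Kalman filter and the derivatives with respect to the parameter are available in closed form. The model, step size, and run length are illustrative assumptions, not the paper's continuous-time setting.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy model: x_{t+1} = a x_t + w (w ~ N(0,Q)), y_t = x_t + v (v ~ N(0,R)).
# We estimate a online via gradient ascent on the incremental log-likelihood,
# with a "tangent filter" (dm/da, dP/da) propagated alongside the Kalman filter.
a_true, Q, R = 0.8, 1.0, 1.0
a = 0.3                      # initial parameter guess
m, P = 0.0, 1.0              # Kalman filter mean / variance
dm, dP = 0.0, 0.0            # tangent filter: dm/da, dP/da
x = 0.0
lr = 0.01                    # gradient-ascent step size

for t in range(30000):
    x = a_true * x + np.sqrt(Q) * rng.standard_normal()
    y = x + np.sqrt(R) * rng.standard_normal()

    # Prediction step and its derivative w.r.t. a
    m_p, P_p = a * m, a**2 * P + Q
    dm_p, dP_p = m + a * dm, 2 * a * P + a**2 * dP

    # Innovation and gradient of the incremental log-likelihood
    e, S = y - m_p, P_p + R
    dS = dP_p
    dlog = -0.5 * dS / S + e * dm_p / S + 0.5 * e**2 * dS / S**2

    # Update step and its derivative
    K = P_p / S
    dK = dP_p * R / S**2
    m, P = m_p + K * e, (1 - K) * P_p
    dm = dm_p + dK * e - K * dm_p
    dP = (1 - K) * dP_p - dK * P_p

    # Stochastic gradient ascent on the parameter (clipped to the stable region)
    a = np.clip(a + lr * dlog, -0.99, 0.99)

print(f"estimated a = {a:.2f} (true {a_true})")
```

The estimate drifts toward the true parameter while the filter runs, which is the essential point of the paper's fully recursive scheme.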

    Stall Pattern Avoidance in Polynomial Product Codes

    Product codes are a concatenated error-correction scheme that has often been considered for applications requiring very low bit-error rates, which demand that the error floor be decreased as much as possible. In this work, we consider product codes constructed from polynomial algebraic codes, and propose a novel low-complexity post-processing technique that is able to improve the error-correction performance by orders of magnitude. We provide lower bounds for the error rate achievable under post-processing, and present simulation results indicating that these bounds are tight. Comment: 4 pages, 2 figures, GlobalSiP 201
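The row-column structure that such decoding and post-processing techniques exploit can be seen in a toy product code. The sketch below uses single-parity-check component codes instead of the polynomial algebraic codes considered in the paper, and corrects a single flipped bit by intersecting the failing row parity with the failing column parity.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy product code: every row and every column of the data block gets a parity bit
k = 4
data = rng.integers(0, 2, size=(k, k))

code = np.zeros((k + 1, k + 1), dtype=int)
code[:k, :k] = data
code[:k, k] = data.sum(axis=1) % 2      # row parities
code[k, :k] = data.sum(axis=0) % 2      # column parities
code[k, k] = data.sum() % 2             # parity of parities

# Channel: flip one bit
received = code.copy()
received[1, 2] ^= 1

# Decode: a single error lies at the intersection of the failing
# row parity and the failing column parity
row_syndrome = received.sum(axis=1) % 2
col_syndrome = received.sum(axis=0) % 2
i, j = int(np.argmax(row_syndrome)), int(np.argmax(col_syndrome))
received[i, j] ^= 1

print("corrected:", np.array_equal(received, code))
```

Real product codes use much stronger component codes, and stall patterns arise when residual errors sit on intersections that every row and column decoder accepts, which is what the paper's post-processing targets.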

    How to avoid the curse of dimensionality: scalability of particle filters with and without importance weights

    Particle filters are a popular and flexible class of numerical algorithms for solving a wide range of nonlinear filtering problems. However, standard particle filters with importance weights have been shown to require a sample size that increases exponentially with the dimension D of the state space in order to achieve a given performance, which precludes their use in very high-dimensional filtering problems. Here, we focus on the dynamic aspect of this curse of dimensionality (COD) in continuous-time filtering, which is caused by the degeneracy of importance weights over time. We show that the degeneracy occurs on a time scale that decreases with increasing D. To soften the effects of weight degeneracy, most particle filters use particle resampling and improved proposal functions for the particle motion. We explain why neither of the two can prevent the COD in general. To address this fundamental problem, we investigate an existing filtering algorithm based on optimal feedback control that sidesteps the use of importance weights. We use numerical experiments to show that this Feedback Particle Filter (FPF) by Yang et al. (2013) does not exhibit a COD.
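The weight degeneracy underlying the COD can be demonstrated with a single importance-weighting step. The setup below (standard-normal prior, unit-variance Gaussian likelihood) is a generic illustration, not the paper's continuous-time analysis; the effective sample size 1/Σ w_i² collapses toward 1 as the dimension D grows.

```python
import numpy as np

rng = np.random.default_rng(6)

# One importance-weighting step: N particles from a standard-normal prior in D
# dimensions, weighted by a Gaussian likelihood centered on an observation.
N = 1000

def effective_sample_size(D: int) -> float:
    particles = rng.standard_normal((N, D))
    y = rng.standard_normal(D)                        # observation
    logw = -0.5 * np.sum((y - particles)**2, axis=1)  # unit observation noise
    w = np.exp(logw - logw.max())                     # stabilized weights
    w /= w.sum()
    return 1.0 / np.sum(w**2)                         # effective sample size

for D in (1, 5, 20, 50):
    print(f"D={D:3d}  ESS ~ {effective_sample_size(D):7.1f}")
```

Resampling resets this collapse but cannot prevent it from recurring at the next step, which is the dynamic degeneracy the paper quantifies and the weightless FPF avoids.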

    GRB 980425, SN1998bw and the EMBH model

    The EMBH model, previously developed using GRB 991216 as a prototype, is here applied to GRB 980425. We fit the luminosity observed in the 40-700 keV, 2-26 keV, and 2-10 keV bands by the BeppoSAX satellite. In addition, we present a novel scenario in which the supernova SN1998bw is the outcome of an "induced gravitational collapse" triggered by GRB 980425, in agreement with the GRB-Supernova Time Sequence (GSTS) paradigm (Ruffini et al. 2001c). A further outcome of this astrophysically exceptional sequence of events is the formation of a young neutron star generated by the SN1998bw event. A coordinated observational activity is recommended to further elucidate the underlying scenario of this most unusual astrophysical system. Comment: 10 pages, 3 figures, in the Proceedings of the 34th COSPAR scientific assembly, Elsevier. Fixed some typos in this new version.

    AIDI: An adaptive image denoising FPGA-based IP-core for real-time applications

    The presence of noise in images can significantly impact the performance of digital image processing and computer vision algorithms, so it should be removed to improve the robustness of the entire processing flow. Noise estimation is also a key factor, since, to be most effective, denoising algorithms and filters should be tuned to the actual level of noise in the image. Moreover, the complexity of these algorithms poses a new challenge in real-time image processing applications, which require high computing capacity. In this context, hardware acceleration is crucial, and Field Programmable Gate Arrays (FPGAs) are well suited to the growing demand for computational capability. This paper presents an Adaptive Image Denoising IP-core (AIDI) for real-time applications. The core first estimates the level of noise in the input image, then applies an adaptive Gaussian smoothing filter to remove the estimated noise. The filtering parameters are computed on the fly, adapting them to the level of noise in the image and, pixel by pixel, to preserve image information (e.g., edges or corners). The FPGA-based architecture is presented, highlighting its improvements with respect to a standard static filtering approach.
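The adaptive idea (estimate the noise level, then match the smoothing strength to it) can be sketched in software; the paper's contribution is the FPGA architecture, which is not reproduced here. The noise estimator below uses a high-pass residual with a robust median-based scale estimate, and the noise-to-sigma mapping is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(7)

def estimate_noise_std(img):
    """Robust noise estimate: high-pass residual + median absolute deviation."""
    res = img[1:-1, 1:-1] - 0.25 * (img[:-2, 1:-1] + img[2:, 1:-1]
                                    + img[1:-1, :-2] + img[1:-1, 2:])
    # For i.i.d. noise this residual has variance 1.25 * sigma^2
    return 1.4826 * np.median(np.abs(res - np.median(res))) / np.sqrt(1.25)

def gaussian_blur(img, sigma):
    """Separable Gaussian smoothing (two 1-D passes)."""
    radius = max(1, int(3 * sigma))
    xk = np.arange(-radius, radius + 1)
    k = np.exp(-xk**2 / (2 * sigma**2))
    k /= k.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)

# Synthetic test image: smooth gradient plus additive Gaussian noise
clean = np.outer(np.linspace(0.0, 255.0, 128), np.ones(128))
noisy = clean + rng.normal(0.0, 15.0, clean.shape)

sigma_n = estimate_noise_std(noisy)
filter_sigma = 0.1 * sigma_n + 0.5       # heuristic noise -> blur-strength mapping
denoised = gaussian_blur(noisy, filter_sigma)

interior = (slice(8, -8), slice(8, -8))  # ignore convolution border effects
rmse_before = np.sqrt(np.mean((noisy[interior] - clean[interior])**2))
rmse_after = np.sqrt(np.mean((denoised[interior] - clean[interior])**2))
print(f"estimated noise std ~ {sigma_n:.1f}; RMSE {rmse_before:.1f} -> {rmse_after:.1f}")
```

AIDI additionally adapts the filter pixel by pixel to preserve edges and corners; the sketch applies one global sigma, which is the "static" behavior the paper improves upon.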