
    On the auxiliary particle filter

    In this article we study asymptotic properties of weighted samples produced by the auxiliary particle filter (APF) proposed by Pitt and Shephard (1999). Besides establishing a central limit theorem (CLT) for smoothed particle estimates, we also derive bounds on the L^p error and bias of the same for a finite particle sample size. By examining the recursive formula for the asymptotic variance of the CLT, we identify first-stage importance weights for which the increase of asymptotic variance at a single iteration of the algorithm is minimal. In the light of these findings, we discuss and demonstrate on several examples how the APF algorithm can be improved.
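The two-stage structure of the APF can be sketched as follows. This is a minimal illustration on an assumed scalar AR(1)-plus-noise model; the model parameters and the predicted-mean proxy for the predictive likelihood are assumptions made for the example, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy state-space model (illustration only):
#   x_t = 0.9 x_{t-1} + v_t,  v_t ~ N(0, 1)
#   y_t = x_t + w_t,          w_t ~ N(0, 0.5^2)
PHI, SIG_V, SIG_W = 0.9, 1.0, 0.5

def normal_pdf(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

def apf_step(particles, weights, y):
    """One iteration of an auxiliary particle filter."""
    # First stage: adjustment weights built from the predicted means,
    # a common proxy for the predictive likelihood p(y_t | x_{t-1}).
    mu = PHI * particles
    first_stage = weights * normal_pdf(y, mu, SIG_W)
    first_stage /= first_stage.sum()
    idx = rng.choice(len(particles), size=len(particles), p=first_stage)
    # Second stage: propagate the selected particles and correct the weights.
    new = PHI * particles[idx] + SIG_V * rng.standard_normal(len(particles))
    w = normal_pdf(y, new, SIG_W) / normal_pdf(y, mu[idx], SIG_W)
    return new, w / w.sum()

# Usage: run the filter over a short observation record.
particles = rng.standard_normal(1000)
weights = np.full(1000, 1e-3)
for y in [0.3, -0.1, 0.8]:
    particles, weights = apf_step(particles, weights, y)
print(np.sum(weights * particles))  # filtered posterior mean estimate
```

The choice of first-stage weights is exactly the design freedom whose effect on the asymptotic variance the abstract refers to.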

    Long-term stability of sequential Monte Carlo methods under verifiable conditions

    This paper discusses particle filtering in general hidden Markov models (HMMs) and presents novel theoretical results on the long-term stability of bootstrap-type particle filters. More specifically, we establish that the asymptotic variance of the Monte Carlo estimates produced by the bootstrap filter is uniformly bounded in time. In contrast to most previous results of this type, which generally presuppose that the state space of the hidden state process is compact (an assumption that is rarely satisfied in practice), our very mild assumptions are satisfied for a large class of HMMs with possibly noncompact state space. In addition, we derive a similar time-uniform bound on the asymptotic L^p error. Importantly, our results hold for misspecified models; that is, we do not assume that the data entering the particle filter originate from the model governing the dynamics of the particles, nor even from an HMM. Published in the Annals of Applied Probability (http://dx.doi.org/10.1214/13-AAP962) by the Institute of Mathematical Statistics.
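For reference, a bootstrap-type particle filter of the kind analyzed here can be sketched as follows. The scalar AR(1)-plus-noise model and all parameter values are illustrative assumptions, not the paper's setting:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy HMM: x_t = 0.9 x_{t-1} + N(0, 1),  y_t = x_t + N(0, 0.5^2)

def bootstrap_filter(ys, n=1000):
    """Bootstrap particle filter: propagate from the prior dynamics,
    weight by the likelihood, and resample multinomially at every step."""
    x = rng.standard_normal(n)                   # particles from the initial law
    means = []
    for y in ys:
        x = 0.9 * x + rng.standard_normal(n)     # mutation step
        logw = -0.5 * ((y - x) / 0.5) ** 2       # log-likelihood weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))              # filter mean estimate
        x = x[rng.choice(n, size=n, p=w)]        # selection step
    return np.array(means)

means = bootstrap_filter([0.2, 0.5, -0.3])
print(means)
```

Note that the filter never tests where the observations `ys` come from, which is what makes the misspecification robustness in the abstract meaningful.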

    Who Listens to a Stock Index?

    Previous findings on the subject of Class Biased Economic Voting (CBEV) suggest that voters who are not among the wealthiest elite respond positively, in terms of the probability of voting for the incumbent party or president, to income growth among the wealthiest 5% of households, and more so than to mean income growth. The aim of this paper is to explore whether this type of biased voting is due to voters paying attention to macroeconomic variables that are correlated with the economic fortunes of the wealthiest elites. It sets out to answer two questions: 1. Does stock index performance during an election year affect CBEV? 2. Does stock index performance increase the probability of voting for the incumbent party or president? The study employs an individual-level cross-sectional probit model using two measurements of income growth alongside figures of stock index performance. Results indicate that the stock index has an impact on the probability of voting for the incumbent party/president in France and Sweden but not in the United Kingdom (U.K.). Whether or not the stock index causes CBEV is difficult to infer, mainly because the U.K.'s responses to the stock market are statistically insignificant, as are the French electorate's responses to income growth.
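The probit estimation strategy can be illustrated with a minimal maximum-likelihood sketch on synthetic data. The variable names, coefficients, and data-generating process below are hypothetical, not the paper's actual data or specification:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)

# Hypothetical specification: P(vote_i = 1) = Phi(b0 + b1*top5_growth + b2*stock_return)
n = 500
top5_growth = rng.normal(2.0, 1.0, n)        # assumed income-growth measure
stock_return = rng.normal(5.0, 10.0, n)      # assumed election-year stock return
X = np.column_stack([np.ones(n), top5_growth, stock_return])
true_beta = np.array([-0.5, 0.3, 0.02])
vote = (X @ true_beta + rng.standard_normal(n) > 0).astype(float)

def neg_loglik(beta):
    """Negative probit log-likelihood."""
    p = np.clip(norm.cdf(X @ beta), 1e-10, 1 - 1e-10)
    return -np.sum(vote * np.log(p) + (1 - vote) * np.log(1 - p))

beta_hat = minimize(neg_loglik, np.zeros(3)).x
print(beta_hat)  # coefficient estimates
```

In the study itself, significance of the stock-return coefficient is what differs between the French/Swedish and U.K. samples.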

    Sequential Monte Carlo smoothing for general state space hidden Markov models

    Computing smoothing distributions, i.e., the distributions of one or more states conditional on past, present, and future observations, is a recurring problem when operating on general hidden Markov models. The aim of this paper is to provide a foundation for particle-based approximation of such distributions and to analyze, in a common unifying framework, different schemes producing such approximations. In this setting, general convergence results, including exponential deviation inequalities and central limit theorems, are established. In particular, time-uniform bounds on the marginal smoothing error are obtained under appropriate mixing conditions on the transition kernel of the latent chain. In addition, we propose an algorithm approximating the joint smoothing distribution at a cost that grows only linearly with the number of particles. Published in the Annals of Applied Probability (http://dx.doi.org/10.1214/10-AAP735) by the Institute of Mathematical Statistics.
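The baseline that such smoothing algorithms improve upon is the naive approach of simply storing ancestral paths, which degenerates in the path space. This sketch of that baseline (on an assumed AR(1)-plus-noise model) is for illustration only; it is not the paper's linear-cost algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)

def smooth_paths(ys, n=500):
    """Naive joint smoothing by storing full ancestral paths.
    Resampling repeatedly prunes ancestries, so early time points end up
    represented by very few distinct values (path degeneracy)."""
    x = rng.standard_normal(n)
    paths = x[:, None]                           # each row: one trajectory
    for y in ys:
        x = 0.9 * paths[:, -1] + rng.standard_normal(n)
        w = np.exp(-0.5 * ((y - x) / 0.5) ** 2)
        w /= w.sum()
        idx = rng.choice(n, size=n, p=w)         # resampling prunes ancestries
        paths = np.column_stack([paths[idx], x[idx]])
    return paths                                  # equally weighted smoothed draws

paths = smooth_paths([0.1, 0.4, -0.2, 0.6])
print(paths.mean(axis=0))                        # marginal smoothing means
```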

    Sequential Monte Carlo smoothing with application to parameter estimation in non-linear state space models

    This paper concerns the use of sequential Monte Carlo (SMC) methods for smoothing in general state space models. A well-known problem when applying the standard SMC technique in the smoothing mode is that the resampling mechanism introduces degeneracy of the approximation in the path space. However, when performing maximum likelihood estimation via the EM algorithm, all functionals involved are of additive form for a large subclass of models. To cope with the problem in this case, a modification of the standard method (based on a technique proposed by Kitagawa and Sato) is suggested. Our algorithm relies on forgetting properties of the filtering dynamics, and the quality of the estimates produced is investigated, both theoretically and via simulations. Published in Bernoulli (http://dx.doi.org/10.3150/07-BEJ6150) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/bernoulli/).
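A fixed-lag scheme in the spirit of the Kitagawa-Sato technique, exploiting forgetting to estimate an additive functional (here, the sum of smoothed state means), might be sketched as follows. The toy model and the lag choice are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)

def fixed_lag_additive(ys, lag=2, n=500):
    """Estimate sum_t E[x_t | y_{0:T}] with a fixed-lag approximation:
    a state is 'frozen' once it is lag steps behind the filter front,
    relying on forgetting of the filtering dynamics."""
    x = rng.standard_normal(n)
    hist = [x]                                   # recent particle history
    total = 0.0
    for y in ys:
        x = 0.9 * hist[-1] + rng.standard_normal(n)
        w = np.exp(-0.5 * ((y - x) / 0.5) ** 2)
        w /= w.sum()
        idx = rng.choice(n, size=n, p=w)
        hist = [h[idx] for h in hist] + [x[idx]]
        if len(hist) > lag + 1:
            # The state lag steps back is frozen: add its particle mean
            # to the additive-functional estimate and drop it.
            total += hist.pop(0).mean()
    total += sum(h.mean() for h in hist)         # flush remaining states
    return total

print(fixed_lag_additive([0.2, -0.1, 0.5, 0.3, -0.4]))
```

Keeping only the last `lag + 1` particle generations bounds the path degeneracy that plagues the naive full-path smoother.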

    Convergence properties of weighted particle islands with application to the double bootstrap algorithm

    Particle island models (Vergé et al., 2013) provide a means of parallelizing sequential Monte Carlo methods, and in this paper we present novel convergence results for algorithms of this sort. In particular, we establish a central limit theorem - as the number of islands and the common size of the islands tend jointly to infinity - for the double bootstrap algorithm with possibly adaptive selection on the island level. For this purpose we introduce a notion of archipelagos of weighted islands and find conditions under which a set of convergence properties is preserved by different operations on such archipelagos. This theory allows arbitrary compositions of these operations to be straightforwardly analyzed, providing a very flexible framework covering the double bootstrap algorithm as a special case. Finally, we establish the long-term numerical stability of the double bootstrap algorithm by bounding its asymptotic variance under weak and easily checked assumptions satisfied for a wide range of models with possibly non-compact state space.
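One time step of a double-bootstrap-style scheme, with selection both within islands and on the island level, might be sketched as follows. This is a simplified reading on an assumed toy model, not the exact algorithm analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed toy HMM: x_t = 0.9 x_{t-1} + N(0, 1),  y_t = x_t + N(0, 0.5^2)

def island_step(islands, y):
    """One step for an archipelago of n1 islands holding n2 particles each:
    mutation, inner (within-island) resampling, outer (island-level) selection."""
    n1, n2 = islands.shape
    islands = 0.9 * islands + rng.standard_normal((n1, n2))   # mutation
    w = np.exp(-0.5 * ((y - islands) / 0.5) ** 2)             # local weights
    island_w = w.mean(axis=1)           # island weight: average local likelihood
    # Inner bootstrap: multinomial resampling within each island.
    resampled = np.empty_like(islands)
    for i in range(n1):
        p = w[i] / w[i].sum()
        resampled[i] = islands[i, rng.choice(n2, size=n2, p=p)]
    # Outer bootstrap: resample whole islands according to island weights.
    island_w = island_w / island_w.sum()
    return resampled[rng.choice(n1, size=n1, p=island_w)]

islands = rng.standard_normal((20, 100))
islands = island_step(islands, y=0.3)
print(islands.shape)  # (20, 100)
```

The adaptive variant covered by the paper's CLT would trigger the outer selection only when the island weights become sufficiently uneven.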

    Consistency of the maximum likelihood estimator for general hidden Markov models

    Consider a parametrized family of general hidden Markov models, where both the observed and unobserved components take values in a complete separable metric space. We prove that the maximum likelihood estimator (MLE) of the parameter is strongly consistent under a rather minimal set of assumptions. As special cases of our main result, we obtain consistency in a large class of nonlinear state space models, as well as general results on linear Gaussian state space models and finite state models. A novel aspect of our approach is an information-theoretic technique for proving identifiability, which does not require an explicit representation for the relative entropy rate. Our method of proof could therefore form a foundation for the investigation of MLE consistency in more general dependent and non-Markovian time series. Also of independent interest is a general concentration inequality for V-uniformly ergodic Markov chains. Published in the Annals of Statistics (http://dx.doi.org/10.1214/10-AOS834) by the Institute of Mathematical Statistics.
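For the linear Gaussian special case mentioned above, the MLE can be computed by maximizing the exact Kalman-filter log-likelihood. The following sketch assumes a scalar AR(1)-plus-noise model with known noise variances and a single unknown coefficient; all numbers are illustrative:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(7)

def kalman_loglik(phi, ys, sig_v=1.0, sig_w=0.5):
    """Exact log-likelihood of x_t = phi x_{t-1} + N(0, sig_v^2),
    y_t = x_t + N(0, sig_w^2), via the Kalman prediction-error decomposition."""
    m, P, ll = 0.0, 1.0, 0.0
    for y in ys:
        m_pred = phi * m
        P_pred = phi * phi * P + sig_v ** 2
        S = P_pred + sig_w ** 2                  # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * S) + (y - m_pred) ** 2 / S)
        K = P_pred / S                           # Kalman gain
        m = m_pred + K * (y - m_pred)
        P = (1 - K) * P_pred
    return ll

# Simulate data with phi = 0.9 and recover it by maximizing the likelihood.
x, ys = 0.0, []
for _ in range(400):
    x = 0.9 * x + rng.standard_normal()
    ys.append(x + 0.5 * rng.standard_normal())

res = minimize_scalar(lambda p: -kalman_loglik(p, ys),
                      bounds=(-0.99, 0.99), method="bounded")
print(res.x)  # MLE of phi, should be close to 0.9
```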

    Change, reorganization and quality of home care for elderly people in Sweden during the 1990s: paper presented at the 17th Nordic Conference on Gerontology, 23-26 May 2004, Stockholm

    During the 1990s, several kinds of reorganization took place in public services in Sweden. Reorganizations according to the idea of the market economy were the most salient and debated. In many municipalities, private companies started to organize tax-financed home care services for the elderly. Public home care organizations reorganized their working organization and management structure, partly to save money and partly to achieve better quality of care. The home care services have a key role in the care of the elderly in society. The quality of care is important for the possibility for elderly people to stay as long as possible in their ordinary homes. We have performed a longitudinal study of the reorganizations during the 1990s in order to explore the consequences for quality of care. The study was performed in seven districts in three municipalities, representing different types of municipalities and different kinds of reorganizations. It comprises private companies, traditional public organizations, and public organizations with a changed management structure and coordination of home help and home health care. Quality of care is studied through assessment of the communication in the organization, the psychosocial working environment of the caregivers, and the quality of the care work. Politicians, managers, ca. 100 caregivers and ca. 500 elderly people receiving help and care were interviewed four times during the period (1993, 1995, 1997 and 2002/2003). The traditional organization within small districts, with small autonomous working teams and an easily accessible supervisor, exhibits the best quality. There are no unambiguous differences between public and private organizations. There is, however, a tendency toward a successive decline in quality for private companies after they have been established. Other aspects of the reorganizations, e.g. having special officials for assessment of help needs, do not contribute to better quality but create new communication gaps in the organization and have a negative influence on the working environment. Stability in the composition of the care worker teams, leadership, decision-making processes in the organization and the district area promote high quality of care. The reorganizations during the 1990s seem on the whole to have worked contrary to the promotion of high quality in the care and service for the elderly.

    Stochastic simulation of rain intrusion through small defects due to water rivulet overpressure. Introducing a driving rain leakage potential

    There is a need to upgrade the old building stock with respect to the thermal insulation of the building envelope, and specifically the façades. There are several systems on the market, and some are quite new and innovative. To bring down costs, many of the systems are based on prefabricated moisture-tight insulated units. This means that, in case there is a moisture-tight barrier on the interior side, two moisture-tight barriers surround the wall structure. The leakage of driving rain into the structure then represents a major threat to the durability of these systems. This paper investigates the pressure build-up in water rivulets running down a façade acting together with the wind pressure. A driving rain leakage potential is introduced. Using real weather data years and Monte Carlo simulations, the mean and standard deviation of the annual leakage through a small hole are estimated. The examples show that the leakage can reach a level of 0-0.5 litre/year for a hole with a diameter of 1-2 mm, and 0.5-3 litre/year for a diameter of 3-4 mm.
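The Monte Carlo structure of such an estimate can be sketched as follows. The event frequency, pressure distribution, capture fraction, and orifice-flow model below are all illustrative assumptions, not the paper's leakage model or parameter values:

```python
import numpy as np

rng = np.random.default_rng(5)

def annual_leakage(n_years=2000, hole_diam_mm=2.0, capture=0.01):
    """Monte Carlo estimate of annual leakage (litres) through a small
    circular defect, under assumed distributions for driving-rain events."""
    area = np.pi * (hole_diam_mm / 2000.0) ** 2       # hole area, m^2
    wet_hours = rng.poisson(300, n_years)             # wet, windy hours per year
    hit_hours = rng.binomial(wet_hours, capture)      # hours a rivulet hits the hole
    leak = np.zeros(n_years)
    for i, h in enumerate(hit_hours):
        dp = rng.gamma(2.0, 30.0, h)                  # pressure head per hour, Pa
        # Assumed orifice flow Q = C_d * A * sqrt(2 dp / rho), in litres/hour.
        q = 0.6 * area * np.sqrt(2.0 * dp / 1000.0) * 3600.0 * 1000.0
        leak[i] = q.sum()
    return leak

leak = annual_leakage()
print(leak.mean(), leak.std())   # MC estimates of mean and std of annual leakage
```

In the paper, the pressure input comes from measured weather-data years rather than parametric draws.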