
    Distributed Real-Time Emulation of Formally-Defined Patterns for Safe Medical Device Control

    Safety of medical devices and of their interoperation is an unresolved issue that causes severe, and sometimes deadly, accidents for patients with alarming frequency. Formal methods, particularly in support of highly reusable and provably safe patterns that can be instantiated to many device instances, can help in this regard. However, this still leaves open the issue of how to pass from their formal specifications in logical time to executable emulations that can interoperate in physical time with other devices and with simulations of patient and/or doctor behaviors. This work presents a specification-based methodology in which virtual emulation environments can easily be developed from formal specifications in Real-Time Maude and can support interactions with other real devices and with simulation models. The general methodology is explained in detail and illustrated with two concrete scenarios, both instances of a common safe formal pattern: one involves the interaction of a provably safe pacemaker with a simulated heart; the other, the interaction of a safe controller for patient-induced analgesia with a real syringe pump.

    Comment: In Proceedings RTRTS 2010, arXiv:1009.398
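    As a rough illustration of the logical-to-physical time bridge the abstract describes (our own minimal Python sketch, not the paper's Real-Time Maude machinery; every name here is hypothetical), an emulator can advance the formal model one logical tick at a time and sleep until each tick's wall-clock deadline:

        import time

        TICK_SECONDS = 0.01  # assumed mapping: one logical tick = 10 ms

        def run_emulation(model_step, n_ticks):
            """Advance the formal model tick by tick, pinned to wall-clock time."""
            start = time.monotonic()
            for tick in range(n_ticks):
                model_step(tick)  # one logical-time transition of the model
                deadline = start + (tick + 1) * TICK_SECONDS
                delay = deadline - time.monotonic()
                if delay > 0:
                    time.sleep(delay)  # wait out the rest of this tick in physical time

        if __name__ == "__main__":
            run_emulation(lambda t: None, n_ticks=100)  # placeholder model step

    Under such a mapping the emulated device produces and consumes events at physical-time instants consistent with the model's logical clock, which is what lets it interoperate with real devices and with patient/doctor simulations.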

    Nonlinear Log-Periodogram Regression for Perturbed Fractional Processes

    This paper studies fractional processes that may be perturbed by weakly dependent time series. The model for a perturbed fractional process has a components framework in which there may be components of both long and short memory. All commonly used estimates of the long memory parameter (such as log periodogram (LP) regression) may be used in a components model where the data are affected by weakly dependent perturbations, but these estimates can suffer from serious downward bias. To circumvent this problem, the present paper proposes a new procedure that allows for the possible presence of additive perturbations in the data. The new estimator resembles the LP regression estimator but involves an additional (nonlinear) term in the regression that takes account of possible perturbation effects in the data. Under some smoothness assumptions at the origin, the bias of the new estimator is shown to disappear at a faster rate than that of the LP estimator, while its asymptotic variance is inflated only by a multiplicative constant. In consequence, the optimal rate of convergence to zero of the asymptotic MSE of the new estimator is faster than that of the LP estimator. Some simulation results demonstrate the viability and the bias-reducing feature of the new estimator relative to the LP estimator in finite samples. A test for the presence of perturbations in the data is given.

    Keywords: Asymptotic bias; Asymptotic normality; Bias reduction; Fractional components model; Perturbed fractional process; Rate of convergence; Testing perturbations
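    A heuristic sketch of where the extra nonlinear term comes from (our notation, not taken from the paper): write the perturbed process as z_t = x_t + u_t, where x_t has long memory with spectrum f_x(λ) ~ G λ^{-2d} near the origin and u_t is weakly dependent with spectrum f_u smooth at zero. If the components are uncorrelated,

        \[
          f_z(\lambda) = f_x(\lambda) + f_u(\lambda)
          \approx G\,\lambda^{-2d}\Bigl(1 + \tfrac{f_u(0)}{G}\,\lambda^{2d}\Bigr),
        \]

    so at Fourier frequencies λ_j = 2πj/T, using log(1 + x) ≈ x for small x,

        \[
          \log f_z(\lambda_j) \approx \log G - 2d\,\log \lambda_j + \beta\,\lambda_j^{2d},
          \qquad \beta \approx \tfrac{f_u(0)}{G}.
        \]

    Ordinary LP regression omits the β λ_j^{2d} term, which is what produces the downward bias in the long-memory estimate; fitting that term explicitly by nonlinear least squares over j = 1, …, m is the modification the abstract describes.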

    Characterization of Risk: A Sharp Law of Large Numbers

    An extensive literature in economics uses a continuum of random variables to model individual random shocks imposed on a large population. Let H denote the Hilbert space of square-integrable random variables. A key concern is to characterize the family of all H-valued functions that satisfy the law of large numbers when a large sample of agents is drawn at random. We use the iterative extension of an infinite product measure introduced in [6] to formulate a “sharp” law of large numbers. We prove that an H-valued function satisfies this law if and only if it is both Pettis-integrable and norm integrably bounded.
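    One way to state the sampling property at issue (a heuristic paraphrase in our own notation; the paper's precise formulation relies on the iterative extension of [6]): let (I, 𝓘, µ) be the agent space and f : I → H. Drawing agents i_1, i_2, … i.i.d. with distribution µ, the sharp law asks that

        \[
          \frac{1}{n}\sum_{k=1}^{n} f(i_k)
          \;\longrightarrow\;
          \int_I f \, d\mu \quad (\text{Pettis integral})
        \]

    hold almost surely in H-norm. The characterization then says this is equivalent to f being Pettis-integrable and norm integrably bounded, i.e. the norm map i ↦ ‖f(i)‖_H being dominated by a µ-integrable function.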

    Monte Carlo Simulation of Macroeconomic Risk with a Continuum of Agents: The General Case

    In large random economies with heterogeneous agents, a standard stochastic framework presumes a random macro state, combined with idiosyncratic micro shocks. This can be formally represented by a random process consisting of a continuum of random variables that are conditionally independent given the macro state. However, this process satisfies a standard joint measurability condition only if there is essentially no idiosyncratic risk at all. Based on iteratively complete product measure spaces, we characterize the validity of the standard stochastic framework via Monte Carlo simulation as well as event-wise measurable conditional probabilities. These general characterizations also allow us to strengthen some earlier results related to exchangeability and independence.

    Keywords: large economy; event-wise measurable conditional probabilities; exchangeability; conditional independence; Monte Carlo convergence; Monte Carlo σ-algebra; stochastic macro structure
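    Heuristically, and in our own notation (not the paper's formal apparatus): model the macro state as a random element generating a σ-algebra 𝒞 ⊆ 𝓕 on (Ω, 𝓕, P), and the micro shocks as a process g : I × Ω → ℝ with the g_i conditionally independent given 𝒞. Validity via Monte Carlo simulation then amounts to a conditional law of large numbers: sampling agents i_1, …, i_n i.i.d. from (I, 𝓘, µ),

        \[
          \frac{1}{n}\sum_{k=1}^{n} g(i_k,\omega)
          \;\longrightarrow\;
          \int_I \mathbb{E}\bigl[\, g(i,\cdot) \mid \mathcal{C} \,\bigr](\omega)\; d\mu(i)
        \]

    for P-almost every ω, so that simulated cross-sectional averages recover the macro-state-conditional mean rather than an unconditional one.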

    The Curious Dawn of American Public Schools

    Three factors help to explain why school enrollments in the Northern United States were higher than those in the South and in most of Europe by 1850. One was affordability: the northern states had higher real incomes, cheaper teachers, and greater local tax support. The second was the greater autonomy of local governments. The third was the greater diffusion of voting power among the citizenry in much of the North, especially in rural communities. The distribution of local political voice appears to be a robust predictor of tax support and enrollments, both within and between regions.

    Optimal Bandwidth Choice for Interval Estimation in GMM Regression

    In time series regression with nonparametrically autocorrelated errors, it is now standard empirical practice to construct confidence intervals for regression coefficients on the basis of nonparametrically studentized t-statistics. The standard error used in the studentization is typically estimated by a kernel method that involves some smoothing process over the sample autocovariances. The underlying parameter (M) that controls this tuning process is a bandwidth or truncation lag and it plays a key role in the finite sample properties of tests and the actual coverage properties of the associated confidence intervals. The present paper develops a bandwidth choice rule for M that optimizes the coverage accuracy of interval estimators in the context of linear GMM regression. The optimal bandwidth balances the asymptotic variance with the asymptotic bias of the robust standard error estimator. This approach contrasts with the conventional bandwidth choice rule for nonparametric estimation where the focus is the nonparametric quantity itself and the choice rule balances asymptotic variance with squared asymptotic bias. It turns out that the optimal bandwidth for interval estimation has a different expansion rate and is typically substantially larger than the optimal bandwidth for point estimation of the standard errors. The new approach to bandwidth choice calls for refined asymptotic measurement of the coverage probabilities, which are provided by means of an Edgeworth expansion of the finite sample distribution of the nonparametrically studentized t-statistic. This asymptotic expansion extends earlier work and is of independent interest. A simple plug-in procedure for implementing this optimal bandwidth is suggested and simulations confirm that the new plug-in procedure works well in finite samples. Issues of interval length and false coverage probability are also considered, leading to a secondary approach to bandwidth selection with similar properties.

    Keywords: Asymptotic expansion, Bias, Confidence interval, Coverage probability, Edgeworth expansion, Lag kernel, Long run variance, Optimal bandwidth, Spectrum
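    The rate contrast the abstract mentions can be seen with a back-of-the-envelope calculation using standard kernel long-run-variance asymptotics (a heuristic sketch, not the paper's Edgeworth analysis): for a kernel with Parzen characteristic exponent q (q = 1 for Bartlett, q = 2 for Parzen and quadratic spectral), the standard error estimator has bias of order M^{-q} and variance of order M/T. Point estimation balances variance against squared bias,

        \[
          \frac{M}{T} \asymp M^{-2q}
          \;\Longrightarrow\;
          M_{\mathrm{MSE}} \asymp T^{1/(1+2q)},
        \]

    whereas coverage error is driven by the bias itself, which enters at first order, so the balance is against the bias directly,

        \[
          \frac{M}{T} \asymp M^{-q}
          \;\Longrightarrow\;
          M_{\mathrm{cov}} \asymp T^{1/(1+q)}.
        \]

    For the Bartlett kernel this is T^{1/2} rather than T^{1/3}: the interval-estimation bandwidth expands faster and is typically substantially larger, as the abstract states.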