
    An exact minimum variance filter for a class of discrete time systems with random parameter perturbations

    An exact, closed-form minimum variance filter is designed for a class of discrete-time uncertain systems that allows for both multiplicative and additive noise sources. The multiplicative noise model includes a popular class of models (Cox-Ingersoll-Ross type models) in econometrics. The parameters of the system under consideration which describe the state transition are assumed to be subject to stochastic uncertainties. The problem addressed is the design of a filter that minimizes the trace of the estimation error variance. Sensitivity of the new filter to the size of parameter uncertainty, in terms of the variance of parameter perturbations, is also considered. We refer to the new filter as the 'perturbed Kalman filter' (PKF) since it reduces to the traditional (or unperturbed) Kalman filter as the size of stochastic perturbation approaches zero. We also consider a related approximate filtering heuristic for univariate time series, and we refer to the filter based on this heuristic as the approximate perturbed Kalman filter (APKF). We test the performance of our new filters on three simulated numerical examples and compare the results with those of the unperturbed Kalman filter, which ignores the uncertainty in the transition equation. Through these numerical examples, the PKF and APKF are shown to outperform the traditional (or unperturbed) Kalman filter in terms of estimation error when stochastic uncertainties are present, even when the size of the stochastic uncertainty is inaccurately identified.
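
    A minimal univariate sketch of how such multiplicative parameter noise can enter a filter's prediction step is given below; it illustrates the model class only, is not the authors' exact PKF recursion, and all names and values are ours.

```python
def perturbed_kf_step(x_hat, P, y, a, q_eps, Q, R):
    """One predict/update step for the scalar model
        x_{k+1} = (a + eps_k) * x_k + w_k,   y_k = x_k + v_k,
    with eps_k ~ (0, q_eps), w_k ~ (0, Q) and v_k ~ (0, R).
    Setting q_eps = 0 recovers the standard (unperturbed) Kalman filter."""
    # Predict: the random transition parameter inflates the predicted error
    # variance by q_eps * E[x_k^2] = q_eps * (P + x_hat^2).
    x_pred = a * x_hat
    P_pred = a**2 * P + q_eps * (P + x_hat**2) + Q
    # Update with the scalar observation y.
    K = P_pred / (P_pred + R)
    return x_pred + K * (y - x_pred), (1.0 - K) * P_pred

# Example step with a parameter-perturbation variance of 0.01.
print(perturbed_kf_step(x_hat=1.0, P=0.5, y=1.2, a=0.9, q_eps=0.01, Q=0.1, R=0.2))
```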

    Topology of 2D and 3D Rational Curves

    In this paper we present algorithms for computing the topology of planar and space rational curves defined by a parametrization. The algorithms given here work directly with the parametrization of the curve and do not require computing or using the implicit equation of the curve (in the case of planar curves) or of any projection (in the case of space curves). Moreover, these algorithms have been implemented in Maple; the examples considered and the timings obtained show good performance. Comment: 26 pages, 19 figures.
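
    As a loose illustration of working directly with a parametrization (this is not the paper's Maple algorithm), the following Python/SymPy sketch collects the real parameter values at which a planar rational curve can change x-direction or run to infinity, a typical first step when building a topology graph.

```python
import sympy as sp

t = sp.symbols('t', real=True)

def critical_parameters(x_expr, y_expr):
    """Real parameter values where the curve may change x-monotonicity or run
    to infinity: real roots of the numerator of dx/dt and of the denominators.
    A crude first step toward a topology graph, not the paper's algorithm."""
    num_dx, _ = sp.fraction(sp.together(sp.diff(x_expr, t)))
    candidates = sp.solve(num_dx, t)
    for expr in (x_expr, y_expr):
        _, den = sp.fraction(sp.together(expr))
        candidates += sp.solve(den, t)
    reals = [r for r in candidates if r.is_real]
    return sorted(set(reals), key=lambda v: float(v))

# Example: the unit circle parametrized as ((1 - t^2)/(1 + t^2), 2t/(1 + t^2)).
x = (1 - t**2) / (1 + t**2)
y = 2 * t / (1 + t**2)
print(critical_parameters(x, y))   # [0]: the curve changes x-direction at t = 0
```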

    Effectively Open Real Functions

    A function f is continuous iff the PRE-image f^{-1}[V] of any open set V is open again. Dual to this topological property, f is called OPEN iff the IMAGE f[U] of any open set U is open again. Several classical Open Mapping Theorems in Analysis provide a variety of sufficient conditions for openness. By the Main Theorem of Recursive Analysis, computable real functions are necessarily continuous. In fact they admit a well-known characterization in terms of the mapping V ↦ f^{-1}[V] being EFFECTIVE: given a list of open rational balls exhausting V, a Turing Machine can generate a corresponding list for f^{-1}[V]. Analogously, EFFECTIVE OPENNESS requires the mapping U ↦ f[U] on open real subsets to be effective. By effectivizing classical Open Mapping Theorems, as well as by applying Tarski's Quantifier Elimination, the present work reveals several rich classes of functions to be effectively open. Comment: added section on semi-algebraic functions; to appear in Proc. http://cca-net.de/cca200
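
    A hedged restatement of the two effectivity notions from the abstract, in our own notation, may help fix ideas:

```latex
% f computable: from a rational-ball exhaustion of V one can compute one of f^{-1}[V].
\[
  V=\bigcup_{n} B(q_n,r_n)\;\longmapsto\; f^{-1}[V]=\bigcup_{m} B(p_m,s_m)
  \quad\text{(computably)},
\]
% f effectively open: the analogous requirement for forward images.
\[
  U=\bigcup_{n} B(q_n,r_n)\;\longmapsto\; f[U]=\bigcup_{m} B(p_m,s_m)
  \quad\text{(computably)}.
\]
```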

    Quadratic BSDEs driven by a continuous martingale and application to utility maximization problem

    In this paper, we study a class of quadratic Backward Stochastic Differential Equations (BSDEs) which arises naturally when studying the problem of utility maximization with portfolio constraints. We first establish existence and uniqueness results for such BSDEs and then give an application to the utility maximization problem. Three cases of utility functions are discussed: the exponential, power and logarithmic ones.
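
    For orientation, one common way to write a quadratic BSDE driven by a continuous martingale M is sketched below; the notation is ours and the paper's precise formulation may differ.

```latex
\[
  Y_t \;=\; \xi \;+\; \int_t^T f(s, Y_s, Z_s)\, d\langle M\rangle_s
        \;-\; \int_t^T Z_s\, dM_s \;-\; \bigl(N_T - N_t\bigr),
  \qquad 0 \le t \le T,
\]
% with N a martingale orthogonal to M and f of at most quadratic growth in z,
% say |f(s, y, z)| <= alpha + beta*|z|^2, which is the "quadratic" feature.
```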

    A Statistical View of Learning in the Centipede Game

    In this article we evaluate the statistical evidence that a population of students learns about the sub-game perfect Nash equilibrium of the centipede game via repeated play of the game. This is done by formulating a model in which a player's error in assessing the utility of decisions changes as they gain experience with the game. We first estimate parameters in a statistical model where the probabilities of choices of the players are given by a Quantal Response Equilibrium (QRE) (McKelvey and Palfrey, 1995, 1996, 1998), but are allowed to change with repeated play. This model gives a better fit to the data than similar models previously considered. However, substantial correlation of outcomes of games having a common player suggests that a statistical model that captures within-subject correlation is more appropriate. Thus we then estimate parameters in a model which allows for within-player correlation of decisions and rates of learning. Throughout the paper we also consider and compare the use of randomization tests and posterior predictive tests in the context of exploratory and confirmatory data analyses.
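
    A minimal sketch of logit quantal response probabilities with an experience-dependent precision parameter is given below; the learning rule is hypothetical and chosen only to illustrate the idea, not the paper's estimated model.

```python
import numpy as np

def logit_choice_probs(utilities, lam):
    """Logit quantal response: choice probabilities proportional to exp(lam * u).
    lam -> infinity approaches best response; lam = 0 is uniform random play."""
    u = np.asarray(utilities, dtype=float)
    w = np.exp(lam * (u - u.max()))          # subtract max for numerical stability
    return w / w.sum()

def precision_after_t_plays(lam0, growth, t):
    """Hypothetical learning rule: assessment error shrinks (precision grows)
    as a player gains experience with the game."""
    return lam0 * (1.0 + growth * t)

for t in (0, 5, 10):
    print(t, logit_choice_probs([4.0, 2.0], precision_after_t_plays(0.5, 0.3, t)))
```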

    Gauss map and Lyapunov exponents of interacting particles in a billiard

    We show that the Lyapunov exponent (LE) of periodic orbits with Lebesgue measure zero from the Gauss map can be used to determine the main qualitative behavior of the LE of a Hamiltonian system. The Hamiltonian system is a one-dimensional box with two particles interacting via a Yukawa potential and does not possess Kolmogorov-Arnold-Moser (KAM) curves. In our case the Gauss map is applied to the mass ratio γ = m_2/m_1 between particles. Besides the main qualitative behavior, some unexpected peaks in the γ dependence of the mean LE and the appearance of 'stickiness' in phase space can also be understood via the LE from the Gauss map. This is a nice example of the relation between the "instability" of the continued fraction representation of a number and the stability of non-periodic curves (no KAM curves) in the physical model. Our results also confirm the intuition that pseudo-integrable systems with more complicated invariant surfaces of the flow (higher genus) should be more unstable under perturbation. Comment: 13 pages, 2 figures.
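
    The Gauss-map quantities involved are easy to reproduce numerically; the sketch below computes the finite-time Lyapunov exponent of the Gauss map along an orbit (our own illustration, not the paper's billiard computation).

```python
import numpy as np

def gauss_map(x):
    """Gauss map G(x) = 1/x - floor(1/x) on (0, 1)."""
    return 1.0 / x - np.floor(1.0 / x)

def lyapunov_exponent(x0, n):
    """Finite-time Lyapunov exponent (1/n) * sum_k ln|G'(x_k)|, with G'(x) = -1/x^2."""
    x, total = x0, 0.0
    for _ in range(n):
        total += -2.0 * np.log(x)            # ln|G'(x)| = -2 ln x for x in (0, 1)
        x = gauss_map(x)
    return total / n

# Period-1 orbit [1; 1, 1, ...]: the fixed point x* = (sqrt(5) - 1)/2, whose exact
# exponent is 2 ln((1 + sqrt(5))/2); few iterations, since the orbit is unstable.
x_star = (np.sqrt(5.0) - 1.0) / 2.0
print(lyapunov_exponent(x_star, 25), 2.0 * np.log(1.0 / x_star))

# A typical (non-periodic) orbit approaches the Gauss-measure average pi^2 / (6 ln 2).
x0 = np.random.default_rng(0).uniform(0.0, 1.0)
print(lyapunov_exponent(x0, 200_000), np.pi**2 / (6.0 * np.log(2.0)))
```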

    Social preferences and agricultural innovation: An experimental case study from Ethiopia

    We run an experiment in Ethiopia where farmers can use their own money to decrease the money of others (money burning). The data support the prediction from an inequality aversion model based on absolute income differences, but there is no support for an inequality aversion model based on comparison with the mean payoff of others. Experimentally measured money burning at the village level is negatively correlated with real-life agricultural innovations. This result is robust even when data from an independent survey, separate from the current research, are used. This underscores the importance of social preferences in agricultural innovation in developing countries.
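
    The two competing specifications can be summarized with stylized utility functions in the spirit of Fehr-Schmidt (pairwise absolute payoff differences) and ERC-type mean comparison; the functional forms and parameters below are illustrative, not the paper's estimated models.

```python
def u_pairwise(own, others, alpha, beta):
    """Inequality aversion from absolute payoff differences (Fehr-Schmidt style)."""
    n = len(others)
    envy = sum(max(x - own, 0.0) for x in others) / n
    guilt = sum(max(own - x, 0.0) for x in others) / n
    return own - alpha * envy - beta * guilt

def u_mean_comparison(own, others, theta):
    """Inequality aversion from deviating from the others' mean payoff (ERC style)."""
    mean_others = sum(others) / len(others)
    return own - theta * abs(own - mean_others)

print(u_pairwise(10.0, [6.0, 14.0], alpha=0.8, beta=0.4))
print(u_mean_comparison(10.0, [6.0, 14.0], theta=0.5))
```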

    The value of source data verification in a cancer clinical trial

    Background Source data verification (SDV) is a resource-intensive method of quality assurance frequently used in clinical trials. There is no empirical evidence to suggest that SDV would affect the comparative treatment effect results from a clinical trial. Methods Data discrepancies and comparative treatment effects obtained following 100% SDV were compared to those based on data without SDV. Overall survival (OS) and progression-free survival (PFS) were compared using Kaplan-Meier curves, log-rank tests and Cox models. Tumour response classifications, comparative treatment Odds Ratios (ORs) for the outcome objective response rate, and numbers of Serious Adverse Events (SAEs) were compared. OS estimates based on SDV data were compared against estimates obtained from centrally monitored data. Findings Data discrepancies were identified between different monitoring procedures for the majority of variables examined, with some variation in discrepancy rates. There were no systematic patterns to the discrepancies and their impact was negligible on OS, the primary outcome of the trial (HR (95% CI): 1.18 (0.99 to 1.41), p = 0.064 with 100% SDV; 1.18 (0.99 to 1.42), p = 0.068 without SDV; 1.18 (0.99 to 1.40), p = 0.073 with central monitoring). Results were similar for PFS. More extreme discrepancies were found for the subjective outcome overall objective response (OR (95% CI): 1.67 (1.04 to 2.68), p = 0.03 with 100% SDV; 2.45 (1.49 to 4.04), p = 0.0003 without any SDV), which was mostly due to differing CT scans. Interpretation Quality assurance methods used in clinical trials should be informed by empirical evidence. In this empirical comparison, SDV was expensive and identified random errors that made little impact on the results and clinical conclusions of the trial. Central monitoring using an external data source was a more efficient approach for the primary outcome of OS. For the subjective outcome of objective response, an independent blinded review committee and a tracking system to monitor missing scan data could be more efficient than SDV.
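
    The central comparison (the same treatment-effect model fitted to the monitored and unmonitored copies of the data) can be sketched as below, using Python with the lifelines package; the column and file names are hypothetical and this is not the trial's actual analysis code.

```python
import pandas as pd
from lifelines import CoxPHFitter

def treatment_hazard_ratio(df):
    """Cox model for overall survival with a single treatment-arm covariate."""
    cph = CoxPHFitter()
    cph.fit(df[["os_time", "os_event", "arm"]],
            duration_col="os_time", event_col="os_event")
    return cph.hazard_ratios_["arm"]

with_sdv = pd.read_csv("trial_100pct_sdv.csv")       # hypothetical data files
without_sdv = pd.read_csv("trial_no_sdv.csv")

print("HR with 100% SDV:", treatment_hazard_ratio(with_sdv))
print("HR without SDV:  ", treatment_hazard_ratio(without_sdv))
```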

    Interleukin-1 polymorphisms associated with increased risk of gastric cancer

    Helicobacter pylori infection is associated with a variety of clinical outcomes including gastric cancer and duodenal ulcer disease. The reasons for this variation are not clear, but the gastric physiological response is influenced by the severity and anatomical distribution of gastritis induced by H. pylori. Thus, individuals with gastritis predominantly localized to the antrum retain normal (or even high) acid secretion, whereas individuals with extensive corpus gastritis develop hypochlorhydria and gastric atrophy, which are presumptive precursors of gastric cancer. Here we report that interleukin-1 gene cluster polymorphisms suspected of enhancing production of interleukin-1-beta are associated with an increased risk of both hypochlorhydria induced by H. pylori and gastric cancer. Two of these polymorphisms are in near-complete linkage disequilibrium and one is a TATA-box polymorphism that markedly affects DNA-protein interactions in vitro. The association with disease may be explained by the biological properties of interleukin-1-beta, which is an important pro-inflammatory cytokine and a powerful inhibitor of gastric acid secretion. Host genetic factors that affect interleukin-1-beta may determine why some individuals infected with H. pylori develop gastric cancer while others do not.

    A reference relative time-scale as an alternative to chronological age for cohorts with long follow-up

    Background: Epidemiologists have debated the appropriate time-scale for cohort survival studies, chronological age and time-on-study being two such time-scales. Importantly, assessment of risk factors may depend on the choice of time-scale. Recently, chronological or attained age has gained support, but a case can be made for a ‘reference relative time-scale’ as an alternative which circumvents difficulties that arise with this and other scales. The reference relative time of an individual participant is the integral of a reference population hazard function between the time of entry and the time of exit of the individual. The objective here is to describe the reference relative time-scale, illustrate its use, make comparison with attained age by simulation and explain its relationship to modern and traditional epidemiologic methods. Results: A comparison was made between two models: a stratified Cox model with age as the time-scale versus an unstratified Cox model using the reference relative time-scale. The illustrative comparison used a UK cohort of cotton workers, with differing ages at entry to the study, with accrual over a time period and with long follow-up. Additionally, exponential and Weibull models were fitted, since the reference relative time-scale analysis need not be restricted to the Cox model. A simulation study showed that analysis using the reference relative time-scale and analysis using chronological age had very similar power to detect a significant risk factor and both were equally unbiased. Further, the analysis using the reference relative time-scale supported fully-parametric survival modelling and allowed percentile predictions and mortality curves to be constructed. Conclusions: The reference relative time-scale was a viable alternative to chronological age, led to simplification of the modelling process and possessed the features of a good time-scale as defined in reliability theory. The reference relative time-scale has several interpretations and provides a unifying concept that links contemporary approaches in survival and reliability analysis to the traditional epidemiologic methods of Poisson regression and standardised mortality ratios. The community of practitioners has not previously made this connection.
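
    The definition translates directly into code: the sketch below evaluates the reference relative time of one participant by integrating a reference hazard between entry and exit; the Gompertz-type hazard is a hypothetical stand-in for a published life-table hazard.

```python
import numpy as np
from scipy.integrate import quad

def reference_relative_time(entry_age, exit_age, reference_hazard):
    """Integral of the reference population hazard between an individual's
    entry and exit times, as defined in the abstract above."""
    value, _ = quad(reference_hazard, entry_age, exit_age)
    return value

# Hypothetical reference hazard: a Gompertz-type mortality rate by age.
ref_hazard = lambda age: 1e-4 * np.exp(0.09 * age)

# A participant entering follow-up at age 40 and exiting (event or censoring) at 62.
print(reference_relative_time(40.0, 62.0, ref_hazard))
```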