    Sensitivity Analysis for Multiple Comparisons in Matched Observational Studies through Quadratically Constrained Linear Programming

    A sensitivity analysis in an observational study assesses the robustness of significant findings to unmeasured confounding. While sensitivity analyses in matched observational studies have been well addressed when there is a single outcome variable, accounting for multiple comparisons through the existing methods yields overly conservative results when there are multiple outcome variables of interest. This stems from the fact that unmeasured confounding cannot affect the probability of assignment to treatment differently depending on the outcome being analyzed. Existing methods allow this to occur by combining the results of individual sensitivity analyses to assess whether at least one hypothesis is significant, which in turn results in an overly pessimistic assessment of a study's sensitivity to unobserved biases. By solving a quadratically constrained linear program, we are able to perform a sensitivity analysis while enforcing that unmeasured confounding must have the same impact on the treatment assignment probabilities across outcomes for each individual in the study. We show that this allows for uniform improvements in the power of a sensitivity analysis not only for testing the overall null of no effect, but also for null hypotheses on specific outcome variables while strongly controlling the familywise error rate. We illustrate our method through an observational study on the effect of smoking on naphthalene exposure.
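
    As a rough illustration of the shared-confounder idea, the sketch below (not the authors' formulation or code) sets up a toy feasibility problem in cvxpy: within Rosenbaum-style bounds 1/(1+Gamma) <= rho_i <= Gamma/(1+Gamma), is there a single vector of treatment-assignment probabilities rho, common to all outcomes, that brings every outcome's standardized deviate below its critical value? The pair scores, observed statistics, and critical value are all made up.

        # A minimal, hypothetical sketch of the shared-bias feasibility question, assuming
        # sign-score statistics for I matched pairs and K outcomes, a sensitivity parameter
        # Gamma bounding the hidden bias, and the cvxpy modelling library. The quadratic
        # terms rho_i*(1 - rho_i) in the variances are what make the program quadratically
        # constrained. All numbers are synthetic.
        import numpy as np
        import cvxpy as cp

        I, K = 200, 3                          # matched pairs, outcome variables
        rng = np.random.default_rng(0)
        d = np.abs(rng.normal(1.0, 0.5, size=(I, K)))   # toy pair score magnitudes |d_ik|
        T = 0.65 * d.sum(axis=0)               # toy observed statistics, one per outcome
        gamma = 1.5                            # amount of unmeasured confounding allowed
        lo, hi = 1 / (1 + gamma), gamma / (1 + gamma)
        crit = 1.64                            # one-sided critical deviate

        rho = cp.Variable(I)                   # assignment probabilities shared across outcomes
        mu = d.T @ rho                         # worst-case means, linear in rho
        var = (d**2).T @ (rho - cp.square(rho))            # variances, concave in rho
        constraints = [rho >= lo, rho <= hi] + \
                      [T[k] - mu[k] <= crit * cp.sqrt(var[k]) for k in range(K)]
        problem = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility check
        problem.solve()
        # Feasible: some single bias pattern could explain all outcomes at this Gamma.
        print("all findings explainable by one bias pattern:", problem.status == cp.OPTIMAL)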

    Fitting theories of nuclear binding energies

    In developing theories of nuclear binding energy such as density-functional theory, the effort required to make a fit can be daunting due to the large number of parameters that may be in the theory and the large number of nuclei in the mass table. For theories based on the Skyrme interaction, the effort can be reduced considerably by using the singular value decomposition to reduce the size of the parameter space. We find that the sensitive parameters define a space of dimension four or so, and within this space a linear refit is adequate for a number of Skyrme parameter sets from the literature. We do not find marked differences in the quality of the fit among the SLy4, BSk4 and SkP parameter sets. The r.m.s. residual error in even-even nuclei is about 1.5 MeV, half the value of the liquid drop model. We also discuss an alternative norm for evaluating mass fits, the Chebyshev norm. It focuses attention on the cases with the largest discrepancies between theory and experiment. We show how it works with the liquid drop model and make some applications to models based on Skyrme energy functionals. The Chebyshev norm seems to be more sensitive to new experimental data than the root-mean-square norm. The method also has the advantage that candidate improvements to the theories can be assessed with computations on smaller sets of nuclei.
    Comment: 17 pages and 4 figures; version incorporates referee's comments
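
    The two ingredients described above lend themselves to a compact numerical illustration. The sketch below is a toy example, not the paper's code: a synthetic sensitivity matrix stands in for the derivatives of the binding-energy residuals with respect to the Skyrme parameters, the SVD keeps the four leading directions for a linear refit, and the residuals are then scored with both the r.m.s. norm and the Chebyshev (maximum-absolute-deviation) norm.

        # A toy illustration, not the paper's code: a synthetic sensitivity matrix A plays the
        # role of the derivatives of the binding-energy residuals with respect to the Skyrme
        # parameters, the SVD keeps the leading directions for a linear refit, and the residuals
        # are scored with both the r.m.s. norm and the Chebyshev (max-absolute) norm.
        import numpy as np

        rng = np.random.default_rng(1)
        n_nuclei, n_params = 500, 12
        A = rng.normal(size=(n_nuclei, n_params))        # toy d(residual)/d(parameter) matrix
        A[:, 4:] *= 0.01                                 # only ~4 directions matter, as in the fits
        dp_true = rng.normal(size=n_params)              # pretend the starting set is off by this much
        r0 = A @ dp_true + rng.normal(scale=0.8, size=n_nuclei)   # toy residuals (MeV)

        # Truncate to the leading singular directions and refit linearly within that subspace.
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        k = 4
        Ak = U[:, :k] * s[:k]                            # residual response in the reduced space
        coeffs, *_ = np.linalg.lstsq(Ak, -r0, rcond=None)
        r_new = r0 + Ak @ coeffs                         # residuals after the linear refit

        rms = lambda r: np.sqrt(np.mean(r**2))
        chebyshev = lambda r: np.max(np.abs(r))          # the single worst discrepancy
        print(f"r.m.s. norm:     {rms(r0):6.3f} -> {rms(r_new):6.3f} MeV")
        print(f"Chebyshev norm:  {chebyshev(r0):6.3f} -> {chebyshev(r_new):6.3f} MeV")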

    Statistical inference optimized with respect to the observed sample for single or multiple comparisons

    The normalized maximum likelihood (NML) is a recent penalized likelihood with properties that justify defining the amount of discrimination information (DI) in the data supporting an alternative hypothesis over a null hypothesis as the logarithm of an NML ratio, namely, the alternative hypothesis NML divided by the null hypothesis NML. The resulting DI, like the Bayes factor but unlike the p-value, measures the strength of evidence for an alternative hypothesis over a null hypothesis such that the probability of misleading evidence vanishes asymptotically under weak regularity conditions and such that evidence can support a simple null hypothesis. Unlike the Bayes factor, the DI does not require a prior distribution and is minimax optimal in a sense that does not involve averaging over outcomes that did not occur. Replacing a (possibly pseudo-) likelihood function with its weighted counterpart extends the scope of the DI to models for which the unweighted NML is undefined. The likelihood weights leverage side information, either in data associated with comparisons other than the comparison at hand or in the parameter value of a simple null hypothesis. Two case studies, one involving multiple populations and the other involving multiple biological features, indicate that the DI is robust to the type of side information used when that information is assigned the weight of a single observation. Such robustness suggests that very little adjustment for multiple comparisons is warranted if the sample size is at least moderate.
    Comment: Typo in equation (7) of v2 corrected in equation (6) of v3; clarity improved
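
    To make the NML-ratio definition concrete, the sketch below computes the DI for a Bernoulli model with a simple null p0 = 0.5 against the full Bernoulli family as the alternative; this setting is chosen here for illustration and is not taken from the paper. The alternative's NML is the maximized likelihood divided by the Shtarkov normalizing sum, while a simple null needs no normalization, so its NML is just its likelihood.

        # A minimal sketch of the DI as the log of an NML ratio, for a Bernoulli model with a
        # simple null p0 = 0.5 against the full Bernoulli family; this toy setting is chosen for
        # illustration and is not taken from the paper. Base-e logarithms (nats) are used.
        from math import comb, log

        def shtarkov_bernoulli(n):
            # Normalizer of the NML: sum over all datasets of the maximized likelihood,
            # C(n, k) * (k/n)^k * ((n-k)/n)^(n-k).
            total = 0.0
            for k in range(n + 1):
                p = k / n
                total += comb(n, k) * (p**k if k else 1.0) * ((1 - p)**(n - k) if k < n else 1.0)
            return total

        def discrimination_information(k, n, p0=0.5):
            p_hat = k / n
            loglik_hat = (k * log(p_hat) if k else 0.0) + ((n - k) * log(1 - p_hat) if k < n else 0.0)
            log_nml_alt = loglik_hat - log(shtarkov_bernoulli(n))   # penalized by model complexity
            log_nml_null = k * log(p0) + (n - k) * log(1 - p0)      # simple null: NML = likelihood
            return log_nml_alt - log_nml_null                       # DI = log(NML_alt / NML_null)

        print(discrimination_information(k=35, n=50))   # evidence in nats for p != 0.5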

    Distribution-Free Learning

    We select among rules for learning which of two actions in a stationary decision problem achieves a higher expected payoff, when the payoffs realized by both actions are known in previous instances. Only a bounded set containing all possible payoffs is known. Rules are evaluated using maximum risk, with maximin utility, minimax regret, competitive ratio and selection procedures being special cases. A randomized variant of fictitious play attains minimax risk for all risk functions, with ex-ante expected payoffs increasing in the number of observations. Fictitious play itself has neither of these two properties. Tight bounds on maximal regret and on the probability of selecting the best action are included.
    Keywords: fictitious play, nonparametric, finite sample, matched pairs, foregone payoffs, minimax risk, ex-ante improving, selection procedure
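
    The sketch below contrasts plain fictitious play, which deterministically picks the action with the higher average past payoff, with a randomized variant, by simulating expected regret when both actions' payoffs are observed in past instances. The particular randomization (choosing an action with probability linear in the normalized difference of sample means) and the Bernoulli payoff distributions are illustrative choices, not the paper's specification; a full risk evaluation would take the maximum over all payoff distributions on the bounded set rather than fixing one pair.

        # A small, hypothetical simulation: plain fictitious play picks the action with the
        # higher average past payoff; the randomized variant below chooses an action with
        # probability linear in the normalized difference of sample means. This randomization
        # and the Bernoulli payoffs in [0, 1] are illustrative choices, not the paper's rule;
        # regret is evaluated at one fixed pair of payoff distributions, not at the worst case.
        import numpy as np

        def fictitious_play(past0, past1, rng):
            return int(past1.mean() > past0.mean())      # deterministic argmax; ties go to action 0

        def randomized_fp(past0, past1, rng):
            p = np.clip((past1.mean() - past0.mean() + 1) / 2, 0, 1)
            return int(rng.random() < p)                 # randomize smoothly in the mean difference

        def expected_regret(rule, mu0, mu1, n_obs, trials=20000, seed=0):
            rng = np.random.default_rng(seed)
            total = 0.0
            for _ in range(trials):
                past0 = rng.random(n_obs) < mu0          # payoffs of both actions are observed
                past1 = rng.random(n_obs) < mu1          # in each of the n_obs past instances
                action = rule(past0, past1, rng)
                total += max(mu0, mu1) - (mu1 if action else mu0)
            return total / trials

        for n_obs in (2, 5, 10):
            print(n_obs,
                  round(expected_regret(fictitious_play, 0.45, 0.55, n_obs), 4),
                  round(expected_regret(randomized_fp, 0.45, 0.55, n_obs), 4))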

    Japanese Monetary Policy during the Collapse of the Bubble Economy: A View of Policymaking under Uncertainty

    Focusing on policymaking under uncertainty, we analyze the monetary policy of the Bank of Japan (BOJ) in the early 1990s, when the bubble economy collapsed. Conducting stochastic simulations with a large-scale macroeconomic model of the Japanese economy, we find that the BOJ's monetary policy at that time was essentially optimal under uncertainty about the policy multiplier. On the other hand, we also find that the BOJ's policy was not optimal under uncertainty about inflation dynamics, and that a more aggressive policy response than actually implemented would have been needed. Thus, optimal monetary policy differs greatly depending upon which type of uncertainty is emphasized. Taking into account the fact that overcoming deflation became an important issue from the late 1990s, it is possible to argue that during the early 1990s the BOJ should have placed greater emphasis on uncertainty about inflation dynamics and implemented a more aggressive monetary policy. The result from a counterfactual simulation indicates that the inflation rate and the real growth rate would have been somewhat higher if the BOJ had implemented a more accommodative policy during the early 1990s. However, the simulation result also suggests that the effects would have been limited, and that an accommodative monetary policy by itself would not have changed the overall picture of the prolonged stagnation of the Japanese economy during the 1990s.
    Keywords: Collapse of the bubble economy; Monetary policy; Uncertainty
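
    The distinction drawn above between multiplier uncertainty and uncertainty about inflation dynamics can be illustrated with a textbook example. The sketch below is a toy Brainard (1967)-style calculation with made-up numbers, not the paper's large-scale model: uncertainty about the policy multiplier calls for an attenuated response relative to the certainty-equivalent setting, which is the sense in which a cautious policy can be optimal under that type of uncertainty.

        # A toy, textbook-style (Brainard 1967) calculation with made-up numbers, not the paper's
        # large-scale model: with a policy effect y = beta * r, minimizing E[(y - y_target)^2]
        # under uncertainty about the multiplier beta gives an attenuated response, whereas purely
        # additive uncertainty leaves the certainty-equivalent response unchanged.
        y_target = 1.0       # desired change in the target variable
        beta_mean = 0.5      # believed policy multiplier
        beta_sd = 0.4        # uncertainty about the multiplier

        r_certainty_equiv = y_target / beta_mean                           # ignores multiplier risk
        r_brainard = beta_mean * y_target / (beta_mean**2 + beta_sd**2)    # attenuated response

        print(f"certainty-equivalent response:         {r_certainty_equiv:.2f}")
        print(f"response under multiplier uncertainty: {r_brainard:.2f}")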