
    Bayesian model selection for testing the no-hair theorem with black hole ringdowns

    General relativity predicts that a black hole that results from the merger of two compact objects (either black holes or neutron stars) is initially highly deformed but soon settles down to a quiescent state by emitting a superposition of quasi-normal modes (QNMs). The QNMs are damped sinusoids with characteristic frequencies and decay times that depend only on the mass and spin of the black hole and on no other parameter, a statement of the no-hair theorem. In this paper we examine the extent to which QNMs could be used to test the no-hair theorem with future ground- and space-based gravitational-wave detectors. We model departures from general relativity (GR) by introducing extra parameters that change the mode frequencies or decay times from their general relativistic values. With the aid of numerical simulations and Bayesian model selection, we assess the extent to which the presence of such a parameter could be inferred, and its value estimated. We find that departures of the decay times from their GR values are harder to discern than departures of the mode frequencies. Einstein Telescope (ET, a third-generation ground-based detector) could detect departures of <1% in the frequency of the dominant QNM of a 500 Msun black hole, out to a maximum range of 4 Gpc. In contrast, the New Gravitational Observatory (NGO, an ESA space mission to detect gravitational waves) could detect departures of ~0.1% in a 10^8 Msun black hole out to a luminosity distance of 30 Gpc (z = 3.5).
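
    As a rough illustration of the model-selection idea in this abstract, the sketch below compares a GR ringdown (deviation fixed to zero) against a model with a free fractional frequency shift, using a grid-based Bayes factor. The waveform amplitude, noise level, mode parameters and prior range are all invented for illustration and are not the paper's analysis setup.

```python
# Minimal sketch (not the paper's pipeline): compare a GR ringdown model with a
# one-parameter deviation model via a grid-based Bayes factor.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.1, 2000)          # seconds
f_gr, tau_gr = 250.0, 0.02               # assumed GR mode frequency (Hz) and decay time (s)

def ringdown(t, A, f, tau):
    """Damped sinusoid: a single quasi-normal mode."""
    return A * np.exp(-t / tau) * np.cos(2 * np.pi * f * t)

# Simulated data: a ringdown with a 2% frequency shift plus white noise.
sigma = 0.2
data = ringdown(t, 1.0, f_gr * 1.02, tau_gr) + rng.normal(0.0, sigma, t.size)

def log_like(df):
    """Gaussian log-likelihood of the data given a fractional frequency shift df."""
    model = ringdown(t, 1.0, f_gr * (1.0 + df), tau_gr)
    return -0.5 * np.sum((data - model) ** 2) / sigma**2

# Evidence for the deviation model: average the likelihood over a flat prior on df.
dfs = np.linspace(-0.05, 0.05, 401)
log_ls = np.array([log_like(df) for df in dfs])
log_z_dev = np.logaddexp.reduce(log_ls) - np.log(dfs.size)

# Evidence for GR is just the likelihood at df = 0 (no extra parameter).
log_z_gr = log_like(0.0)

print("log Bayes factor (deviation vs GR):", log_z_dev - log_z_gr)
```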

    Calculating and understanding the value of any type of match evidence when there are potential testing errors

    It is well known that Bayes’ theorem (with likelihood ratios) can be used to calculate the impact of evidence, such as a ‘match’ of some feature of a person. Typically the feature of interest is a DNA profile, but the method applies in principle to any feature of a person or object, including not just DNA, fingerprints, or footprints, but also more basic features such as skin colour, height, hair colour or even name. Notwithstanding concerns about the extensiveness of databases of such features, a serious challenge to the use of Bayes in such legal contexts is that its standard formulaic representations are not readily understandable to non-statisticians. Attempts to get round this problem usually involve representations based around some variation of an event tree. While this approach works well in explaining the most trivial instance of Bayes’ theorem (involving a single hypothesis and a single piece of evidence), it does not scale up to realistic situations. In particular, even with a single piece of match evidence, if we wish to incorporate the possibility of errors (both false positives and false negatives) introduced at any stage in the investigative process, matters become very complex. As a result we have observed expert witnesses (in different areas of speciality) routinely ignore the possibility of errors when presenting their evidence. To counter this, we produce what we believe is the first full probabilistic solution of the simple case of generic match evidence incorporating both classes of testing errors. Unfortunately, the resultant event tree solution is too complex for intuitive comprehension, and, crucially, the event tree also fails to represent the causal information that underpins the argument. In contrast, we also present a simple-to-construct graphical Bayesian Network (BN) solution that automatically performs the calculations and may also be intuitively simpler to understand. Although there have been multiple previous applications of BNs for analysing forensic evidence, including very detailed models for the DNA matching problem, these models have not widely penetrated the expert witness community, nor have they addressed the basic generic match problem incorporating the two types of testing error. Hence we believe our basic BN solution provides an important mechanism for convincing experts, and eventually the legal community, that it is possible to rigorously analyse and communicate the full impact of match evidence on a case in the presence of possible errors.
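
    The core calculation the abstract refers to, a likelihood ratio for a reported match once false positive and false negative rates are allowed for, can be written in a few lines. The sketch below is only an illustration with made-up numbers, not the authors' event tree or Bayesian Network model.

```python
# Illustrative sketch: the likelihood ratio for a reported match once false
# positive and false negative testing errors are allowed for. All numbers are
# placeholders.

def match_likelihood_ratio(random_match_prob, false_pos, false_neg):
    """LR = P(reported match | suspect is source) / P(reported match | not source)."""
    p_match_given_source = 1.0 - false_neg
    # If the suspect is not the source, a match can still be reported either because
    # someone else's feature genuinely matches (and the test detects it) or because
    # the test errs on a true non-match.
    p_match_given_not_source = (random_match_prob * (1.0 - false_neg)
                                + (1.0 - random_match_prob) * false_pos)
    return p_match_given_source / p_match_given_not_source

def posterior_prob_source(prior_prob, lr):
    """Convert a prior probability and a likelihood ratio into a posterior probability."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    post_odds = prior_odds * lr
    return post_odds / (1.0 + post_odds)

lr = match_likelihood_ratio(random_match_prob=1e-6, false_pos=1e-3, false_neg=1e-2)
print("LR with testing errors:", lr)                 # far below 1/random_match_prob:
                                                     # the false positive rate dominates
print("Posterior P(source):", posterior_prob_source(1e-4, lr))
```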

    A Two-Step Estimator for a Spatial Lag Model of Counts: Theory, Small Sample Performance and an Application

    Several spatial econometric approaches are available to model spatially correlated disturbances in count models, but there are at present no structurally consistent count models incorporating spatial lag autocorrelation. A two-step, limited information maximum likelihood estimator is proposed to fill this gap. The estimator is developed assuming a Poisson distribution, but can be extended to other count distributions. The small sample properties of the estimator are evaluated with Monte Carlo experiments. Simulation results suggest that the spatial lag count estimator achieves gains in terms of bias over the aspatial version as spatial lag autocorrelation and sample size increase. An empirical example deals with the location choice of single-unit start-up firms in the manufacturing industry in the US between 2000 and 2004. The empirical results suggest that in the dynamic process of firm formation, counties dominated by firms exhibiting (internal) increasing returns to scale are at a relative disadvantage even if localization economies are present.
    Keywords: count model, location choice, manufacturing, Poisson, spatial econometrics
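
    The paper's exact estimator is not spelled out in the abstract, but a generic two-step construction along the lines it describes can be sketched: first predict the spatial lag of the counts from exogenous regressors and their spatial lags, then include that prediction in a Poisson regression. The data, weight matrix and instruments below are illustrative assumptions, not the authors' specification.

```python
# Rough two-step sketch (generic assumptions, not the authors' exact estimator):
# step 1 instruments the spatial lag of the counts with spatially lagged exogenous
# regressors; step 2 runs a Poisson regression that includes the predicted lag.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200

# Row-standardised contiguity-style weight matrix W (random neighbours for illustration).
W = (rng.random((n, n)) < 0.03).astype(float)
np.fill_diagonal(W, 0.0)
W /= np.maximum(W.sum(axis=1, keepdims=True), 1.0)

X = rng.normal(size=(n, 2))                                # exogenous covariates
y = rng.poisson(np.exp(0.5 + X @ np.array([0.8, -0.4])))   # placeholder counts

# Step 1: first-stage regression of the spatial lag Wy on X and its spatial lag WX.
Wy = W @ y
Z = sm.add_constant(np.hstack([X, W @ X]))                 # instruments
Wy_hat = sm.OLS(Wy, Z).fit().predict()

# Step 2: Poisson regression with the predicted spatial lag as an extra regressor.
exog = sm.add_constant(np.column_stack([X, Wy_hat]))
result = sm.Poisson(y, exog).fit(disp=0)
print(result.params)   # last coefficient is the estimated spatial lag effect
```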

    Gravitational waves: search results, data analysis and parameter estimation

    The Amaldi 10 Parallel Session C2 on gravitational wave (GW) search results, data analysis and parameter estimation included three lively sessions of lectures by 13 presenters, together with 34 posters. The talks and posters covered a huge range of material, including results and analysis techniques for ground-based GW detectors, targeting anticipated signals from different astrophysical sources: compact binary inspiral, merger and ringdown; GW bursts from intermediate mass binary black hole mergers, cosmic string cusps, core-collapse supernovae, and other unmodeled sources; continuous waves from spinning neutron stars; and a stochastic GW background. There was considerable emphasis on Bayesian techniques for estimating the parameters of coalescing compact binary systems from the gravitational waveforms extracted from the data of the advanced detector network, including methods to distinguish deviations of the signals from what is expected in the context of General Relativity.

    Model Selection for Longitudinal Data With Time-Dependent Covariates Using Generalized Method of Moments

    The purpose of this dissertation was to establish measures that could be used to assess the relative fit of nested models with parameters estimated using the Generalized Method of Moments (GMM) for longitudinal data with time-dependent covariates. A secondary data set collected from Filipino children was used as an example of model fitting to evaluate the quality of the assessment of fit of the Kullback-Leibler Information Criterion (KLIC) and of a chi-squared statistic derived from the difference in the minimums of the quadratic forms of two candidate nested models. A simulation involving randomly generated data sets was also used to evaluate the performance of the proposed statistics. Several variations of nested models were considered in the simulation, and the KLIC was used to compare the relative fit of these models. Overall, the KLIC as a model selection criterion achieved a good detection proportion in identifying the correct model when it was compared to underfit models. By contrast, it tended to favor overfit models over the correct model, and non-detection proportions were high when extraneous predictors were introduced to candidate models. Ignoring the feedback loop introduced by time-varying covariates and relying on the regular use of Generalized Estimating Equations (GEE) for the analysis of longitudinal data can compromise the consistency and efficiency of parameter estimates and introduce bias, resulting in misleading inferences. Replacing that practice with the routine use of GMM to properly account for feedback in the data is highly encouraged. The KLIC would be a helpful tool to select an appropriate model among a collection of candidate GMM models, especially when there are time-varying predictors in the data.
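
    A hedged sketch of the quadratic-form difference statistic mentioned in the abstract: two nested GMM specifications share the same moment conditions, the smaller one restricts a coefficient to zero, and the difference of the minimised objectives is referred to a chi-squared distribution. The toy moment conditions below are stand-ins, not the dissertation's longitudinal models.

```python
# Toy nested-GMM comparison: n * (Q_restricted - Q_unrestricted) ~ chi2(df),
# with df equal to the number of restrictions, using an efficient weight matrix.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

rng = np.random.default_rng(2)
n = 500
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 0.5 * x1 + rng.normal(size=n)          # x2 has no real effect
Z = np.column_stack([np.ones(n), x1, x2])        # instruments shared by both models
X = Z                                            # regressors of the full model

def moments(beta, X):
    """n x 3 matrix of moment functions g_i(beta) = z_i * (y_i - x_i' beta)."""
    return Z * (y - X @ beta)[:, None]

def Q(beta, X, W):
    """Scaled GMM objective n * gbar' W gbar."""
    gbar = moments(beta, X).mean(axis=0)
    return n * gbar @ W @ gbar

# Efficient weight matrix from a first-pass (identity-weighted) full-model fit.
b0 = minimize(Q, np.zeros(3), args=(X, np.eye(3))).x
G = moments(b0, X)
W = np.linalg.inv(G.T @ G / n)

q_full = minimize(Q, b0, args=(X, W)).fun         # ~0: the full model is just identified
X_small = Z[:, :2]                                # nested model restricts the x2 coefficient to 0
q_small = minimize(Q, np.zeros(2), args=(X_small, W)).fun

stat = q_small - q_full
print("difference statistic:", stat, "p-value:", chi2.sf(stat, df=1))
```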

    A statistical framework for joint eQTL analysis in multiple tissues

    Mapping expression Quantitative Trait Loci (eQTLs) represents a powerful and widely adopted approach to identifying putative regulatory variants and linking them to specific genes. Up to now, eQTL studies have been conducted in a relatively narrow range of tissues or cell types. However, understanding the biology of organismal phenotypes will involve understanding regulation in multiple tissues, and ongoing studies are collecting eQTL data in dozens of cell types. Here we present a statistical framework for powerfully detecting eQTLs in multiple tissues or cell types (or, more generally, multiple subgroups). The framework explicitly models the potential for each eQTL to be active in some tissues and inactive in others. By modeling the sharing of active eQTLs among tissues, this framework increases power to detect eQTLs that are present in more than one tissue compared with "tissue-by-tissue" analyses that examine each tissue separately. Conversely, by modeling the inactivity of eQTLs in some tissues, the framework allows the proportion of eQTLs shared across different tissues to be formally estimated as parameters of a model, addressing the difficulties of accounting for incomplete power when comparing overlaps of eQTLs identified by tissue-by-tissue analyses. Applying our framework to re-analyze data from transformed B cells, T cells and fibroblasts, we find that it substantially increases power compared with tissue-by-tissue analysis, identifying 63% more genes with eQTLs (at FDR = 0.05). Further, the results suggest that, in contrast to previous analyses of the same data, the majority of eQTLs detectable in these data are shared among all three tissues.
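
    A much-simplified sketch of the configuration idea in this abstract: enumerate which tissues an eQTL is active in, weight each configuration by a prior and by per-tissue Bayes factors, and read off posterior quantities of interest. The per-tissue Bayes factors are treated as independent here purely for illustration (the actual framework models sharing of effects jointly), and all numbers are made up.

```python
# Toy enumeration of activity configurations across tissues for one candidate eQTL.
from itertools import product
import numpy as np

tissues = ["B_cells", "T_cells", "fibroblasts"]
bf = {"B_cells": 50.0, "T_cells": 30.0, "fibroblasts": 1.2}   # per-tissue Bayes factors (invented)
pi_active = 0.2                                               # prior that the eQTL is active in a given tissue

post, total = {}, 0.0
for config in product([0, 1], repeat=len(tissues)):
    # Prior weight of this activity configuration times its (assumed independent) likelihood.
    prior = np.prod([pi_active if a else 1 - pi_active for a in config])
    like = np.prod([bf[t] if a else 1.0 for t, a in zip(tissues, config)])
    post[config] = prior * like
    total += prior * like

for config, w in sorted(post.items(), key=lambda kv: -kv[1]):
    print(config, round(w / total, 3))

# Posterior probability the eQTL is active in at least one tissue.
p_any = 1.0 - post[(0, 0, 0)] / total
print("P(active somewhere):", round(p_any, 3))
```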

    The European Enlargement Process and Regional Convergence Revisited: Spatial Effects Still Matter.

    This paper has two main goals. First, it reconsiders regional growth and convergence processes in the context of the enlargement of the European Union to new member states. We show that spatial autocorrelation and heterogeneity still matter in a sample of 237 regions over the period 1993-2002. Spatial convergence clubs are defined using exploratory spatial data analysis, and a spatial autoregressive model is estimated. We find strong evidence that the growth rate of per capita GDP in a given region is positively affected by the growth rates of neighbouring regions. The second objective is to test the robustness of the results with respect to non-normality, outliers and heteroskedasticity using two other methods: quasi-maximum likelihood and Bayesian estimation.
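
    The spatial autoregressive growth model the abstract refers to can be illustrated with a small simulation: regional growth depends on a spatial lag of neighbouring growth rates plus initial GDP, and the spatial parameter is estimated by concentrated maximum likelihood. Every number and the weight matrix below are invented; only the model form g = rho*W*g + X*beta + eps is taken from the abstract.

```python
# Toy spatial lag (SAR) growth regression estimated by concentrated maximum likelihood.
import numpy as np

rng = np.random.default_rng(3)
n = 237                                          # number of regions in the paper's sample

# Row-standardised spatial weight matrix (random neighbours stand in for contiguity).
W = (rng.random((n, n)) < 0.05).astype(float)
np.fill_diagonal(W, 0.0)
W /= np.maximum(W.sum(axis=1, keepdims=True), 1.0)

log_gdp0 = rng.normal(9.5, 0.4, n)               # initial log per-capita GDP (invented)
X = np.column_stack([np.ones(n), log_gdp0])
beta_true, rho_true = np.array([0.8, -0.06]), 0.4
A = np.eye(n) - rho_true * W
growth = np.linalg.solve(A, X @ beta_true + rng.normal(0, 0.05, n))

# Concentrated log-likelihood of the spatial lag model as a function of rho.
e0 = growth - X @ np.linalg.lstsq(X, growth, rcond=None)[0]
ed = W @ growth - X @ np.linalg.lstsq(X, W @ growth, rcond=None)[0]

def conc_loglik(rho):
    _, logdet = np.linalg.slogdet(np.eye(n) - rho * W)
    e = e0 - rho * ed
    return logdet - 0.5 * n * np.log(e @ e / n)

rhos = np.linspace(-0.5, 0.95, 146)
rho_hat = rhos[np.argmax([conc_loglik(r) for r in rhos])]
beta_hat = np.linalg.lstsq(X, growth - rho_hat * W @ growth, rcond=None)[0]
print("rho_hat:", rho_hat, "beta_hat:", beta_hat)
```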
