
    Asymptotic Properties of Approximate Bayesian Computation

    Approximate Bayesian computation allows for statistical analysis in models with intractable likelihoods. In this paper we consider the asymptotic behaviour of the posterior distribution obtained by this method. We give general results on the rate at which the posterior distribution concentrates on sets containing the true parameter, its limiting shape, and the asymptotic distribution of the posterior mean. These results hold under given rates for the tolerance used within the method, mild regularity conditions on the summary statistics, and a condition linked to identification of the true parameters. Implications for practitioners are discussed. Comment: This 31-page paper is a revised version, including supplementary material.
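As a concrete illustration of the mechanism analysed in the paper, the following sketch runs rejection ABC on a toy normal-location model. All names, the flat prior, and the model itself are assumptions chosen for illustration, not the paper's setup; the point is only that shrinking the tolerance concentrates the ABC posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (an assumption for illustration, not the paper's model):
# n observations from N(theta, 1); the summary statistic is the sample mean,
# whose sampling distribution N(theta, 1/n) we can simulate directly.
theta_true, n = 2.0, 200
s_obs = rng.normal(theta_true, 1.0, size=n).mean()

def abc_rejection(tolerance, n_draws=20000):
    """Rejection ABC: keep prior draws whose simulated summary lies
    within `tolerance` of the observed summary."""
    theta = rng.uniform(-10.0, 10.0, size=n_draws)       # flat prior
    sims = rng.normal(theta, 1.0 / np.sqrt(n))           # simulated summaries
    return theta[np.abs(sims - s_obs) <= tolerance]

# Shrinking the tolerance concentrates the ABC posterior around theta_true,
# the behaviour whose rates the paper characterises.
for eps in (1.0, 0.3, 0.05):
    post = abc_rejection(eps)
    print(f"eps={eps}: mean={post.mean():.3f}, sd={post.std():.3f}")
```

The paper's results concern how fast this concentration happens as the tolerance and sample size vary jointly; the sketch only shows the qualitative effect.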

    Knowledge Practices In Professional Web Design

    This study examines the use and construction of knowledge by individuals involved in the professional design of websites. Its focus is on the knowledge practices of those who identify as web designers: professionals engaged primarily in the aesthetic design of websites. This study employs qualitative semi-structured interviews to explore this population’s practices, and adopts a constructivist approach built on a critical realist ontology in analyzing the data. The study addresses the general lack of scholarship focusing on web designers, and helps build an understanding of the processes and forces that govern the development and creation of websites. The findings of this study show how knowledge is created and used, through understanding participants' practices around the discovery, sharing, and use of information and knowledge. Employing the knowledge lens, the study provides details not just about knowledge and information, but about the way knowledge is used actively in the creative enterprise of study participants.

    IN EVENT OF AN (AI) EMERGENCY: INTERPRETING CONTINUITY OF GOVERNMENT PROVISIONS IN STATE CONSTITUTIONS

    “Of this I am certain: If we prepare ourselves so that a terrible attack—although it might hurt us—could not destroy us, then such an attack will never come.” - Edward Teller, the “Father of the Hydrogen Bomb,” in an interview with Allen Brown of This Week Magazine in 1957. Bad actors have already used or may soon use AI to disrupt critical infrastructure, influence elections, and upend economies. Those most concerned about the risks posed by AI argue that it is a matter of when, not if, state governments will have to respond to threatened or realized acts of AI aggression. Though a litany of scholars have examined the powers governors may use in emergency situations, less attention has been paid to the role of state legislatures in responding to destabilizing events. Scholars have justified their focus on governors for practical reasons—the executive branch of state governments has been deemed “the center of governmental response[s]” to public emergencies. Two trends caution against perpetuating neglect of state legislatures. First, the legal and social bases for governors to take sweeping action in response to emergencies eroded in many states during COVID-19. In turn, many state legislatures, by law, by popular support, or both, have amassed more authority to respond in worst-case scenarios. Second, the likelihood of states being thrown into disarray will only increase as AI evolves and spreads, warranting a closer analysis of what powers state legislatures may exercise to restore normalcy. Thirty-five state constitutions contain variants of a template “Continuity of Government” (CoG) provision promulgated by the federal government at the height of the Cold War. What events may trigger these provisions, and what powers they afford to state legislatures, has evaded judicial scrutiny because state legislatures have rarely invoked the relevant provisions.
It follows that the scholarly analysis of how best to interpret these important provisions should occur in the relative tranquility of the present rather than at the height of a calamity. This preemptive analysis may improve the ability of state legislatures to respond to disorder by clarifying the likely scope and duration of their powers and, ideally, by spurring amendments to clarify the provisions in advance of any such event. This paper serves as one (and, likely, the first) entry in an inquiry that merits immediate and robust scholarly attention. Relying on the framework set forth by the New Haven School of Jurisprudence, this paper resolves one of the most consequential ambiguities contained in CoG provisions. This framework deserves special consideration given its inclusion of myriad disciplines and its characterization as an “explicitly policy-oriented jurisprudence.” Scholars from across the legal profession have a role in contributing to this inquiry. The incorporation of AI into legal practice imposes a responsibility on scholars to anticipate how the technology may require new doctrines, laws, and methods of interpretation. Though this paper focuses on the continuation of state governments in the wake of an AI emergency, related inquiries, such as how to rethink contract law, property law, and the like upon such an emergency, demand more scholarly attention. The exploration of those topics can, in turn, inform what sorts of powers state legislatures may need to exercise and for how long.

    Indirect inference: which moments to match?

    The standard approach to indirect inference estimation considers that the auxiliary parameters, which carry the identifying information about the structural parameters of interest, are obtained from some just-identified vector of estimating equations. In contrast to this standard interpretation, we demonstrate that the case of overidentified auxiliary parameters is both possible and, indeed, more commonly encountered than one may initially realize. We then revisit the “moment matching” and “parameter matching” versions of indirect inference in this context and devise efficient estimation strategies in this more general framework. Perhaps surprisingly, we demonstrate that if one were to consider the naive choice of an efficient Generalized Method of Moments (GMM)-based estimator for the auxiliary parameters, the resulting indirect inference estimators would be inefficient. In this general context, we demonstrate that efficient indirect inference estimation actually requires a two-step estimation procedure, whereby the goal of the first step is to obtain an efficient version of the auxiliary model. These two-step estimators are presented both within the context of moment matching and parameter matching.
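To make the "parameter matching" idea concrete, here is a minimal toy sketch (an assumption for illustration, not the paper's estimator or its efficiency results): an MA(1) model is treated as if its likelihood were unavailable, and its parameter is recovered by matching an auxiliary statistic computed on observed and simulated data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy example: the structural model is an MA(1),
# y_t = e_t + psi * e_{t-1}, treated as if its likelihood were unavailable.
# The auxiliary statistic is the lag-1 autocorrelation, which identifies
# psi on (-1, 1) through the monotone map psi / (1 + psi^2).
def simulate_ma1(psi, n, rng):
    e = rng.normal(size=n + 1)
    return e[1:] + psi * e[:-1]

def acf1(y):
    y = y - y.mean()
    return float((y[:-1] * y[1:]).sum() / (y * y).sum())

psi_true, n = 0.5, 5000
y_obs = simulate_ma1(psi_true, n, rng)
beta_obs = acf1(y_obs)                      # auxiliary "parameter" estimate

# Parameter matching by grid search: choose psi whose simulated auxiliary
# statistic is closest to the observed one. Re-seeding per candidate gives
# common random numbers, so the objective is smooth in psi.
grid = np.linspace(-0.9, 0.9, 181)
dist = [(acf1(simulate_ma1(psi, 10 * n, np.random.default_rng(2))) - beta_obs) ** 2
        for psi in grid]
psi_hat = float(grid[int(np.argmin(dist))])
```

With a single auxiliary statistic this is a just-identified match; the paper's contribution concerns the overidentified case and the weighting needed for efficiency, which this sketch does not attempt.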

    A Comparison of Likelihood-Free Methods With and Without Summary Statistics

    Likelihood-free methods are useful for parameter estimation of complex models with intractable likelihood functions for which it is easy to simulate data. Such models are prevalent in many disciplines including genetics, biology, ecology and cosmology. Likelihood-free methods avoid explicit likelihood evaluation by finding parameter values of the model that generate data close to the observed data. The general consensus has been that it is most efficient to compare datasets on the basis of a low-dimensional informative summary statistic, incurring information loss in favour of reduced dimensionality. More recently, researchers have explored various approaches for efficiently comparing empirical distributions in the likelihood-free context in an effort to avoid data summarisation. This article provides a review of these full-data, distance-based approaches, and conducts the first comprehensive comparison of such methods, both qualitatively and empirically. We also conduct a substantive empirical comparison with summary-statistic-based likelihood-free methods. The discussion and results offer guidance to practitioners considering a likelihood-free approach. Whilst we find the best approach to be problem dependent, we also find that the full-data, distance-based approaches are promising and warrant further development. We discuss some opportunities for future research in this space.
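A minimal sketch of one full-data, distance-based approach of the kind reviewed here, using the 1-D Wasserstein distance between empirical distributions in place of a summary statistic. The normal location model and all names are assumptions for illustration only, not any specific method from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# For equal-size 1-D samples, the Wasserstein-1 distance reduces to the
# mean absolute difference of the sorted samples.
def wasserstein_1d(x, y):
    return float(np.mean(np.abs(np.sort(x) - np.sort(y))))

# Toy model (an assumption for illustration): N(theta, 1) observations.
theta_true, n = 3.0, 300
y_obs = rng.normal(theta_true, 1.0, size=n)

def abc_full_data(n_draws=5000, quantile=0.01):
    """Accept the prior draws whose simulated datasets are closest
    (in Wasserstein distance) to the full observed dataset."""
    thetas = rng.uniform(-10.0, 10.0, size=n_draws)      # flat prior
    d = np.array([wasserstein_1d(y_obs, rng.normal(t, 1.0, size=n))
                  for t in thetas])
    return thetas[d <= np.quantile(d, quantile)]

post = abc_full_data()
```

No summary statistic is chosen here, so no information is discarded by summarisation; the trade-off, as the article discusses, is the cost and behaviour of the distance itself.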

    Auxiliary Likelihood-Based Approximate Bayesian Computation in State Space Models

    A computationally simple approach to inference in state space models is proposed, using approximate Bayesian computation (ABC). ABC avoids evaluation of an intractable likelihood by matching summary statistics for the observed data with statistics computed from data simulated from the true process, based on parameter draws from the prior. Draws that produce a 'match' between observed and simulated summaries are retained, and used to estimate the inaccessible posterior. With no reduction to a low-dimensional set of sufficient statistics being possible in the state space setting, we define the summaries as the maximum of an auxiliary likelihood function, and thereby exploit the asymptotic sufficiency of this estimator for the auxiliary parameter vector. We derive conditions under which this approach - including a computationally efficient version based on the auxiliary score - achieves Bayesian consistency. To reduce the well-documented inaccuracy of ABC in multi-parameter settings, we propose the separate treatment of each parameter dimension using an integrated likelihood technique. Three stochastic volatility models for which exact Bayesian inference is either computationally challenging, or infeasible, are used for illustration. We demonstrate that our approach compares favorably against an extensive set of approximate and exact comparators. An empirical illustration completes the paper. Comment: This paper is forthcoming at the Journal of Computational and Graphical Statistics. It also supersedes the earlier arXiv paper "Approximate Bayesian Computation in State Space Models" (arXiv:1409.8363).
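The auxiliary-score idea can be sketched in a toy setting. Everything below is an assumption chosen for illustration, not the paper's state space models: the "structural" model is a Student-t location model treated as if its likelihood were unavailable, the auxiliary model is Gaussian, and simulated datasets are scored against the Gaussian MLE fitted to the observed data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical toy setup: observed data from a Student-t location model.
theta_true, df, n = 1.5, 5, 500
y_obs = theta_true + rng.standard_t(df, size=n)
mu_hat, s_hat = y_obs.mean(), y_obs.std()     # auxiliary (Gaussian) MLE

def aux_score(z):
    """Average Gaussian score for (mu, sigma^2), evaluated at the MLE
    fitted to the observed data. It is ~zero when z "looks like" y_obs."""
    s_mu = np.mean(z - mu_hat) / s_hat**2
    s_var = np.mean((z - mu_hat) ** 2 - s_hat**2) / (2 * s_hat**4)
    return np.array([s_mu, s_var])

def abc_aux_score(n_draws=5000, quantile=0.01):
    """Retain prior draws whose simulated data give an auxiliary score
    (at the observed MLE) closest to zero."""
    thetas = rng.uniform(-10.0, 10.0, size=n_draws)      # flat prior
    d = np.array([np.linalg.norm(aux_score(t + rng.standard_t(df, size=n)))
                  for t in thetas])
    return thetas[d <= np.quantile(d, quantile)]

post_score = abc_aux_score()
```

The computational appeal mirrors the paper's: the auxiliary likelihood is maximised once, on the observed data, and each simulated dataset only requires a score evaluation rather than a fresh optimisation.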

    Balancing proliferation and connectivity in PTEN-associated Autism Spectrum Disorder

    PTEN (phosphatase and tensin homolog), which encodes a widely expressed phosphatase, was mapped to 10q23 and identified as the susceptibility gene for Cowden syndrome, a disorder characterized by macrocephaly and high risks of breast, thyroid, and other cancers. The phenotypic spectrum of germline PTEN mutations expanded to include autism with macrocephaly only 10 years ago. Neurological studies of patients with PTEN-associated autism spectrum disorder (ASD) show increases in cortical white matter and a distinctive cognitive profile, including delayed language development with poor working memory and processing speed. Once a germline PTEN mutation is found, and a diagnosis of PTEN hamartoma tumor syndrome is made, the clinical outlook broadens to include higher lifetime risks for multiple cancers, beginning in childhood with thyroid cancer. First described as a tumor suppressor, PTEN is a major negative regulator of the phosphatidylinositol 3-kinase/protein kinase B/mammalian target of rapamycin (mTOR) signaling pathway—controlling growth, protein synthesis, and proliferation. This canonical function combines with less well-understood mechanisms to influence synaptic plasticity and neuronal cytoarchitecture. Several excellent mouse models of Pten loss or dysfunction link these neural functions to autism-like behavioral abnormalities, such as altered sociability, repetitive behaviors, and phenotypes like anxiety that are often associated with ASD in humans. These models also show the promise of mTOR inhibitors as therapeutic agents capable of reversing phenotypes ranging from overgrowth to low social behavior. Based on these findings, therapeutic options for patients with PTEN hamartoma tumor syndrome and ASD are coming into view, even as new discoveries in PTEN biology add complexity to our understanding of this master regulator.