
    Eliciting Dirichlet and Gaussian copula prior distributions for multinomial models

    In this paper, we propose novel methods of quantifying expert opinion about prior distributions for multinomial models. Two different multivariate priors are elicited using median and quartile assessments of the multinomial probabilities. First, we elicit a univariate beta distribution for the probability of each category. We then elicit the hyperparameters of the Dirichlet distribution, as a tractable conjugate prior, from those of the univariate betas through various forms of reconciliation using least-squares techniques. However, a multivariate copula function gives a more flexible correlation structure between the multinomial parameters when used as their multivariate prior distribution. Second, therefore, we use the beta marginal distributions to construct a Gaussian copula, a multivariate normal distribution function that binds these marginals and expresses the dependence structure between them. The proposed method elicits a positive-definite correlation matrix for this Gaussian copula. The two proposed methods are designed to be used through interactive graphical software written in Java.
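    The two building blocks described above can be illustrated in a few lines of Python. This is a minimal sketch, not the authors' Java software: the function names, the elicited quartiles, and the example correlation matrix are all hypothetical.

```python
# Sketch of (1) fitting a beta marginal to elicited median/quartile
# judgements by least squares, and (2) sampling a Gaussian copula that
# binds beta marginals with a chosen positive-definite correlation matrix.
import numpy as np
from scipy import stats, optimize

def fit_beta_to_quartiles(q25, q50, q75):
    """Find beta(a, b) whose quartiles best match the elicited ones."""
    def loss(params):
        a, b = np.exp(params)                      # keep a, b positive
        q = stats.beta.ppf([0.25, 0.5, 0.75], a, b)
        return np.sum((q - [q25, q50, q75]) ** 2)
    res = optimize.minimize(loss, x0=[0.0, 0.0], method="Nelder-Mead")
    return np.exp(res.x)

def sample_gaussian_copula(betas, corr, n=10_000, seed=0):
    """Draw from a Gaussian copula with the given beta marginals."""
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal(np.zeros(len(betas)), corr, size=n)
    u = stats.norm.cdf(z)                          # uniform marginals
    return np.column_stack([stats.beta.ppf(u[:, i], a, b)
                            for i, (a, b) in enumerate(betas)])

# Hypothetical elicited quartiles for three categories.
betas = [fit_beta_to_quartiles(0.15, 0.20, 0.25),
         fit_beta_to_quartiles(0.25, 0.30, 0.35),
         fit_beta_to_quartiles(0.40, 0.50, 0.60)]
corr = np.array([[ 1.0, -0.4, -0.3],
                 [-0.4,  1.0, -0.3],
                 [-0.3, -0.3,  1.0]])              # positive definite
draws = sample_gaussian_copula(betas, corr)
```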

    Green Sturgeon Physical Habitat Use in the Coastal Pacific Ocean

    The green sturgeon (Acipenser medirostris) is a highly migratory, oceanic, anadromous species with a complex life history that makes it vulnerable to species-wide threats both in freshwater and at sea. Green sturgeon population declines have preceded legal protection and curtailment of activities in marine environments deemed to increase its extinction risk. Yet its marine habitat is poorly understood. We built a statistical model to characterize green sturgeon marine habitat using data from a coastal tracking array located along the Siletz Reef near Newport, Oregon, USA that recorded the passage of 37 acoustically tagged green sturgeon. We classified seafloor physical habitat features with high-resolution bathymetric and backscatter data. We then described the distribution of habitat components and their relationship to green sturgeon presence using ordination, and subsequently used generalized linear model selection to identify important habitat components. Finally, we summarized depth and temperature recordings from seven green sturgeon present off the Oregon coast that were fitted with pop-off archival geolocation tags. Our analyses indicated that green sturgeon, on average, spent a longer duration in areas with high seafloor complexity, especially where a greater proportion of the substrate consists of boulders. Green sturgeon in marine habitats are primarily found at depths of 20–60 meters and at temperatures of 9.5–16.0°C. Many sturgeon in this study were likely migrating in a northward direction, moving deeper, and may have been using complex seafloor habitat because it coincides with the distribution of benthic prey taxa or provides refuge from predators. Identifying important green sturgeon marine habitat is an essential step towards accurately defining the conditions that are necessary for its survival and will eventually yield range-wide, spatially explicit predictions of green sturgeon distribution.
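    The model-selection step described above can be sketched as follows. This is a schematic illustration on synthetic data, not the study's actual model: the covariate names (rugosity, boulder_frac, depth_m) are hypothetical stand-ins for the classified seafloor habitat components.

```python
# Generalized linear model selection for presence/absence vs. habitat
# covariates, comparing candidate models by AIC.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic stand-in data; real inputs would be detection records joined
# to bathymetry-derived habitat classes.
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "rugosity": rng.uniform(0, 1, n),        # seafloor complexity index
    "boulder_frac": rng.uniform(0, 1, n),    # proportion boulder substrate
    "depth_m": rng.uniform(10, 80, n),
})
logit = -2 + 2.5 * df.rugosity + 1.5 * df.boulder_frac
df["presence"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

candidates = ["presence ~ rugosity",
              "presence ~ rugosity + boulder_frac",
              "presence ~ rugosity + boulder_frac + depth_m"]
for f in candidates:
    m = smf.glm(f, data=df, family=sm.families.Binomial()).fit()
    print(f"AIC={m.aic:8.1f}  {f}")          # choose the lowest-AIC candidate
```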

    Bayesian inference for the information gain model

    One of the most popular paradigms for studying human reasoning is the Wason card selection task. In this task, the participant is presented with four cards and a conditional rule (e.g., “If there is an A on one side of the card, there is always a 2 on the other side”). Participants are asked which cards should be turned to verify whether or not the rule holds. In this simple task, participants consistently provide answers that are incorrect according to formal logic. To account for these errors, several models have been proposed, one of the most prominent being the information gain model (Oaksford & Chater, Psychological Review, 101, 608–631, 1994). This model is based on the assumption that people independently select cards based on the expected information gain of turning a particular card. In this article, we present two estimation methods to fit the information gain model: a maximum likelihood procedure (programmed in R) and a Bayesian procedure (programmed in WinBUGS). We compare the two procedures and illustrate the flexibility of the Bayesian hierarchical procedure by applying it to data from a meta-analysis of the Wason task (Oaksford & Chater, Psychological Review, 101, 608–631, 1994). We also show that the goodness of fit of the information gain model can be assessed by inspecting the posterior predictives of the model. These Bayesian procedures make it easy to apply the information gain model to empirical data. Supplemental materials may be downloaded along with this article from www.springerlink.com.
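    The maximum likelihood procedure can be sketched schematically. This is not the R code distributed with the article: the mapping from expected information gain to selection probability, the gain values, and the counts below are hypothetical, but the structure (independent binomial selections per card) follows the model's core assumption.

```python
# Schematic maximum likelihood fit: selections of the four cards
# (p, not-p, q, not-q) are modeled as independent binomials whose
# selection probabilities increase with expected information gain.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from scipy.stats import binom

info_gain = np.array([0.40, 0.05, 0.25, 0.10])  # hypothetical E[info gain] per card
selected = np.array([180, 30, 120, 50])         # hypothetical selection counts
n_subjects = 200

def neg_log_lik(params):
    a, b = params                               # logistic map: gain -> P(select)
    p = expit(a + b * info_gain)
    return -binom.logpmf(selected, n_subjects, p).sum()

fit = minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
print(fit.x)                                    # fitted intercept and gain sensitivity
```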

    Frailty in primary care: a review of its conceptualization and implications for practice

    Frail, older patients pose a challenge to the primary care physician, who may often feel overwhelmed by their complex presentation and tenuous health status. At the same time, family physicians are ideally suited to incorporate the concept of frailty into their practice. They have the propensity and a skill set that lends itself to patient-centred care, taking into account the individual subtleties of the patient's health within their social context. Tools to identify frailty in the primary care setting are still in the preliminary stages of development. Even so, some practical measures can be taken to recognize frailty in clinical practice and begin to address how its recognition may impact clinical care. This review seeks to address how frailty is recognized and managed, especially in the realm of primary care.

    Apprenticeship Training in Germany – Investment or Productivity Driven?

    The German dual apprenticeship system has come under pressure in recent years because enterprises were not willing to offer a sufficient number of apprenticeship positions. A frequently made argument is that the gap could be closed if more firms were willing to incur net costs during the training period. This paper investigates for the first time whether German enterprises on average indeed incur net costs during the apprenticeship period, i.e., whether the impact of an increase in the share of apprentices on contemporaneous profits is negative. The paper uses the representative linked employer-employee panel data of the IAB (LIAB) and takes into account possible endogeneity of training intensity and unobserved heterogeneity in the profit estimation by employing panel system GMM methods. An increase in the share of apprentices has no effect on profits. This can be interpreted as a first indication that most establishments in Germany do not invest more in apprentices than they recoup through apprentices' productivity during the apprenticeship period.
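    The profit estimation idea can be sketched in simplified form. The sketch below uses a plain within (fixed-effects) estimator on synthetic data rather than the panel system GMM methods the paper employs to handle endogeneity and unobserved heterogeneity; all variable names are hypothetical.

```python
# Regress establishment profits on the apprentice share, absorbing
# establishment fixed effects via the within transformation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic firm-year panel as a stand-in for LIAB-style data; the true
# effect is set to zero, echoing the paper's finding.
rng = np.random.default_rng(2)
firms, years = 300, 6
df = pd.DataFrame({
    "firm_id": np.repeat(np.arange(firms), years),
    "apprentice_share": rng.uniform(0, 0.15, firms * years),
})
firm_effect = np.repeat(rng.normal(0, 1, firms), years)
df["profit"] = firm_effect + 0.0 * df.apprentice_share + rng.normal(0, 1, len(df))

# Within transformation: demean by establishment to absorb fixed effects.
for col in ["profit", "apprentice_share"]:
    df[col + "_dm"] = df[col] - df.groupby("firm_id")[col].transform("mean")

fe = smf.ols("profit_dm ~ apprentice_share_dm - 1", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["firm_id"]})
print(fe.params, fe.pvalues)   # expect a coefficient near zero
```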

    Standardized and reproducible methodology for the comprehensive and systematic assessment of surgical resection margins during breast-conserving surgery for invasive breast cancer

    Background: The primary goal of breast-conserving surgery (BCS) is to completely excise the tumor and achieve "adequate" or "negative" surgical resection margins while maintaining an acceptable level of postoperative cosmetic outcome. Nevertheless, precise determination of the adequacy of BCS has long been debated. The aim of the current paper is to describe a standardized and reproducible methodology for comprehensive and systematic assessment of surgical resection margins during BCS.
    Methods: Retrospective analysis of 204 BCS procedures performed for invasive breast cancer from August 2003 to June 2007, in which patients underwent a standard BCS resection and systematic sampling of nine standardized re-resection margins (superior, superior-medial, superior-lateral, medial, lateral, inferior, inferior-medial, inferior-lateral, and deep-posterior). Multiple variables (including patient, tumor, specimen, and follow-up variables) were evaluated.
    Results: 6.4% (13/204) of patients had positive BCS specimen margins (defined as tumor at the inked edge of the BCS specimen) and 4.4% (9/204) had close margins (defined as tumor within 1 mm of the inked edge but not at it). 11.8% (24/204) of patients had at least one re-resection margin containing additional disease, independent of the status of the BCS specimen margins. 7.1% (13/182) of patients with negative BCS specimen margins (defined as no tumor cells within 1 mm of the inked edge) had at least one re-resection margin containing additional disease. Thus, 54.2% (13/24) of patients with additional disease in a re-resection margin would not have been identified by a standard BCS procedure alone (P < 0.001). The nine standardized re-resection margins represented only 26.8% of the volume and 32.6% of the surface area of the BCS specimen.
    Conclusion: Our methodology accurately assesses the adequacy of surgical resection margins, determining which individuals may need further resection of the affected breast to minimize the risk of local recurrence while limiting the volume of additional breast tissue excised, and which individuals are not realistically amenable to BCS and instead need a completion mastectomy to successfully remove multifocal disease.

    Measurement of cross sections for production of a Z boson in association with a flavor-inclusive or doubly b-tagged large-radius jet in proton-proton collisions at √s = 13 TeV with the ATLAS experiment

    We present measurements of cross sections for production of a leptonically decaying Z boson in association with a large-radius jet in 13 TeV proton-proton collisions at the LHC, using 36 fb⁻¹ of data from the ATLAS detector. Integrated and differential cross sections are measured at particle level in both a flavor-inclusive and a doubly b-tagged fiducial phase space. The large-radius jet mass and transverse momentum, its kinematic relationship to the Z boson, and the angular separation of b-tagged small-radius track jets within the large-radius jet are measured. This measurement constitutes an important test of perturbative quantum chromodynamics in kinematic and flavor configurations relevant to several Higgs boson and beyond-Standard-Model physics analyses. The results highlight issues with modeling of additional hadronic activity in the flavor-inclusive selection, and a distinction between flavor-number schemes in the b-tagged phase space.
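    Schematically, a particle-level fiducial cross section of this kind is extracted from event counts as follows (a standard form with generic symbols, not a formula quoted from the paper):

```latex
% Schematic fiducial cross-section extraction:
\sigma_{\mathrm{fid}} \;=\; \frac{N_{\mathrm{obs}} - N_{\mathrm{bkg}}}{C \cdot \mathcal{L}_{\mathrm{int}}}
% N_obs: selected events; N_bkg: estimated background; C: correction from
% detector level to the particle-level fiducial phase space;
% L_int: integrated luminosity.
```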

    Measurement of substructure-dependent jet suppression in Pb+Pb collisions at 5.02 TeV with the ATLAS detector

    The ATLAS detector at the Large Hadron Collider has been used to measure jet substructure modification and suppression in Pb+Pb collisions at a nucleon–nucleon center-of-mass energy √sNN = 5.02 TeV in comparison with proton–proton (pp) collisions at √s = 5.02 TeV. The Pb+Pb data, collected in 2018, have an integrated luminosity of 1.72 nb⁻¹, while the pp data, collected in 2017, have an integrated luminosity of 260 pb⁻¹. Jets used in this analysis are clustered using the anti-kt algorithm with a radius parameter R = 0.4. The jet constituents, defined by both tracking and calorimeter information, are used to determine the angular scale rg of the first hard splitting inside the jet by reclustering them with the Cambridge–Aachen algorithm and employing the soft-drop grooming technique. The nuclear modification factor, RAA, used to characterize jet suppression in Pb+Pb collisions, is presented differentially in rg, jet transverse momentum, and intervals of collision centrality. The RAA value is observed to depend significantly on jet rg: jets produced with the largest measured rg are found to be twice as suppressed as those with the smallest rg in central Pb+Pb collisions. The RAA values do not exhibit a strong variation with jet pT in any of the rg intervals. The rg and pT dependence of jet RAA is qualitatively consistent with a picture of jet quenching arising from coherence and provides the most direct evidence in support of this approach.
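    For reference, the nuclear modification factor has the conventional definition below (a standard formula, not quoted from the paper):

```latex
% Jet nuclear modification factor:
R_{AA} \;=\; \frac{\left.\dfrac{1}{N_{\mathrm{evt}}}\,\dfrac{\mathrm{d}N_{\mathrm{jet}}}{\mathrm{d}p_T}\right|_{\mathrm{Pb+Pb}}}
                  {\langle T_{AA}\rangle \,\left.\dfrac{\mathrm{d}\sigma_{\mathrm{jet}}}{\mathrm{d}p_T}\right|_{pp}}
% <T_AA>: mean nuclear thickness function for the centrality interval,
% so R_AA = 1 in the absence of nuclear modification.
```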

    Anomaly detection search for new resonances decaying into a Higgs boson and a generic new particle X in hadronic final states using √s = 13 TeV pp collisions with the ATLAS detector

    A search is presented for a heavy resonance Y decaying into a Standard Model Higgs boson H and a new particle X in a fully hadronic final state. The full Large Hadron Collider Run 2 dataset of proton-proton collisions at √s = 13 TeV, collected by the ATLAS detector from 2015 to 2018, is used and corresponds to an integrated luminosity of 139 fb⁻¹. The search targets the high Y-mass region, where the H and X have a significant Lorentz boost in the laboratory frame. A novel application of anomaly detection is used to define a general signal region, where events are selected solely because of their incompatibility with a learned background-only model. It is constructed using a jet-level tagger for signal-model-independent selection of the boosted X particle, representing the first application of fully unsupervised machine learning to an ATLAS analysis. Two additional signal regions are implemented to target a benchmark X decay into two quarks, covering topologies where the X is reconstructed as either a single large-radius jet or two small-radius jets. The analysis selects Higgs boson decays into bb̄, and a dedicated neural-network-based tagger provides sensitivity to the boosted heavy-flavor topology. No significant excess of data over the expected background is observed, and the results are presented as upper limits on the production cross section σ(pp → Y → XH → qq̄bb̄) for signals with mY between 1.5 and 6 TeV and mX between 65 and 3000 GeV.
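    The anomaly detection strategy (selecting events solely for their incompatibility with a learned background-only model) can be illustrated schematically. The sketch below uses scikit-learn's IsolationForest on synthetic jet features as a stand-in for the analysis's unsupervised jet-level tagger; all features and thresholds are hypothetical.

```python
# Learn a background-only model from background-dominated jets, then flag
# candidate jets that are least compatible with it.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Hypothetical jet features: (mass, n_constituents, substructure variable).
background_jets = rng.normal([80.0, 30.0, 0.5], [20.0, 8.0, 0.1],
                             size=(50_000, 3))

model = IsolationForest(n_estimators=200, random_state=0).fit(background_jets)

candidate_jets = rng.normal([170.0, 12.0, 0.2], [15.0, 4.0, 0.05],
                            size=(100, 3))
scores = model.score_samples(candidate_jets)      # lower = more anomalous
signal_region = candidate_jets[scores < np.quantile(scores, 0.1)]
```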