
    40Ar/39Ar ages of lunar impact glasses: Relationships among Ar diffusivity, chemical composition, shape, and size

    Lunar impact glasses, quenched melts produced during cratering events on the Moon, have the potential to provide not only compositional information about both the local and regional geology of the Moon but also information about the impact flux over time. We present in this paper the results of 73 new 40Ar/39Ar analyses of well-characterized, inclusion-free lunar impact glasses and demonstrate that size, shape, chemical composition, fraction of radiogenic 40Ar retained, and cosmic ray exposure (CRE) ages are important for 40Ar/39Ar investigations of these samples. Specifically, analyses of lunar impact glasses from the Apollo 14, 16, and 17 landing sites indicate that retention of radiogenic 40Ar is a strong function of post-formation thermal history in the lunar regolith, size, and chemical composition. Based on the relationships presented in this paper, lunar impact glasses with compositions and sizes sufficient to have retained 90% of their radiogenic Ar during 750 Ma of cosmic ray exposure at time-integrated temperatures of up to 290 K have been identified and are likely to have yielded reliable 40Ar/39Ar ages of formation. Additionally, ~50% of the identified impact glass spheres have formation ages of <500 Ma, while ~75% of the identified lunar impact glass shards and spheres have ages of formation <2000 Ma. The observed age-frequency distribution of lunar impact glasses may reflect two processes: (i) diminished preservation due to spontaneous shattering with age; and (ii) preservation of a remnant population of impact glasses from the tail end of the terminal lunar bombardment having 40Ar/39Ar ages up to 3800 Ma. A protocol is described for selecting and analyzing lunar impact glasses. Comment: Please contact Zellner ([email protected]) for data tables and other supplemental information.
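The ages discussed above come from the standard 40Ar/39Ar age equation, which converts a measured radiogenic 40Ar*/39Ar ratio and an irradiation parameter J into a formation age. A minimal sketch (the function name and the example values are illustrative, not taken from the paper):

```python
import math

LAMBDA_40K = 5.543e-10  # total decay constant of 40K, in 1/yr (standard value)

def ar_ar_age_ma(ratio_40_39, j_factor):
    """Standard 40Ar/39Ar age equation, t = (1/lambda) * ln(1 + J * 40Ar*/39Ar),
    returned in millions of years (Ma)."""
    return math.log(1.0 + j_factor * ratio_40_39) / LAMBDA_40K / 1e6
```

For instance, a radiogenic 40Ar*/39Ar ratio of 10 with J = 0.01 gives roughly 172 Ma; a ratio biased low by diffusive loss of radiogenic Ar (the retention issue discussed above) translates directly into an apparent age that is too young.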

    Harold Jeffreys's Theory of Probability Revisited

    Published exactly seventy years ago, Jeffreys's Theory of Probability (1939) has had a unique impact on the Bayesian community and is now considered to be one of the main classics in Bayesian Statistics as well as the initiator of the objective Bayes school. In particular, its advances on the derivation of noninformative priors as well as on the scaling of Bayes factors have had a lasting impact on the field. However, the book reflects the characteristics of the time, especially in terms of mathematical rigor. In this paper we point out the fundamental aspects of this reference work, especially the thorough coverage of testing problems and the construction of both estimation and testing noninformative priors based on functional divergences. Our major aim here is to help modern readers navigate this difficult text and concentrate on passages that are still relevant today. Comment: This paper commented in: [arXiv:1001.2967], [arXiv:1001.2968], [arXiv:1001.2970], [arXiv:1001.2975], [arXiv:1001.2985], [arXiv:1001.3073]. Rejoinder in [arXiv:0909.1008]. Published at http://dx.doi.org/10.1214/09-STS284 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org)

    Mindfulness-Based Cognitive Approach for Seniors (MBCAS): Program Development and Implementation

    © The Author(s) 2013. This article is published with open access at Springerlink.com. A number of cognitive interventions have been developed to enhance cognitive functioning in the growing population of the elderly. We describe the Mindfulness-Based Cognitive Approach for Seniors (MBCAS), a new training program designed especially for seniors. It was conceived in the context of self-development for seniors who wish to enhance their relationship with their inner and outer selves in order to navigate their aging process more easily and fluently. Physical and psychosocial problems related to aging, as well as some temporal issues, were taken into account in developing this program. Unlike clinically oriented mindfulness-based programs, which are generally delivered during an 8-week period, the MBCAS training program is presented over a period of 8 months. The main objectives of this program are to teach seniors to observe current experiences with nonjudgmental awareness, to identify automatic behaviors or reactions to current experiences that are potentially nonadaptive, and to enhance and reinforce positive coping with typical difficulties that they face in their daily lives. Details of the program development and initial implementation are presented, with suggestions for evaluating the program's effectiveness.

    Instrumental Variables, Errors in Variables, and Simultaneous Equations Models: Applicability and Limitations of Direct Monte Carlo

    A Direct Monte Carlo (DMC) approach is introduced for posterior simulation in the Instrumental Variables (IV) model with one possibly endogenous regressor, multiple instruments, and Gaussian errors under a flat prior. This DMC method can also be applied in an IV model (with one or multiple instruments) under an informative prior for the endogenous regressor's effect. This DMC approach cannot be applied to more complex IV models or Simultaneous Equations Models with multiple endogenous regressors. An Approximate DMC (ADMC) approach is therefore introduced that makes use of the proposed Hybrid Mixture Sampling (HMS) method, which facilitates Metropolis-Hastings (MH) or Importance Sampling from a proper marginal posterior density with highly non-elliptical shapes that tends to infinity at a point of singularity. After one has simulated from the irregularly shaped marginal distribution using the HMS method, one easily samples the other parameters from their conditional Student-t and Inverse-Wishart posteriors. An example illustrates the close approximation and high MH acceptance rate, whereas using a simple candidate distribution such as the Student-t may lead to an infinite variance of the Importance Sampling weights. The choice between the IV model and a simple linear model under the restriction of exogeneity may be based on predictive likelihoods, for which the efficient simulation of all model parameters may be quite useful. In future work the ADMC approach may be extended to more extensive IV models such as IV with non-Gaussian errors, panel IV, or probit/logit IV.
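As a generic illustration of the Metropolis-Hastings step mentioned above (a sketch of a standard independence MH sampler with a Student-t candidate, not the paper's HMS method), the following uses a toy bimodal target; the target, scale, and tuning values are all hypothetical:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def log_target(x):
    # Toy non-elliptical target: a two-component Gaussian mixture standing in
    # for an irregularly shaped marginal posterior (illustration only).
    return np.logaddexp(stats.norm.logpdf(x, -2.0, 0.5),
                        stats.norm.logpdf(x, 3.0, 1.0))

def mh_student_t(n_iter=5000, df=3, scale=2.0):
    """Independence Metropolis-Hastings with a heavy-tailed Student-t candidate."""
    x, accepted, draws = 0.0, 0, []
    for _ in range(n_iter):
        prop = scale * rng.standard_t(df)
        # log acceptance ratio: target(prop)*q(x) / (target(x)*q(prop))
        log_ratio = (log_target(prop) - log_target(x)
                     - stats.t.logpdf(prop, df, scale=scale)
                     + stats.t.logpdf(x, df, scale=scale))
        if np.log(rng.uniform()) < log_ratio:
            x, accepted = prop, accepted + 1
        draws.append(x)
    return np.array(draws), accepted / n_iter
```

The heavy tails of the Student-t candidate keep the MH acceptance ratio well behaved here; as the abstract notes, however, the same simple candidate used for Importance Sampling on the irregular posteriors considered in the paper can still yield weights with infinite variance.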

    Earth-Moon Impacts at ~300 Ma and ~500 Ma Ago

    Impact events have played an important role in the evolution of planets and small bodies in the Solar System. Meteorites, lunar melt rocks, and lunar impact glasses provide important information about the geology of the parent body and the age of the impacting episodes. Over 2400 impact glasses from 4 Apollo regolith samples have been geochemically analyzed and a subset has been dated by the 40Ar/39Ar method. New results, consistent with 2 break-ups in the Asteroid Belt, are presented here. Our previous study reported that 40Ar/39Ar ages from 9 impact glasses showed that the Moon experienced significant impacts at approx. 800 Ma and at approx. 3800 Ma ago, somewhere in the vicinity of the Apollo 16 landing site. Additionally, ages around 800 Ma were reported for Apollo 12 samples, together implying global bombardment events. New data on 7 glasses from regolith sample 66041,127 show that the Moon also experienced impact events at approx. 300 Ma and > 500 Ma ago, which may coincide with the break-ups in the Asteroid Belt of the L- and H-chondrite parent bodies. Since meteoritic evidence for these break-ups has been found on Earth, it follows that evidence should be found in lunar samples as well. Additional information is included in the original extended abstract.

    Maximum Entropy and Bayesian Data Analysis: Entropic Priors

    The problem of assigning probability distributions which objectively reflect the prior information available about experiments is one of the major stumbling blocks in the use of Bayesian methods of data analysis. In this paper the method of Maximum (relative) Entropy (ME) is used to translate the information contained in the known form of the likelihood into a prior distribution for Bayesian inference. The argument is inspired and guided by intuition gained from the successful use of ME methods in statistical mechanics. For experiments that cannot be repeated the resulting "entropic prior" is formally identical with the Einstein fluctuation formula. For repeatable experiments, however, the expected value of the entropy of the likelihood turns out to be relevant information that must be included in the analysis. The important case of a Gaussian likelihood is treated in detail. Comment: 23 pages, 2 figures

    An Adaptive Interacting Wang-Landau Algorithm for Automatic Density Exploration

    While statisticians are well-accustomed to performing exploratory analysis in the modeling stage of an analysis, the notion of conducting preliminary general-purpose exploratory analysis in the Monte Carlo stage (or more generally, the model-fitting stage) of an analysis is an area which we feel deserves much further attention. Towards this aim, this paper proposes a general-purpose algorithm for automatic density exploration. The proposed exploration algorithm combines and expands upon components from various adaptive Markov chain Monte Carlo methods, with the Wang-Landau algorithm at its heart. Additionally, the algorithm is run on interacting parallel chains -- a feature which both decreases computational cost and stabilizes the algorithm, improving its ability to explore the density. Performance is studied in several applications. Through a Bayesian variable selection example, the authors demonstrate the convergence gains obtained with interacting chains. The ability of the algorithm's adaptive proposal to induce mode-jumping is illustrated through a trimodal density and a Bayesian mixture modeling application. Lastly, through a 2D Ising model, the authors demonstrate the ability of the algorithm to overcome the high correlations encountered in spatial models. Comment: 33 pages, 20 figures (the supplementary materials are included as appendices)
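For readers unfamiliar with the Wang-Landau algorithm at the heart of the proposal, a minimal single-chain sketch on a toy discrete state space is given below. The energy landscape, flatness criterion, and tuning constants are illustrative only; the paper's version is adaptive, interacting, and works on continuous densities.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy discrete "energy" landscape over 10 states; Wang-Landau estimates
# log g(E), the log-density of states, by penalizing already-visited energies.
energies = np.array([0, 1, 2, 3, 4, 4, 3, 2, 1, 0])
n_levels = energies.max() + 1

def wang_landau(flat_tol=0.8, f_final=1e-4):
    log_g = np.zeros(n_levels)   # running log density-of-states estimate
    log_f = 1.0                  # modification factor, halved at each flat histogram
    hist = np.zeros(n_levels)
    state = 0
    while log_f > f_final:
        prop = rng.integers(len(energies))   # uniform random-state proposal
        # Accept with prob min(1, g(E_old)/g(E_new)): flattens the energy histogram
        if np.log(rng.uniform()) < log_g[energies[state]] - log_g[energies[prop]]:
            state = prop
        log_g[energies[state]] += log_f
        hist[energies[state]] += 1
        # Flat-histogram check: refine the modification factor and reset counts
        if hist.min() > flat_tol * hist.mean():
            log_f /= 2.0
            hist[:] = 0
    return log_g - log_g[0]      # normalize so level 0 has log g = 0
```

Halving `log_f` at each flat histogram is the classic refinement schedule; the adaptive interacting version in the paper replaces the single chain with parallel chains that share the running estimate.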

    Dose escalation improves therapeutic outcome: post hoc analysis of data from a 12-week, multicentre, double-blind, parallel-group trial of trospium chloride in patients with urinary urge incontinence

    <p>Abstract</p> <p>Background</p> <p>Flexible dosing of anticholinergics used for overactive bladder (OAB) treatment is a useful strategy in clinical practice for achieving a maximum effective and maximum tolerated level of therapeutic benefit. In this post hoc analysis we evaluated the efficacy and tolerability of trospium chloride treatment for urinary urge incontinence (UUI) with a focus on flexible dosing.</p> <p>Methods</p> <p>The data came from a 12-week, randomised, double-blind, phase IIIb study in which 1658 patients with urinary frequency plus urge incontinence received trospium chloride 15 mg TID (n = 828) or 2.5 mg oxybutynin hydrochloride TID (n = 830). After four weeks, daily doses were doubled, and not readjusted until the end of treatment, in 29.2% (242/828) of patients in the trospium group and in 23.3% (193/830) in the oxybutynin group. We assessed the absolute reduction in weekly UUI episodes and the change in intensity of dry mouth, recorded in patients' micturition diaries. Adverse events were also evaluated. Statistics were descriptive.</p> <p>Results</p> <p>Dose escalation of either trospium or oxybutynin increased the reduction in UUI episodes in the population studied. At study end, there were no relevant differences between the "dose adjustment" subgroups and the respective "no dose adjustment" subgroups (trospium: <it>P </it>= 0.249; oxybutynin: <it>P </it>= 0.349). After dose escalation, worsening of dry mouth was higher in both dose-adjusted subgroups compared to the respective "no dose adjustment" subgroups (<it>P </it>< 0.001). Worsening of dry mouth was lower in the trospium groups than in the oxybutynin groups (<it>P </it>< 0.001). 
Adverse events were increased in the dose-adjusted subgroups.</p> <p>Conclusions</p> <p>Flexible dosing of trospium proved to be as effective as, but better tolerated than, the officially approved adjusted dose of oxybutynin.</p> <p>Trial registration (parent study)</p> <p>The study was registered with the German Federal Institute for Drugs and Medical Devices (BfArM, Berlin, Germany), registration number 4022383, as required at the time of planning this study.</p>

    Simulation study for analysis of binary responses in the presence of extreme case problems

    <p>Abstract</p> <p>Background</p> <p>Estimates of variance components for binary responses in the presence of extreme case problems tend to be biased due to an under-identified likelihood. The bias persists even when a normal prior is used for the fixed effects.</p> <p>Methods</p> <p>A simulation study was carried out to investigate methods for the analysis of binary responses with extreme case problems. A linear mixed model that included a fixed effect and random effects of sire and residual on the liability scale was used to generate binary data. Five simulation scenarios were conducted based on varying percentages of extreme case problems, with true values of heritability equal to 0.07 and 0.17. Five replicates of each dataset were generated and analyzed with a generalized prior (<b>g-prior</b>) of varying weight.</p> <p>Results</p> <p>Point estimates of sire variance using a normal prior were severely biased when the percentage of extreme case problems was greater than 30%. Depending on the percentage of extreme case problems, the sire variance was overestimated when a normal prior was used by 36 to 102% and 25 to 105% for a heritability of 0.17 and 0.07, respectively. When a g-prior was used, the bias was reduced and even eliminated, depending on the percentage of extreme case problems and the weight assigned to the g-prior. The lowest Pearson correlations between true and estimated fixed effects were obtained when a normal prior was used. When a 15% g-prior was used instead of a normal prior with a heritability equal to 0.17, Pearson correlations between true and estimated fixed effects increased by 11, 20, 23, 27, and 60% for 5, 10, 20, 30 and 75% of extreme case problems, respectively. Furthermore, Pearson correlations between true and estimated fixed effects were similar, within datasets of varying percentages of extreme case problems, when a 5, 10, or 15% g-prior was included. 
This indicates that a model with a g-prior provides a more adequate estimation of fixed effects.</p> <p>Conclusions</p> <p>The results suggest that when analyzing binary data with extreme case problems, bias in the estimation of variance components could be eliminated, or at least significantly reduced, by using a g-prior.</p>
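The liability-threshold data-generating model described in the Methods can be sketched as follows; this is a generic sire-model simulation in which the sample sizes, fixed-effect levels, threshold, and seed are illustrative, not the paper's actual design:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_binary(n_sires=50, daughters_per_sire=20, h2=0.17, mu=-1.0):
    """Simulate binary records from a liability threshold model:
    liability = mu + fixed effect + sire effect + residual,
    record = 1 if liability exceeds 0.  Under a sire model the sire
    variance is h2/4 of the (unit) total liability variance."""
    var_sire = h2 / 4.0
    var_res = 1.0 - var_sire
    fixed = np.array([0.0, 0.5])              # two hypothetical fixed-effect levels
    sires = rng.normal(0.0, np.sqrt(var_sire), n_sires)
    records = []
    for s in sires:
        lvl = rng.integers(0, 2, daughters_per_sire)   # fixed-effect level per record
        liab = (mu + fixed[lvl] + s
                + rng.normal(0.0, np.sqrt(var_res), daughters_per_sire))
        records.append((liab > 0.0).astype(int))
    return np.concatenate(records)
```

An "extreme case problem" then corresponds to a fixed-effect level (or other subclass) in which every simulated record happens to be all 0s or all 1s, which is what under-identifies the likelihood discussed above.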