
    Approximation of Bayesian Hawkes process with inlabru

    Hawkes processes are very popular mathematical tools for modeling phenomena exhibiting self-exciting or self-correcting behavior. Typical examples are earthquake occurrences, wildfires, droughts, capture-recapture studies, violent crime, trade exchanges, and social network activity. The widespread use of Hawkes processes in different fields calls for fast, reproducible, reliable, easy-to-code techniques to implement such models. We offer a technique to perform approximate Bayesian inference of Hawkes process parameters based on the use of the R-package inlabru. The inlabru R-package, in turn, relies on the INLA methodology to approximate the posterior of the parameters. Our Hawkes process approximation is based on a decomposition of the log-likelihood into three parts, which are linearly approximated separately. The linear approximation is performed with respect to the mode of the parameters' posterior distribution, which is determined with an iterative gradient-based method. The approximation of the posterior parameters is therefore deterministic, ensuring full reproducibility of the results. The proposed technique only requires the user to provide the functions to calculate the different parts of the decomposed likelihood, which are internally linearly approximated by the R-package inlabru. We provide a comparison with the bayesianETAS R-package, which is based on an MCMC method. The two techniques provide similar results, but our approach requires two to ten times less computational time to converge, depending on the amount of data.
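    The kind of log-likelihood decomposition the abstract refers to can be illustrated with a minimal sketch (Python rather than R, with a simple exponential triggering kernel; the function name and the exact grouping into parts are illustrative assumptions, not the inlabru API):

    ```python
    import numpy as np

    def hawkes_loglik_parts(times, T, mu, alpha, beta):
        """Split a temporal Hawkes log-likelihood into three additive parts
        (exponential kernel g(s) = alpha * exp(-beta * s) assumed):
          part1: integral of the constant background rate, -mu * T
          part2: integral of the triggering terms over [t_i, T]
          part3: sum of log-intensities evaluated at the event times
        """
        times = np.asarray(times, dtype=float)
        part1 = -mu * T
        part2 = -np.sum(alpha / beta * (1.0 - np.exp(-beta * (T - times))))
        part3 = 0.0
        for i, t in enumerate(times):
            # intensity at event i: background plus excitation from earlier events
            excitation = np.sum(alpha * np.exp(-beta * (t - times[:i])))
            part3 += np.log(mu + excitation)
        return part1, part2, part3

    # The full log-likelihood is the sum of the three parts; each part could
    # then be linearly approximated separately, as the abstract describes.
    parts = hawkes_loglik_parts([0.5, 1.2, 3.0], T=5.0, mu=0.8, alpha=0.5, beta=1.0)
    loglik = sum(parts)
    ```

    Splitting the likelihood this way is what lets each term be handled by a separate linear approximation around the posterior mode.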

    Bayesian modelling of the temporal evolution of seismicity using the ETAS.inlabru package

    The epidemic type aftershock sequence (ETAS) model is widely used to model seismic sequences and underpins operational earthquake forecasting (OEF). However, it remains challenging to assess the reliability of inverted ETAS parameters, for numerous reasons. For example, the most common algorithms return only point estimates with little quantification of uncertainty. At the same time, Bayesian Markov chain Monte Carlo implementations remain slow to run and do not scale well, and few have been extended to include spatial structure. This makes it difficult to explore the effects of stochastic uncertainty. Here, we present a new approach to ETAS modeling using an alternative Bayesian method, the integrated nested Laplace approximation (INLA). We have implemented this model in a new R-package called ETAS.inlabru, which is built on the R packages R-INLA and inlabru. Our study has included extending these packages, which provided tools for modeling log-Gaussian Cox processes, to include the self-exciting Hawkes process, of which ETAS is a special case. While we present only the temporal component here, the model scales to a spatio-temporal model and may include a variety of spatial covariates. This is a fast method that returns joint posteriors on the ETAS background and triggering parameters. Using a series of synthetic case studies, we explore the robustness of ETAS inversions using this method of inversion. We also include runnable notebooks to reproduce the figures in this article as part of the package's GitHub repository. We demonstrate that reliable estimates of the model parameters require that the catalog data contain periods of relative quiescence as well as triggered sequences. We explore the robustness of the method under stochastic uncertainty in the training data and show that the method is robust to a wide range of starting conditions.
We show how the inclusion of historic earthquakes prior to the modeled time window affects the quality of the inversion. Finally, we show that rate-dependent incompleteness of earthquake catalogs after large earthquakes has a significant and detrimental effect on the ETAS posteriors. We believe that the speed of the inlabru inversion, which includes a rigorous estimation of uncertainty, will enable a deeper exploration of how to use ETAS robustly for seismicity modeling and operational earthquake forecasting.
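    For background, the temporal ETAS conditional intensity combines a constant background rate with Omori-type aftershock triggering. A minimal sketch (Python rather than R; this is not the ETAS.inlabru interface, and the parameter names are only the conventional ones):

    ```python
    import numpy as np

    def etas_intensity(t, history_times, history_mags, mu, K, alpha, c, p, M0):
        """Temporal ETAS conditional intensity:
            lambda(t) = mu + sum_{t_i < t} K * exp(alpha * (m_i - M0)) / (t - t_i + c)**p
        mu: background rate; K, alpha: aftershock productivity;
        c, p: Omori decay parameters; M0: magnitude of completeness.
        """
        ts = np.asarray(history_times, dtype=float)
        ms = np.asarray(history_mags, dtype=float)
        mask = ts < t  # only past events contribute
        trig = K * np.exp(alpha * (ms[mask] - M0)) / (t - ts[mask] + c) ** p
        return mu + trig.sum()

    # Intensity 10 days into a catalog with two prior events (values illustrative):
    rate = etas_intensity(
        t=10.0,
        history_times=[1.0, 5.0],
        history_mags=[4.5, 5.2],
        mu=0.1, K=0.05, alpha=1.5, c=0.01, p=1.1, M0=3.0,
    )
    ```

    The "background and triggering parameters" whose joint posteriors the method returns are mu on one hand and (K, alpha, c, p) on the other.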

    Ranking earthquake forecasts using proper scoring rules: binary events in a low probability environment

    Operational earthquake forecasting for risk management and communication during seismic sequences depends on our ability to select an optimal forecasting model. To do this, we need to compare the performance of competing models in prospective experiments, and to rank their performance according to the outcome using a fair, reproducible, and reliable method, usually in a low-probability environment. The Collaboratory for the Study of Earthquake Predictability conducts prospective earthquake forecasting experiments around the globe. In this framework, it is crucial that the metrics used to rank the competing forecasts are ‘proper’, meaning that, on average, they prefer the data-generating model. We prove that the Parimutuel Gambling score, proposed, and in some cases applied, as a metric for comparing probabilistic seismicity forecasts, is in general ‘improper’. In the special case where it is proper, we show it can still be used improperly. We demonstrate these conclusions both analytically and graphically, providing a set of simulation-based techniques that can be used to assess whether a score is proper or not. They require only a data-generating model and at least two forecasts to be compared. We compare the Parimutuel Gambling score’s performance with two commonly used proper scores (the Brier and logarithmic scores), using confidence intervals to account for the uncertainty around the observed score difference. We suggest that using confidence intervals enables a rigorous approach to distinguishing between the predictive skills of candidate forecasts, in addition to their rankings. Our analysis shows that the Parimutuel Gambling score is biased, and that the direction of the bias depends on the forecasts taking part in the experiment. Our findings suggest that the Parimutuel Gambling score should not be used to distinguish between multiple competing forecasts, and that care should be taken in the case where only two are being compared.
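    The two proper scores the abstract mentions have simple closed forms for binary events; a minimal sketch (Python; the forecast probabilities below are illustrative only, not from the paper):

    ```python
    import math

    def brier_score(p, outcome):
        """Brier score for a binary event with forecast probability p;
        outcome is 0 or 1; lower is better."""
        return (p - outcome) ** 2

    def log_score(p, outcome):
        """Logarithmic score: negative log-probability assigned to the
        observed outcome; lower is better."""
        return -math.log(p if outcome == 1 else 1.0 - p)

    # Two competing forecasts of a low-probability event that did not occur:
    p_a, p_b, outcome = 0.02, 0.20, 0
    scores = {
        "brier": (brier_score(p_a, outcome), brier_score(p_b, outcome)),
        "log": (log_score(p_a, outcome), log_score(p_b, outcome)),
    }
    ```

    Because both scores are proper, the forecast closer to the truth scores better on average over repeated events; the paper's point is that the Parimutuel Gambling score lacks this guarantee in general.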

    Structural biology and phylogenetic estimation

    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/62633/1/388527a0.pd

    Long-term declines in ADLs, IADLs, and mobility among older Medicare beneficiaries

    Background: Most prior studies have focused on short-term (≤ 2 years) functional declines. But those studies cannot address aging effects inasmuch as all participants have aged the same amount. Therefore, the authors studied the extent of long-term functional decline in older Medicare beneficiaries who were followed for varying lengths of time, and the authors also identified the risk factors associated with those declines.
    Methods: The analytic sample included 5,871 self- or proxy-respondents who had complete baseline and follow-up survey data that could be linked to their Medicare claims for 1993-2007. Functional status was assessed using activities of daily living (ADLs), instrumental ADLs (IADLs), and mobility limitations, with declines defined as the development of two or more new difficulties. Multiple logistic regression analysis was used to focus on the associations involving respondent status, health lifestyle, continuity of care, managed care status, health shocks, and terminal drop.
    Results: The average amount of time between the first and final interviews was 8.0 years. Declines were observed for 36.6% on ADL abilities, 32.3% on IADL abilities, and 30.9% on mobility abilities. Functional decline was more likely to occur when proxy-reports were used, and the effects of baseline function on decline were reduced when proxy-reports were used. Engaging in vigorous physical activity consistently and substantially protected against functional decline, whereas obesity, cigarette smoking, and alcohol consumption were only associated with mobility declines. Post-baseline hospitalizations were the most robust predictors of functional decline, exhibiting a dose-response effect such that the greater the average annual number of hospital episodes, the greater the likelihood of functional status decline. Participants whose final interview preceded their death by one year or less had substantially greater odds of functional status decline.
    Conclusions: Both the additive and interactive (with functional status) effects of respondent status should be taken into consideration whenever proxy-reports are used. Encouraging exercise could broadly reduce the risk of functional decline across all three outcomes, although interventions encouraging weight reduction and smoking cessation would only affect mobility declines. Reducing hospitalization and re-hospitalization rates could also broadly reduce the risk of functional decline across all three outcomes.

    Dentifrices, mouthwashes, and remineralization/caries arrestment strategies

    While our knowledge of the dental caries process and its prevention has greatly advanced over the past fifty years, it is fair to state that the management of this disease at the level of the individual patient remains largely empirical. Recommendations for fluoride use by patients at different levels of caries risk are based mainly on the adage that more is better. There is a general understanding that the fluoride compound, concentration, frequency of use, duration of exposure, and method of delivery can influence fluoride efficacy. Two important factors are (1) the initial interaction of relatively high concentrations of fluoride with the tooth surface and plaque during application and (2) the retention of fluoride in oral fluids after application.

    The LUX-ZEPLIN (LZ) Experiment

    We describe the design and assembly of the LUX-ZEPLIN experiment, a direct detection search for cosmic WIMP dark matter particles. The centerpiece of the experiment is a large liquid xenon time projection chamber sensitive to low-energy nuclear recoils. Rejection of backgrounds is enhanced by a Xe skin veto detector and by a liquid scintillator Outer Detector loaded with gadolinium for efficient neutron capture and tagging. LZ is located in the Davis Cavern at the 4850' level of the Sanford Underground Research Facility in Lead, South Dakota, USA. We describe the major subsystems of the experiment and its key design features and requirements.