Bayesian splines versus fractional polynomials in network meta-analysis
BACKGROUND: Network meta-analysis (NMA) provides a powerful tool for the simultaneous evaluation of multiple treatments by combining evidence from different studies, allowing for direct and indirect comparisons between treatments. In recent years, NMA has become increasingly popular in the medical literature, and the underlying statistical methodologies are evolving in both the frequentist and the Bayesian framework. Traditional NMA models are often based on the comparison of two treatment arms per study. These individual studies may measure outcomes at multiple time points that are not necessarily homogeneous across studies. METHODS: In this article we present a Bayesian model based on B-splines for the simultaneous analysis of outcomes across time points, which allows for indirect comparison of treatments across different longitudinal studies. RESULTS: We illustrate the proposed approach in simulations as well as on real data examples available in the literature, and compare it with a model based on P-splines and one based on fractional polynomials, showing that our approach is flexible and overcomes the limitations of the latter. CONCLUSIONS: The proposed approach is computationally efficient and able to accommodate a large class of temporal treatment effect patterns, allowing for direct and indirect comparisons of widely varying shapes of longitudinal profiles.
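As a rough illustration of the modelling idea (not the authors' implementation), the sketch below builds a cubic B-spline design matrix over follow-up time; in a setting like the one described, study-specific treatment-effect curves would be written as linear combinations of such basis functions with Bayesian priors on the coefficients. The knot placement and the SciPy-based construction are assumptions made only for this example.

```python
# Minimal sketch: cubic B-spline basis for a treatment-effect curve over time.
# Knot choice and the SciPy construction are illustrative assumptions, not the
# authors' code.
import numpy as np
from scipy.interpolate import BSpline

degree = 3
interior_knots = np.array([8.0, 16.0, 32.0])   # weeks; arbitrary for the example
t_min, t_max = 0.0, 52.0
# Full knot vector with repeated boundary knots, as required for B-splines.
knots = np.concatenate(([t_min] * (degree + 1), interior_knots, [t_max] * (degree + 1)))
n_basis = len(knots) - degree - 1

time_grid = np.linspace(t_min, t_max, 200)
# Evaluating a BSpline with identity coefficients returns every basis function.
basis = BSpline(knots, np.eye(n_basis), degree)(time_grid)   # shape (200, n_basis)

# A treatment-effect curve is then basis @ beta, with a prior on beta in the
# Bayesian model (e.g. a random-walk or shrinkage prior).
beta = np.random.default_rng(0).normal(size=n_basis)
effect_curve = basis @ beta
print(basis.shape, effect_curve.shape)
```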
Bayesian Autoregressive Frailty Models for Inference in Recurrent Events
We propose autoregressive Bayesian semi-parametric models for gap times between recurrent events. The aim is two-fold: inference on the effect of possibly time-varying covariates on the gap times, and clustering of individuals based on the time trajectory of the recurrent event. Time dependence between gap times is taken into account through the specification of an autoregressive component for the frailty parameters influencing the response at different times. The order of the autoregression may be assumed unknown and is itself an object of inference. We consider two alternative approaches to perform model selection under this scenario. Covariates are easily included in the regression framework, and censoring and missing data are readily accounted for. As the proposed methodologies lie within the class of Dirichlet process mixtures, posterior inference can be performed through efficient MCMC algorithms. We illustrate the approach through simulations and medical applications involving recurrent hospitalizations of cancer patients and successive urinary tract infections.
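For intuition only, the toy simulation below generates gap times whose log-frailties follow an AR(1) process, one simple instance of the autoregressive-frailty idea described above. The Weibull gap-time distribution, the fixed AR order of one, and all parameter values are assumptions for illustration; the paper's model is a Dirichlet process mixture in which the AR order itself is inferred.

```python
# Toy simulation of gap times with AR(1) log-frailties (illustrative only).
import numpy as np

rng = np.random.default_rng(42)
n_subjects, n_events = 50, 6
rho, sigma = 0.7, 0.5            # assumed AR(1) coefficient and innovation sd
beta = np.array([0.3, -0.2])     # assumed covariate effects
shape = 1.5                      # assumed Weibull shape for gap times

gap_times = np.empty((n_subjects, n_events))
for i in range(n_subjects):
    x = rng.normal(size=(n_events, 2))                 # time-varying covariates
    w = rng.normal(scale=sigma / np.sqrt(1 - rho**2))  # stationary start
    for j in range(n_events):
        if j > 0:
            w = rho * w + rng.normal(scale=sigma)      # AR(1) evolution of log-frailty
        # Weibull gap time whose rate is scaled by the frailty and covariates.
        rate = np.exp(w + x[j] @ beta)
        gap_times[i, j] = rng.weibull(shape) / rate

print(gap_times.shape, gap_times.mean())
```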
Bayesian Deconvolution and Quantification of Metabolites from J-Resolved NMR Spectroscopy
Two-dimensional (2D) nuclear magnetic resonance (NMR) methods have become increasingly popular in metabolomics, since they have considerable potential to accurately identify and quantify metabolites within complex biological samples. 2D ¹H J-resolved (JRES) NMR spectroscopy is a widely used method that expands overlapping resonances into a second dimension. However, existing analytical processing methods do not fully exploit the information in the JRES spectrum and, more importantly, do not provide measures of uncertainty associated with the estimates of quantities of interest, such as metabolite concentration. Combining the data-generating mechanisms and the extensive prior knowledge available in online databases, we develop a Bayesian method to analyse 2D JRES data, which allows for automatic deconvolution, identification and quantification of metabolites. The model extends and improves previous work on one-dimensional NMR spectral data. Our approach is based on a combination of B-spline tight wavelet frames and theoretical templates, and thus enables the automatic incorporation of expert knowledge within the inferential framework. Posterior inference is performed through specially devised Markov chain Monte Carlo methods. We demonstrate the performance of our approach via analyses of datasets from serum and urine, showing the advantages of our proposed approach in terms of identification and quantification of metabolites.
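To convey the flavour of template-based deconvolution (not the paper's Bayesian wavelet-frame model), the sketch below fits a 1D spectrum as a non-negative combination of metabolite template spectra plus a smooth baseline using non-negative least squares. The Lorentzian templates, the metabolite names, and the NNLS estimator are assumptions for the example; the paper instead places priors on these quantities and samples them by MCMC.

```python
# Illustrative template-based deconvolution of a 1D spectrum (not the paper's model).
import numpy as np
from scipy.optimize import nnls

def lorentzian(ppm, centre, width):
    return width**2 / ((ppm - centre) ** 2 + width**2)

rng = np.random.default_rng(1)
ppm = np.linspace(0.5, 4.5, 2000)

# Hypothetical metabolite templates: each is a sum of Lorentzian peaks.
templates = np.column_stack([
    lorentzian(ppm, 1.33, 0.01) + lorentzian(ppm, 4.11, 0.01),   # "lactate"-like
    lorentzian(ppm, 3.03, 0.01),                                  # "creatinine"-like
    lorentzian(ppm, 2.54, 0.01) + lorentzian(ppm, 2.67, 0.01),    # "citrate"-like
])

true_conc = np.array([3.0, 1.2, 0.7])
baseline = 0.05 + 0.02 * np.sin(ppm)                 # slowly varying baseline
spectrum = templates @ true_conc + baseline + rng.normal(scale=0.01, size=ppm.size)

# Augment the design with smooth baseline columns and solve by NNLS.
design = np.column_stack([templates, np.ones_like(ppm), np.sin(ppm), np.cos(ppm)])
coef, _ = nnls(design, spectrum)
print("estimated concentrations:", coef[:3])
```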
About the certification of railway rails
When the compliance of a rail steel with the European Code has to be verified, the need to carry out the experimental activities in accordance with several testing standards forces the operator both to solve the problems related to the choice of a suitable testing practice and, often, to interpret the standards' guidelines subjectively. This does not facilitate the comparability and/or the quality of the results produced by different laboratories. With reference to a series of fatigue, fracture toughness and fatigue crack growth tests carried out by the authors on specimens extracted from rails, the main shortcomings of the current standards, related both to the choice of the control parameters and to the testing procedures, are pointed out. Regarding crack growth testing, several procedures to compute the crack growth rates to be compared with the limits prescribed by the Code are proposed. These procedures have been applied to a data set produced during the aforementioned testing activity in order to highlight, by comparing their results, the significant differences in the crack growth rate estimates and the magnitude of the errors that can arise from the shortcomings of the standard practices currently adopted.
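As a concrete example of one standard way to compute crack growth rates from raw crack-length-versus-cycles data, the sketch below applies the secant method in the style of ASTM E647. It is offered only to illustrate the kind of procedure being compared, not as the authors' specific proposals, and the sample data are invented.

```python
# Secant-method estimate of fatigue crack growth rate da/dN (ASTM E647 style).
# The data points are invented for illustration.
import numpy as np

# Crack length a (mm) recorded at cycle counts N.
N = np.array([0, 10_000, 20_000, 30_000, 40_000, 50_000], dtype=float)
a = np.array([10.0, 10.6, 11.4, 12.5, 14.0, 16.1])

# Secant method: growth rate between consecutive readings, assigned to the
# mean crack length of each interval.
da_dN = np.diff(a) / np.diff(N)           # mm/cycle
a_mid = 0.5 * (a[:-1] + a[1:])            # mm

for ai, rate in zip(a_mid, da_dN):
    print(f"a = {ai:5.2f} mm   da/dN = {rate:.2e} mm/cycle")
```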
On the Possibility of Measuring the Gravitomagnetic Clock Effect in an Earth Space-Based Experiment
In this paper the effect of the post-Newtonian gravitomagnetic force on the mean longitudes of a pair of counter-rotating Earth artificial satellites following almost identical circular equatorial orbits is investigated, and the possibility of measuring it is examined. The observable is the difference of the times required to pass from 0 to 2π in mean longitude for the two senses of motion. Such a gravitomagnetic time shift, which is independent of the orbital parameters of the satellites, amounts to 5 s for Earth; it is cumulative and should be measured after a sufficiently high number of revolutions. The major limiting factors are the unavoidable imperfect cancellation of the Keplerian periods, which yields a constraint of 10 cm on the knowledge of the difference between the semimajor axes of the satellites, and the difference of the inclinations of the orbital planes which, for , should be less than . A pair of spacecraft endowed with a sophisticated intersatellite tracking apparatus and drag-free control down to the 10 cm s Hz level might allow the stringent requirements posed by such a mission to be met.
Comment: LaTeX2e, 22 pages, no tables, 1 figure, 38 references. Final version accepted for publication in Classical and Quantum Gravity.
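For orientation, the gravitomagnetic clock effect for counter-revolving satellites on identical circular equatorial orbits is usually quoted, to lowest order, as ΔT ≈ 4πJ/(Mc²), independent of the orbital radius. The back-of-the-envelope sketch below evaluates this standard expression with nominal Earth parameters; it is a generic textbook estimate, not a reproduction of the paper's analysis, and the constants are assumed nominal values.

```python
# Back-of-the-envelope estimate of the gravitomagnetic clock effect
# dT ~ 4*pi*J / (M*c^2) for counter-revolving circular equatorial orbits.
# Nominal Earth parameters are assumed; this is a generic estimate, not the
# paper's detailed error analysis.
import math

M = 5.972e24        # Earth mass, kg
J = 5.86e33         # Earth angular momentum, kg m^2 / s (nominal)
c = 2.998e8         # speed of light, m/s

delta_T = 4 * math.pi * J / (M * c**2)
print(f"Gravitomagnetic time shift per revolution pair: {delta_T:.2e} s")
# On the order of 1e-7 s, cumulative over many revolutions.
```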
Bandgap widening and resonator mass reduction through wave locking
Elastic metamaterials made of locally resonant arrays have been developed as effective ways to create band gaps for elastic or acoustic travelling waves. They work by implementing stationary states in the structure that localise and partially reflect waves. A different, simpler way of obtaining band gaps is to use phononic crystals, where the band gaps arise from the periodic reflection and phase cancellation of travelling waves. In this work a different metamaterial structure, which generates band gaps by coupling two contra-propagating modes, is reported. This metamaterial, as will be shown numerically and experimentally, generates larger band gaps with lower added mass, providing benefits for lighter structures.
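To illustrate the classic locally resonant band-gap mechanism mentioned as background (not the wave-locking design proposed in the paper), the sketch below computes the dispersion relation of a 1D mass-in-mass chain and flags the frequency range where no real wavenumber exists. All parameter values are assumptions for the example.

```python
# Dispersion of a classic 1D locally resonant (mass-in-mass) chain, shown only
# to illustrate the resonant band-gap mechanism mentioned above; it is NOT the
# wave-locking metamaterial of the paper. All parameters are assumed.
import numpy as np

m1, m2 = 1.0, 0.5      # outer and resonator masses (kg, assumed)
k1, k2 = 1.0e3, 2.0e2  # chain and resonator stiffnesses (N/m, assumed)
w2 = np.sqrt(k2 / m2)  # resonator natural frequency (rad/s)

omega = np.linspace(1e-3, 60.0, 5000)
m_eff = m1 + m2 * w2**2 / (w2**2 - omega**2)       # frequency-dependent mass
cos_qa = 1.0 - omega**2 * m_eff / (2.0 * k1)       # dispersion: cos(qa)

in_gap = np.abs(cos_qa) > 1.0                      # no real wavenumber -> band gap
gap_freqs = omega[in_gap]
if gap_freqs.size:
    print(f"band gap roughly from {gap_freqs.min():.1f} to {gap_freqs.max():.1f} rad/s")
```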
A three-parameter model for fatigue crack growth data analysis
A three-parameter model for the interpolation of fatigue crack propagation data is proposed. It has been validated on a literature data set obtained by testing 180 M(T) specimens under three different loading levels. In particular, it is shown that the results of the analysis carried out by means of the proposed model are smoother and clearer than those obtainable using other methods or models. In addition, the parameters of the model have been computed and some of their peculiarities have been pointed out.
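Purely as an illustration of fitting a three-parameter crack-growth law (the specific model proposed in the paper is not reproduced here), the sketch below fits a threshold-modified Paris-type relation, da/dN = C(ΔK − ΔK_th)^m, whose three parameters are C, m and ΔK_th, to synthetic data with scipy.optimize.curve_fit. Both the functional form and the data are assumptions for the example.

```python
# Fit of a generic three-parameter crack-growth law da/dN = C*(dK - dK_th)^m,
# done in log space for numerical stability. The functional form and the
# synthetic data are illustrative assumptions, not the paper's model.
import numpy as np
from scipy.optimize import curve_fit

def log10_growth_law(dK, log10_C, m, dK_th):
    # log10(da/dN) = log10(C) + m * log10(dK - dK_th)
    return log10_C + m * np.log10(np.clip(dK - dK_th, 1e-9, None))

rng = np.random.default_rng(3)
dK = np.linspace(8.0, 40.0, 60)                                  # MPa*sqrt(m)
true_log10_C, true_m, true_dK_th = -11.0, 3.0, 5.0
log10_rate = log10_growth_law(dK, true_log10_C, true_m, true_dK_th) \
             + rng.normal(scale=0.05, size=dK.size)              # noisy "data"

popt, _ = curve_fit(log10_growth_law, dK, log10_rate, p0=(-10.0, 2.5, 4.0))
print("estimated (log10 C, m, dK_th):", popt)
```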
Bayesian clustering of multiple zero-inflated outcomes
Several applications involving counts present a large proportion of zeros (excess-of-zeros data). A popular model for such data is the hurdle model, which explicitly models the probability of a zero count, while assuming a sampling distribution on the positive integers. We consider data from multiple count processes. In this context, it is of interest to study the patterns of counts and cluster the subjects accordingly. We introduce a novel Bayesian approach to cluster multiple, possibly related, zero-inflated processes. We propose a joint model for zero-inflated counts, specifying a hurdle model for each process with a shifted Negative Binomial sampling distribution. Conditionally on the model parameters, the different processes are assumed independent, leading to a substantial reduction in the number of parameters as compared with traditional multivariate approaches. The subject-specific probabilities of zero-inflation and the parameters of the sampling distribution are flexibly modelled via an enriched finite mixture with a random number of components. This induces a two-level clustering of the subjects based on the zero/non-zero patterns (outer clustering) and on the sampling distribution (inner clustering). Posterior inference is performed through tailored Markov chain Monte Carlo schemes. We demonstrate the proposed approach on an application involving the use of the messaging service WhatsApp. This article is part of the theme issue 'Bayesian inference: challenges, perspectives, and prospects'.
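As a small illustration of the building block described above, a hurdle model with a shifted Negative Binomial on the positive counts, the sketch below evaluates that log-likelihood for a single process using scipy.stats.nbinom. The parameterisation and example values are assumptions, and the enriched-mixture clustering and MCMC of the paper are not reproduced.

```python
# Log-likelihood of a hurdle model with a shifted Negative Binomial on {1, 2, ...}.
# Parameterisation and example values are assumptions for illustration only.
import numpy as np
from scipy.stats import nbinom

def hurdle_shifted_nb_loglik(y, pi_zero, r, p):
    """y: counts; pi_zero: P(Y = 0); (r, p): Negative Binomial parameters.

    Positive counts follow a Negative Binomial shifted to start at 1,
    i.e. P(Y = y) = (1 - pi_zero) * NB(y - 1; r, p) for y >= 1.
    """
    y = np.asarray(y)
    ll = np.where(
        y == 0,
        np.log(pi_zero),
        np.log1p(-pi_zero) + nbinom.logpmf(np.maximum(y - 1, 0), r, p),
    )
    return ll.sum()

counts = np.array([0, 0, 3, 1, 0, 7, 2, 0, 1, 12])
print(hurdle_shifted_nb_loglik(counts, pi_zero=0.4, r=1.5, p=0.35))
```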