
    Conditional Transformation Models

    The ultimate goal of regression analysis is to obtain information about the conditional distribution of a response given a set of explanatory variables. This goal is, however, seldom achieved because most established regression models only estimate the conditional mean as a function of the explanatory variables and assume that higher moments are not affected by the regressors. The underlying reason for such a restriction is the assumption of additivity of signal and noise. We propose to relax this common assumption in the framework of transformation models. The novel class of semiparametric regression models proposed herein allows transformation functions to depend on explanatory variables. These transformation functions are estimated by regularised optimisation of scoring rules for probabilistic forecasts, e.g. the continuous ranked probability score. The corresponding estimated conditional distribution functions are consistent. Conditional transformation models are potentially useful for describing possible heteroscedasticity, comparing spatially varying distributions, identifying extreme events, deriving prediction intervals and selecting variables beyond mean regression effects. An empirical investigation based on a heteroscedastic varying-coefficient simulation model demonstrates that semiparametric estimation of conditional distribution functions can be more beneficial than kernel-based non-parametric approaches or parametric generalised additive models for location, scale and shape.
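The continuous ranked probability score mentioned in this abstract can be estimated directly from forecast samples. Below is a minimal sketch of the standard sample-based (kernel) form, CRPS(F, y) = E|X - y| - 0.5 E|X - X'|, applied to a hypothetical Gaussian forecast; this illustrates the scoring rule only, not the paper's regularised estimation procedure.

```python
import numpy as np

def crps_ensemble(samples, obs):
    """Sample-based estimate of the continuous ranked probability score.

    Uses the kernel representation CRPS(F, y) = E|X - y| - 0.5 * E|X - X'|,
    where X, X' are independent draws from the forecast distribution F.
    Lower scores indicate better probabilistic forecasts.
    """
    samples = np.asarray(samples, dtype=float)
    term1 = np.mean(np.abs(samples - obs))
    term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))
    return term1 - term2

# Hypothetical forecast: 2000 draws from N(0, 1)
rng = np.random.default_rng(0)
draws = rng.normal(0.0, 1.0, size=2000)

score_good = crps_ensemble(draws, 0.0)  # observation near the forecast centre
score_bad = crps_ensemble(draws, 3.0)   # observation in the forecast tail
```

A forecast centred on the observation scores much better (lower) than one whose observation lands in its tail, which is the property that makes the CRPS usable as an optimisation target.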

    A reference relative time-scale as an alternative to chronological age for cohorts with long follow-up

    Background: Epidemiologists have debated the appropriate time-scale for cohort survival studies, chronological age and time-on-study being two such time-scales. Importantly, assessment of risk factors may depend on the choice of time-scale. Recently, chronological or attained age has gained support, but a case can be made for a ‘reference relative time-scale’ as an alternative which circumvents difficulties that arise with this and other scales. The reference relative time of an individual participant is the integral of a reference population hazard function between the individual's time of entry and time of exit. The objective here is to describe the reference relative time-scale, illustrate its use, make comparison with attained age by simulation and explain its relationship to modern and traditional epidemiologic methods. Results: A comparison was made between two models: a stratified Cox model with age as the time-scale versus an un-stratified Cox model using the reference relative time-scale. The illustrative comparison used a UK cohort of cotton workers, with differing ages at entry to the study, with accrual over a time period and with long follow-up. Additionally, exponential and Weibull models were fitted, since the reference relative time-scale analysis need not be restricted to the Cox model. A simulation study showed that analysis using the reference relative time-scale and analysis using chronological age had very similar power to detect a significant risk factor, and both were equally unbiased. Further, the analysis using the reference relative time-scale supported fully-parametric survival modelling and allowed percentile predictions and mortality curves to be constructed. Conclusions: The reference relative time-scale was a viable alternative to chronological age, led to simplification of the modelling process and possessed the features of a good time-scale as defined in reliability theory. The reference relative time-scale has several interpretations and provides a unifying concept that links contemporary approaches in survival and reliability analysis to the traditional epidemiologic methods of Poisson regression and standardised mortality ratios. The community of practitioners has not previously made this connection.
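The abstract defines the reference relative time of a participant as the integral of a reference population hazard between entry and exit. A minimal numerical sketch of that definition follows, using a hypothetical Gompertz reference hazard h(t) = a exp(bt) (the hazard and parameter values are illustrative assumptions, not from the paper):

```python
import math

def reference_relative_time(h_ref, t_entry, t_exit, n=10_000):
    """Reference relative time: integral of the reference population hazard
    h_ref between an individual's entry and exit times (trapezoidal rule)."""
    dt = (t_exit - t_entry) / n
    total = 0.0
    for i in range(n):
        a = t_entry + i * dt
        total += 0.5 * (h_ref(a) + h_ref(a + dt)) * dt
    return total

# Hypothetical Gompertz reference hazard: h(t) = a * exp(b * t)
a, b = 1e-4, 0.1
rrt = reference_relative_time(lambda t: a * math.exp(b * t), 40.0, 60.0)
```

For this hazard the integral has the closed form (a/b)(e^(b*60) - e^(b*40)), so the numerical result can be checked directly; in practice the reference hazard would come from population life tables rather than a parametric formula.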

    Normalizing Weak Boson Pair Production at the Large Hadron Collider

    The production of two weak bosons at the Large Hadron Collider will be one of the most important sources of Standard Model backgrounds for final states with multiple leptons. In this paper we consider several quantities that can help normalize the production of weak boson pairs. Ratios of inclusive cross-sections for production of two weak bosons and Drell-Yan are investigated and the corresponding theoretical errors are evaluated. The possibility of predicting the jet veto survival probability of VV production from Drell-Yan data is also considered. Overall, the theoretical errors on all quantities remain within the 5-20% range. The dependence of these quantities on the center-of-mass energy of the proton-proton collision is also studied.
    Comment: 11 pages; added references, minor text revisions, version to appear in Phys. Rev.
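The normalization strategy the abstract describes amounts to combining a theoretically computed cross-section ratio with a measured Drell-Yan yield. A minimal sketch of that arithmetic, with all numbers being hypothetical placeholders rather than values from the paper:

```python
# All numbers below are hypothetical placeholders, not results from the paper.
r_theory = 0.012        # theory prediction for the ratio sigma_VV / sigma_DY
r_rel_err = 0.07        # relative theoretical error on that ratio
sigma_dy_meas = 1.0e6   # measured Drell-Yan cross-section (arbitrary units)

# Data-driven VV normalization: measured DY rate scaled by the theory ratio.
sigma_vv_pred = r_theory * sigma_dy_meas
# The theory error on the ratio propagates directly to the prediction.
sigma_vv_err = sigma_vv_pred * r_rel_err
```

The appeal of the method is that experimental systematics common to both processes cancel in the ratio, leaving the (smaller) theoretical error on the ratio as the dominant uncertainty.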

    The contact process in heterogeneous and weakly-disordered systems

    The critical behavior of the contact process (CP) in heterogeneous periodic and weakly-disordered environments is investigated using the supercritical series expansion and Monte Carlo (MC) simulations. Phase-separation lines and critical exponents β (from series expansion) and η (from MC simulations) are calculated. A general analytical expression for the locus of critical points is suggested for the weak-disorder limit and confirmed by the series expansion analysis and the MC simulations. Our results for the critical exponents show that the CP in heterogeneous environments remains in the directed percolation (DP) universality class, while for environments with quenched disorder, the data are compatible with the scenario of continuously changing critical exponents.
    Comment: 5 pages, 3 figures
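The contact process itself is straightforward to simulate by Monte Carlo. Below is a minimal sketch of the homogeneous 1D version (active sites recover at rate 1 and infect a random nearest neighbour at rate λ); the parameters are illustrative, and the heterogeneous/disordered environments and series-expansion analysis of the paper are not reproduced:

```python
import random

def contact_process_1d(L=100, lam=8.0, steps=20_000, seed=1):
    """Event-driven Monte Carlo simulation of the 1D contact process.

    Each event picks a random active site; with probability 1/(1+lam) it
    recovers, otherwise it attempts to infect a random nearest neighbour
    (a no-op if the neighbour is already active). Returns the final
    density of active sites; the all-inactive state is absorbing.
    """
    rng = random.Random(seed)
    active = set(range(L))  # start from the fully active lattice
    for _ in range(steps):
        if not active:
            break  # absorbing state reached: the epidemic has died out
        site = rng.choice(tuple(active))
        if rng.random() < 1.0 / (1.0 + lam):
            active.remove(site)  # recovery at rate 1
        else:
            nb = (site + rng.choice((-1, 1))) % L
            active.add(nb)       # infection at rate lam
    return len(active) / L

density_super = contact_process_1d(lam=8.0)  # well above the critical point
density_sub = contact_process_1d(lam=0.5)    # well below: dies out
```

Far above the 1D critical point (λ_c ≈ 3.3) the process settles into a high-density active phase, while far below it the absorbing state is reached quickly; locating the transition precisely is what the series expansions and finite-size MC analysis in the paper are for.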