
    Improvements in prevalence trend fitting and incidence estimation in EPP 2013

    OBJECTIVE: Describe modifications to the latest version of the Joint United Nations Programme on HIV/AIDS (UNAIDS) Estimation and Projection Package component of Spectrum (EPP 2013) to improve prevalence fitting and incidence trend estimation in national epidemics and global estimates of HIV burden. METHODS: Key changes made under the guidance of the UNAIDS Reference Group on Estimates, Modelling and Projections include: availability of a range of incidence calculation models and guidance for selecting a model; a shift to reporting the Bayesian median instead of the maximum likelihood estimate; procedures for comparison and validation against reported HIV and AIDS data; incorporation of national surveys as an integral part of the fitting and calibration procedure, allowing survey trends to inform the fit; improved antenatal clinic calibration procedures in countries without surveys; adjustment of national antiretroviral therapy reports used in the fitting to include only those aged 15–49 years; better estimates of mortality among people who inject drugs; and enhancements to speed up the fitting process. RESULTS: The revised models in EPP 2013 allow closer fits to observed prevalence trend data and reflect an improving understanding of HIV epidemics and the associated data. CONCLUSION: Spectrum and EPP continue to adapt to make better use of existing data sources, to incorporate new sources of information in their fitting and validation procedures, and to correct for quantifiable biases in inputs as these are identified and understood. These adaptations provide countries with better calibrated estimates of incidence and prevalence, which improve epidemic understanding and provide a solid base for program and policy planning.
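The shift from reporting the maximum likelihood estimate to the Bayesian median can be illustrated with a few lines of code. This is a minimal sketch, not EPP's actual implementation: the posterior draws below are synthetic stand-ins, and the 95% interval is only one possible choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior draws of an HIV prevalence trajectory (synthetic
# stand-ins for EPP's Bayesian output): 1000 sampled curves over 20 years.
years = np.arange(2000, 2020)
draws = (0.05 + 0.01 * np.sin((years - 2000) / 5.0)
         + rng.normal(0.0, 0.005, size=(1000, years.size)))

# Report the pointwise posterior median rather than a single
# maximum-likelihood trajectory, together with a credible interval.
median = np.median(draws, axis=0)
lo, hi = np.percentile(draws, [2.5, 97.5], axis=0)
```

The median trajectory is accompanied by the (lo, hi) band, so uncertainty is carried through to downstream burden estimates rather than collapsed to one curve.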

    Spectrum radial velocity analyser (SERVAL). High-precision radial velocities and two alternative spectral indicators

    Context: The CARMENES survey is a high-precision radial velocity (RV) programme that aims to detect Earth-like planets orbiting low-mass stars. Aims: We develop least-squares fitting algorithms to derive the RVs and additional spectral diagnostics implemented in the SpEctrum Radial Velocity Analyser (SERVAL), a publicly available python code. Methods: We measured the RVs using high signal-to-noise templates created by coadding all available spectra of each star. We define the chromatic index as the RV gradient as a function of wavelength, computed from the RVs measured in the individual echelle orders. Additionally, we computed the differential line width by correlating the fit residuals with the second derivative of the template to track variations in the stellar line width. Results: Using HARPS data, our SERVAL code achieves an RV precision at the level of 1 m/s. Applying the chromatic index to CARMENES data of the active star YZ CMi, we identify apparent RV variations induced by stellar activity. The differential line width is found to be an alternative indicator to the commonly used full width at half maximum. Conclusions: We find that at the red optical wavelengths (700–900 nm) covered by the visual channel of CARMENES, the chromatic index is an excellent tool to investigate stellar active regions and to identify, and perhaps even correct for, activity-induced RV variations. Comment: 13 pages, 13 figures. A&A in press. Code is available at https://github.com/mzechmeister/serva
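The chromatic index described above is the RV gradient with wavelength. A minimal sketch of the idea, assuming synthetic per-order RVs with an injected chromatic slope (the order wavelengths and RV values are illustrative, not CARMENES data; the real code at the linked repository does a weighted fit):

```python
import numpy as np

# Hypothetical per-order measurements: central wavelength of each echelle
# order (nm) and the RV measured in that order (m/s), with an injected
# chromatic slope of 3 m/s per unit of ln(wavelength) on top of a 12 m/s offset.
wave = np.linspace(520, 960, 40)
rv_order = 12.0 + 3.0 * np.log(wave / 700.0)

# The chromatic index is the slope of a linear fit of per-order RV against
# log(wavelength); the intercept is the wavelength-averaged RV at 700 nm.
slope, intercept = np.polyfit(np.log(wave / 700.0), rv_order, 1)
```

A star whose apparent RV shift comes from a spot rather than a planet shows a nonzero slope, because spot-induced contrast (and hence the spurious RV) varies with wavelength while a Doppler shift does not.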

    Diesel Engine Emission Model Transient Cycle Validation

    A control-oriented, data-driven B-spline model for emitted NOx and soot was developed and validated for the 5-cylinder, 2.4-litre Volvo passenger car diesel engine in earlier work. This work extends the same methodology with some improvements to the model structure for more intuitive calibration, and the model is also developed for the new-generation 4-cylinder, 2-litre Volvo passenger car diesel engine. The earlier model was validated using steady-state engine measurements, and it was proposed that the model would also hold for transient engine operation. The hypothesis formulated is that a transient engine emission model can be envisioned as a sequence of multi-step steady-state engine operating points with minor deviations from the nominal engine operating conditions. This hypothesis is supported by literature that provides insight into transient operation. The idea is carried out in the current work using engine test cell measurements, validated for the NEDC as well as for a normal road drive cycle that exhibits more transient driving behaviour than the standard emission driving cycles. Nearly 4600 steady-state engine operating points, including nominal and deviant conditions, have been used in the development of the model. The ability of the data-driven approach to mimic the engine's emission generation characteristics during transient operation is analysed, and its superior performance in comparison to the Nominal model and the Regression model is demonstrated.
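The core technique, least-squares B-spline regression on steady-state map data, can be sketched as follows. This is an illustrative single-input sketch, not the authors' model: the load/NOx numbers, knot positions, and restriction to one operating variable are all assumptions (the paper's model is multivariate and control-oriented).

```python
import numpy as np

def bspline_basis(x, t, i, k):
    """Cox-de Boor recursion: i-th B-spline of degree k on knot vector t."""
    if k == 0:
        return ((t[i] <= x) & (x < t[i + 1])).astype(float)
    left = 0.0 if t[i + k] == t[i] else \
        (x - t[i]) / (t[i + k] - t[i]) * bspline_basis(x, t, i, k - 1)
    right = 0.0 if t[i + k + 1] == t[i + 1] else \
        (t[i + k + 1] - x) / (t[i + k + 1] - t[i + 1]) * bspline_basis(x, t, i + 1, k - 1)
    return left + right

# Hypothetical steady-state map: NOx versus engine load at one fixed speed.
load = np.linspace(10, 90, 200)
nox = 0.5 + 0.02 * (load / 10.0) ** 2   # smooth synthetic emission curve

# Cubic B-spline least-squares fit: build the design matrix and solve.
k = 3
t = np.r_[[10.0] * (k + 1), [30.0, 50.0, 70.0], [90.0 + 1e-9] * (k + 1)]
n_basis = len(t) - k - 1
B = np.column_stack([bspline_basis(load, t, i, k) for i in range(n_basis)])
coef, *_ = np.linalg.lstsq(B, nox, rcond=None)
pred = B @ coef
```

Because the basis is local, recalibrating one region of the operating map changes only the nearby coefficients, which is one reason B-spline structures suit intuitive calibration.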

    Boosting Functional Response Models for Location, Scale and Shape with an Application to Bacterial Competition

    We extend Generalized Additive Models for Location, Scale, and Shape (GAMLSS) to regression with functional response. This allows us to simultaneously model point-wise mean curves, variances and other distributional parameters of the response as functions of various scalar and functional covariates. In addition, the scope of distributions is extended beyond exponential families. The model is fitted via gradient boosting, which offers inherent model selection and is shown to be suitable for both complex model structures and highly auto-correlated response curves. This enables us to analyze bacterial growth in Escherichia coli in a complex interaction scenario, fruitfully extending usual growth models. Comment: bootstrap confidence interval type uncertainty bounds added; minor changes in formulation
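The fitting idea, boosting distributional parameters by following the negative gradient of the likelihood, can be illustrated on a scalar Gaussian toy problem. This is a minimal sketch with linear base learners and made-up data, not the authors' functional-response implementation; the step size and iteration count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y ~ N(mu(x), sigma^2) with mu = 1 + 2x, constant sigma = 0.5.
n = 2000
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)
X = np.column_stack([np.ones(n), x])

# Gradient boosting for location and scale: each iteration fits linear base
# learners to the negative gradients of the Gaussian NLL with respect to
# mu and eta = log(sigma), then takes a small step nu in both directions.
beta_mu = np.zeros(2)   # coefficients of the location predictor
beta_eta = np.zeros(2)  # coefficients of the log-scale predictor
nu = 0.1
for _ in range(1000):
    mu = X @ beta_mu
    sigma = np.exp(X @ beta_eta)
    u_mu = (y - mu) / sigma**2               # -dNLL/dmu
    u_eta = (y - mu) ** 2 / sigma**2 - 1.0   # -dNLL/deta
    beta_mu += nu * np.linalg.lstsq(X, u_mu, rcond=None)[0]
    beta_eta += nu * np.linalg.lstsq(X, u_eta, rcond=None)[0]
```

In the full method each distributional parameter has its own set of candidate base learners, and the component chosen at each step performs the implicit model selection mentioned above.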

    A Partially Linear Framework for Massive Heterogeneous Data

    We consider a partially linear framework for modelling massive heterogeneous data. The major goal is to extract common features across all sub-populations while exploring the heterogeneity of each sub-population. In particular, we propose an aggregation-type estimator for the commonality parameter that possesses the (non-asymptotic) minimax optimal bound and the asymptotic distribution as if there were no heterogeneity. This oracular result holds when the number of sub-populations does not grow too fast. A plug-in estimator for the heterogeneity parameter is further constructed, and shown to possess the asymptotic distribution as if the commonality information were available. We also test for heterogeneity among a large number of sub-populations. All of the above results require regularizing each sub-estimation as though it had the entire sample size. Our general theory applies to the divide-and-conquer approach that is often used to deal with massive homogeneous data. A technical by-product of this paper is the statistical inference for general kernel ridge regression. Thorough numerical results are also provided to back up our theory. Comment: 40 pages main text + 40 pages supplement. To appear in the Annals of Statistics
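The divide-and-conquer recipe with the paper's key twist, regularizing each sub-estimator as though it saw the entire sample, can be sketched for kernel ridge regression. Everything below is illustrative (the penalty level, kernel width, and toy data are assumptions, and the paper's framework additionally has a linear commonality component):

```python
import numpy as np

def krr_fit_predict(X, y, Xq, lam, gamma):
    """Gaussian-kernel ridge regression: solve (K + lam*I) alpha = y, predict at Xq."""
    K = np.exp(-gamma * (X[:, None] - X[None, :]) ** 2)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    Kq = np.exp(-gamma * (Xq[:, None] - X[None, :]) ** 2)
    return Kq @ alpha

rng = np.random.default_rng(1)
n, m = 1200, 6                        # total sample size, number of subsets
X = rng.uniform(0, 1, n)
y = np.sin(2 * np.pi * X) + rng.normal(0, 0.3, n)
Xq = np.linspace(0.2, 0.8, 7)         # interior query points

# Each of the m sub-estimators uses a penalty tuned for the FULL sample size n
# (not n/m), i.e. deliberately undersmoothing each subset; the average then
# behaves like the full-sample estimator. lam here is an illustrative value.
lam_full = 0.01
preds = [krr_fit_predict(X[i::m], y[i::m], Xq, lam_full, gamma=50.0)
         for i in range(m)]
f_hat = np.mean(preds, axis=0)
```

Averaging cancels the inflated variance of the undersmoothed sub-fits while keeping their small bias, which is the intuition behind the oracle result quoted above.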

    Examining Simple Joint Macroeconomic and Term-Structure Models: A Practitioner's Perspective

    The primary objective of this paper is to compare a variety of joint models of the term structure of interest rates and the macroeconomy. To this end, we consider six alternative approaches. Three of these models follow from the work of Diebold and Li (2003), with a generalization in Bolder (2006). The fourth model is a regression-based approach motivated entirely by empirical considerations. The fifth model follows from the seminal work of Ang and Piazzesi (2003), who suggest a joint macro-finance model in a discrete-time affine setting. The final model, which we term an observed-affine model, represents an adjustment to the Ang-Piazzesi model that essentially relaxes restrictions on the state-variable dynamics by making them observable. The observed-affine model is similar in spirit to work by Collin-Dufresne, Goldstein, and Jones (2005) and Cochrane and Piazzesi (2006). Using monthly Canadian data from 1973 to 2005, we compare each of these models in terms of their out-of-sample ability to forecast the transition density of zero-coupon rates. We also examine a simple approach aimed at permitting a subset of the parameters in the non-affine models to vary over time. We find, similarly to Bolder (2006), that the approaches motivated by Diebold and Li (2003) provide the most appealing modelling alternative across our different comparison criteria.
    Keywords: Econometric and statistical methods; Financial Markets; Interest rates
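The Diebold-Li approach underlying the favoured models fits the Nelson-Siegel curve by ordinary least squares on fixed factor loadings. A minimal sketch with hypothetical yields (the maturities and factor values are made up; λ = 0.0609 is the decay value Diebold and Li use for maturities measured in months):

```python
import numpy as np

def ns_loadings(tau, lam=0.0609):
    """Nelson-Siegel loadings at maturities tau (months): level, slope, curvature."""
    x = lam * tau
    slope = (1 - np.exp(-x)) / x
    return np.column_stack([np.ones_like(tau), slope, slope - np.exp(-x)])

# Hypothetical zero-coupon yields (percent) generated from known factors.
tau = np.array([3, 6, 12, 24, 36, 60, 84, 120], dtype=float)
true_beta = np.array([5.0, -1.5, 1.0])   # level, slope, curvature factors
y = ns_loadings(tau) @ true_beta

# Diebold-Li: estimate the three factors by OLS at each date; a time-series
# model for the factors (e.g. an AR(1)) then delivers yield-curve forecasts.
beta_hat, *_ = np.linalg.lstsq(ns_loadings(tau), y, rcond=None)
```

Repeating this cross-sectional regression month by month turns the yield-curve panel into three factor time series, which is what makes joint modelling with macro variables tractable.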