
    Error estimation and reduction with cross correlations

    Besides the well-known effect of autocorrelations in time series of Monte Carlo simulation data, which result from the underlying Markov process, using the same data pool to compute various estimates entails additional cross correlations. If not properly taken into account, this effect leads to systematically wrong error estimates for combined quantities. Such problems can be avoided with a straightforward data-analysis recipe employing the jackknife or similar resampling techniques. In addition, a covariance analysis allows for the formulation of optimal estimators with often significantly reduced variance compared to more conventional averages.
    Comment: 16 pages, RevTEX4, 4 figures, 6 tables, published version
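The jackknife recipe described above can be sketched in a few lines. This is a minimal illustration with a hypothetical combined quantity (a ratio of two averages computed from the same data pool), not the paper's actual analysis: the leave-one-out replicates propagate the cross correlation between numerator and denominator automatically.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for Monte Carlo time-series data (i.i.d. here for simplicity;
# real simulation data would first need autocorrelations binned away)
x = rng.normal(loc=1.0, scale=0.5, size=1000)

def estimator(data):
    # A combined quantity built from two estimates that share the data pool:
    # naive error propagation would ignore their cross correlation
    return np.mean(data**2) / np.mean(data) ** 2

n = len(x)
# Leave-one-out jackknife replicates of the combined estimator
reps = np.array([estimator(np.delete(x, i)) for i in range(n)])
est = estimator(x)
# Jackknife variance: cross correlations enter through the replicates,
# so no explicit covariance bookkeeping is needed
var = (n - 1) / n * np.sum((reps - reps.mean()) ** 2)
err = np.sqrt(var)
```

The same pattern extends to blocked (delete-a-bin) jackknife once the data are pre-binned to remove autocorrelations.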

    ANN-based energy reconstruction procedure for TACTIC gamma-ray telescope and its comparison with other conventional methods

    The energy estimation procedures employed by different groups for determining the energy of the primary γ-ray with a single atmospheric Cherenkov imaging telescope include polynomial fitting in SIZE and DISTANCE, general least-squares fitting, and look-up-table-based interpolation. A novel energy reconstruction procedure, based on an Artificial Neural Network (ANN), has been developed for the TACTIC atmospheric Cherenkov imaging telescope. The procedure uses a 3:30:1 ANN configuration with the resilient backpropagation algorithm to estimate the energy of a γ-ray-like event on the basis of its image SIZE, DISTANCE and zenith angle. The new ANN-based energy reconstruction method, apart from yielding an energy resolution of ∼26%, which is comparable to that of other single imaging telescopes, has the added advantage that it accounts for the zenith-angle dependence as well. Details of the ANN-based energy estimation procedure, along with its performance relative to other conventional energy reconstruction methods, are presented in the paper, and the results indicate that among all the methods considered in this work, the ANN method yields the best results. The performance of the ANN-based energy reconstruction has also been validated by determining the energy spectrum of the Crab Nebula in the energy range 1-16 TeV, as measured by the TACTIC telescope.
    Comment: 23 pages, 9 figures. Accepted for publication in NIM
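The 3:30:1 architecture described above can be sketched with a generic multilayer perceptron. Everything here is hypothetical: the synthetic SIZE/DISTANCE/zenith data and the assumed power-law-like energy relation are placeholders, and scikit-learn does not implement resilient backpropagation, so the Adam solver stands in for the paper's training algorithm.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 2000
# Hypothetical image parameters: SIZE (photoelectrons), DISTANCE (deg),
# zenith angle (deg) -- purely synthetic stand-ins
size = rng.uniform(50, 5000, n)
dist = rng.uniform(0.2, 1.5, n)
zen = rng.uniform(0, 45, n)
# Assumed toy relation between image parameters and energy (TeV)
energy = 0.01 * size**0.8 * (1 + 0.1 * zen / 45) + rng.normal(0, 0.5, n)

X = StandardScaler().fit_transform(np.column_stack([size, dist, zen]))

# 3 inputs -> 30 hidden units -> 1 output, mirroring the 3:30:1 layout;
# Adam replaces resilient backpropagation, which sklearn does not provide
net = MLPRegressor(hidden_layer_sizes=(30,), solver="adam",
                   max_iter=2000, random_state=0)
net.fit(X[:1500], energy[:1500])
pred = net.predict(X[1500:])
```

Feeding the zenith angle as a third input is what lets a single network cover the zenith-angle dependence that per-angle look-up tables handle separately.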

    Analysis of judgmental adjustments in the presence of promotions

    Sales forecasting is increasingly complex due to many factors, such as shorter product life cycles, more competitive markets and aggressive marketing. Often, forecasts are produced using a Forecasting Support System that integrates univariate statistical forecasts with judgment from experts in the organization. Managers add information to the forecast, such as future promotions, potentially improving accuracy. Despite the importance of judgment and promotions, the literature devoted to studying their combined effect on forecasting performance is scarce. We analyze the accuracy of managerial adjustments during promotional periods, based on weekly data from a manufacturing company. Intervention analysis is used to establish whether judgmental adjustments can be replaced by multivariate statistical models when responding to promotional information. We show that judgmental adjustments can enhance baseline forecasts during promotions, but not systematically. Transfer function models based on past promotion information achieved lower overall forecasting errors. Finally, a hybrid model illustrates that human experts still added value to the transfer function models.

    Estimation of a regression spline sample selection model

    It is often the case that an outcome of interest is observed only for a restricted, non-randomly selected sample of the population. In such a situation, standard statistical analysis yields biased results. This issue can be addressed using sample selection models, which are based on the estimation of two equations: a binary selection equation, determining whether a particular statistical unit is observed, and an outcome equation of interest. Classic sample selection models assume a priori that continuous regressors have a pre-specified linear or non-linear relationship to the outcome, which can lead to erroneous conclusions. For continuous responses, methods in which covariate effects are modeled flexibly have been previously proposed, the most recent being based on a Bayesian Markov chain Monte Carlo approach. A frequentist counterpart, which has the advantage of being computationally fast, is introduced. The proposed algorithm is based on the penalized likelihood estimation framework. The construction of confidence intervals is also discussed. The empirical properties of the existing and proposed methods are studied through a simulation study. The approaches are finally illustrated by analyzing data from the RAND Health Insurance Experiment on annual health expenditures.

    A new approach to hierarchical data analysis: Targeted maximum likelihood estimation for the causal effect of a cluster-level exposure

    We often seek to estimate the impact of an exposure naturally occurring or randomly assigned at the cluster level. For example, the literature on neighborhood determinants of health continues to grow. Likewise, community randomized trials are applied to learn about real-world implementation, sustainability, and population effects of interventions with proven individual-level efficacy. In these settings, individual-level outcomes are correlated due to shared cluster-level factors, including the exposure, as well as social or biological interactions between individuals. To flexibly and efficiently estimate the effect of a cluster-level exposure, we present two targeted maximum likelihood estimators (TMLEs). The first TMLE is developed under a non-parametric causal model, which allows for arbitrary interactions between individuals within a cluster. These interactions include direct transmission of the outcome (i.e., contagion) and influence of one individual's covariates on another's outcome (i.e., covariate interference). The second TMLE is developed under a causal sub-model assuming the cluster-level and individual-specific covariates are sufficient to control for confounding. Simulations compare the alternative estimators and illustrate the potential gains from pairing individual-level risk factors and outcomes during estimation, while avoiding unwarranted assumptions. Our results suggest that estimation under the sub-model can result in bias and misleading inference in an observational setting. Incorporating working assumptions during estimation is more robust than assuming they hold in the underlying causal model. We illustrate our approach with an application to HIV prevention and treatment.
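For readers unfamiliar with TMLE, the basic individual-level, point-treatment version can be sketched as follows. This is emphatically not the paper's cluster-level estimators: the data-generating process, the logistic initial fits and the single clever covariate are all simplifying assumptions for illustration, showing only the initial-fit / propensity / targeting-step structure that the cluster TMLEs generalize.

```python
import numpy as np
from scipy.special import expit, logit
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 5000
W = rng.normal(size=n)                        # single confounder
A = rng.binomial(1, expit(0.5 * W))           # binary exposure
Y = rng.binomial(1, expit(-1 + A + 0.8 * W))  # binary outcome

# Step 1: initial outcome regression Q(A, W) and its counterfactual values
Q = LogisticRegression().fit(np.column_stack([A, W]), Y)
Q1 = Q.predict_proba(np.column_stack([np.ones(n), W]))[:, 1]
Q0 = Q.predict_proba(np.column_stack([np.zeros(n), W]))[:, 1]
QA = np.where(A == 1, Q1, Q0)

# Step 2: propensity score g(W) = P(A=1 | W)
g = LogisticRegression().fit(W.reshape(-1, 1), A)\
                        .predict_proba(W.reshape(-1, 1))[:, 1]

# Step 3: targeting step -- fluctuate the initial fit along the
# clever covariate H by a one-dimensional logistic regression,
# solved here with a few Newton iterations
H = A / g - (1 - A) / (1 - g)
eps = 0.0
for _ in range(25):
    p = expit(logit(QA) + eps * H)
    score = np.sum(H * (Y - p))
    info = np.sum(H**2 * p * (1 - p))
    eps += score / info

# Updated counterfactual predictions (H at A=1 is 1/g, at A=0 is -1/(1-g))
Q1s = expit(logit(Q1) + eps / g)
Q0s = expit(logit(Q0) - eps / (1 - g))
ate = np.mean(Q1s - Q0s)  # targeted estimate of the average effect
```

The cluster-level estimators in the paper replace these individual-level fits with cluster-level (or hierarchically pooled) counterparts, which is where the non-parametric versus sub-model distinction discussed above enters.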