27 research outputs found

    Improving tree mortality models by accounting for environmental influences

    Tree-ring chronologies have been widely used in studies of tree mortality, where variables of recent growth act as an indicator of tree physiological vigour. Comparing the recent radial growth of live and dead trees thus allows estimating probabilities of tree mortality. Sampling of mature dead trees usually yields death-year distributions that may span years or decades. Recent growth of dead trees (prior to death) is then computed over a number of periods, whereas recent growth of live trees (prior to sampling) is computed over identical periods. Because the recent growth of live and dead trees is therefore computed for different calendar periods, external factors such as disturbance or climate may influence growth rates and, thus, mortality probability estimates. To counteract this problem, we propose truncating live-growth series to obtain similar frequency distributions of the "last year of growth" for the live and dead populations. In this paper, we use different growth scenarios from several tree species, from several geographic sources, and from trees with different growth patterns to evaluate the impact of truncating on predictor variables and their selection in logistic regression analysis. We also assess the ability of the resulting models to accurately predict the status of trees through internal and external validation. Our results suggest that truncating live-growth series helps decrease the influence of external factors on growth comparisons. By doing so, it reinforces the growth-vigour link of the mortality model and enhances the model's accuracy as well as its general applicability. Hence, if model parameters are to be integrated into simulation models of greater geographical extent, truncating may be used to increase model robustness.
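    The truncation step described in the abstract can be sketched as follows: each live tree's ring-width series is cut off at a "last year of growth" drawn from the dead trees' death-year distribution, so that recent-growth windows for both populations cover comparable calendar periods. This is an illustrative sketch, not the authors' implementation; the function names, data layout (one `{year: ring_width}` dict per tree), and the simple resampling scheme are assumptions.

    ```python
    import numpy as np

    def truncate_live_series(live_series, death_years, seed=0):
        """Truncate each live tree's series at a cutoff year sampled from the
        dead trees' death-year distribution (assumed data layout: one
        {year: ring_width} dict per tree). Illustrative sketch only."""
        rng = np.random.default_rng(seed)
        truncated = []
        for series in live_series:
            cutoff = rng.choice(death_years)  # mimic a dead tree's last year
            truncated.append({yr: w for yr, w in series.items() if yr <= cutoff})
        return truncated

    def recent_growth(series, window=5):
        """Mean ring width over the last `window` years of a series,
        a typical recent-growth predictor for a mortality model."""
        years = sorted(series)[-window:]
        return sum(series[y] for y in years) / len(years)
    ```

    After truncation, `recent_growth` computed on live and dead trees refers to overlapping calendar periods, so a logistic regression on this predictor is less confounded by disturbance or climate trends.
    
    
    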

    Fitting a Normal Distribution When the Model is Wrong

    Keywords: local likelihood, semi-parametric inference, robust estimation, model misspecification.

    The Application of Rule-Based Methods to Class Prediction Problems in Genomics

    We propose a method for constructing classifiers using logical combinations of elementary rules. The method is a form of rule-based classification, which has been widely discussed in the literature. In this work we focus specifically on issues that arise in the context of classifying cell samples based on RNA or protein expression measurements. The basic idea is to specify elementary rules that exhibit a locally strong pattern in favor of a single class. Strict admissibility criteria are imposed to produce a manageable universe of elementary rules. Then the elementary rules are combined using a set covering algorithm to form a composite rule that achieves a perfect fit to the training data. The user has explicit control over a parameter that determines the composite rule's level of redundancy and parsimony. This built-in control, along with the simplicity of interpreting the rules, makes the method particularly useful for classification problems in genomics. We demonstrate the new method using several microarray datasets and examine its generalization performance. We also draw comparisons to other machine-learning strategies such as CART, ID3, and C4.5.
    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/63431/1/106652703322539033.pd
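    The set-covering step in the abstract, combining elementary rules into a composite rule that covers all training samples of the target class, can be illustrated with a greedy set-cover heuristic. This is a generic sketch under assumed interfaces (rules as `(name, predicate)` pairs), not the paper's algorithm or its admissibility criteria.

    ```python
    def greedy_rule_cover(rules, samples):
        """Greedily select elementary rules until every training sample of the
        target class is covered; the composite rule is the OR of the chosen
        rules. `rules` is a list of (name, predicate) pairs, where each
        predicate(sample) -> bool. Illustrative sketch only."""
        uncovered = set(range(len(samples)))
        chosen = []
        while uncovered:
            # Pick the rule covering the most still-uncovered samples.
            name, pred, covered = max(
                ((n, p, {i for i in uncovered if p(samples[i])}) for n, p in rules),
                key=lambda t: len(t[2]),
            )
            if not covered:
                raise ValueError("remaining samples cannot be covered by any rule")
            chosen.append(name)
            uncovered -= covered
        return chosen
    ```

    Varying how aggressively rules are pruned after selection is one way to expose the redundancy/parsimony trade-off the abstract mentions as a user-controlled parameter.
    
    
    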

    Statistical prediction methods with particular reference to the social sciences

    SIGLE LD:8318.172(SSRC-HR--4916) / BLDSC - British Library Document Supply Centre, GB, United Kingdom

    Predicting reoffending for discretionary conditional release

    SIGLE: Available from British Library Document Supply Centre-DSC:4326.11(150) / BLDSC - British Library Document Supply Centre, GB, United Kingdom