
    On Ill-Conditioned Generalized Estimating Equations and Toward Unified Biased Estimation

    I address the issue of ill-conditioned regressors within generalized estimating equations (GEEs). In such a setting, standard GEE approaches can suffer from convergence problems, large coefficient variances, poor prediction, and deflated power of tests; in some extreme cases, e.g. with functional regressors, the estimator may not even exist. I modify the quasi-likelihood score functions while presenting a variety of biased estimators that simultaneously address the issues of (severe) ill-conditioning and correlated response variables. To simplify the presentation, I unite or link these estimators as much as possible. Some properties, as well as guidelines for choosing the meta or penalty parameters, are suggested.
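One family of biased estimators in this vein is ridge-type penalization. A minimal sketch for ordinary least squares (not the GEE setting of the paper, and not the author's estimator) shows how a small L2 penalty stabilizes coefficients under near-collinearity; the data, penalty value, and variable names are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + 1e-4 * rng.normal(size=n)      # nearly collinear copy of x1
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.normal(size=n)

lam = 1.0                                 # penalty ("meta") parameter
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)

# The penalized solution splits the load evenly across the two collinear
# regressors and stays near the true coefficients (1, 1); the unpenalized
# normal equations are severely ill-conditioned here.
print("ridge:", np.round(beta_ridge, 2))
```

The choice of `lam` is exactly the kind of penalty-parameter question the abstract refers to.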

    Varying Coefficient Tensor Models for Brain Imaging

    We revisit a multidimensional varying-coefficient model (VCM) by allowing regressor coefficients to vary smoothly in more than one dimension, thereby extending the VCM of Hastie and Tibshirani. The motivating example is 3-dimensional, involving a special type of nuclear magnetic resonance measurement technique that is being used to estimate the diffusion tensor at each point in the human brain. We aim to improve the current state of the art, which is to apply a multiple regression model for each voxel separately, using information from six or more volume images. We present a model, based on P-spline tensor products, to introduce spatial smoothness of the estimated diffusion tensor. Since the regression design matrix is space-invariant, a 4-dimensional tensor product model results, allowing more efficient computation with penalized array regression.
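The voxelwise baseline mentioned above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the standard log-linearized Stejskal-Tanner relation log(S_i/S_0) = -b g_i^T D g_i and solves for the six unique entries of the symmetric tensor D by least squares; the function name and example values are hypothetical.

```python
import numpy as np

def fit_tensor(signals, s0, bvecs, bval):
    """Least-squares diffusion tensor fit for a single voxel."""
    g = np.asarray(bvecs, dtype=float)    # unit gradient directions, shape (m, 3)
    # one design row per direction: [gx^2, gy^2, gz^2, 2 gx gy, 2 gx gz, 2 gy gz]
    G = np.column_stack([g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
                         2 * g[:, 0] * g[:, 1],
                         2 * g[:, 0] * g[:, 2],
                         2 * g[:, 1] * g[:, 2]])
    y = -np.log(np.asarray(signals, dtype=float) / s0) / bval
    d = np.linalg.lstsq(G, y, rcond=None)[0]   # Dxx, Dyy, Dzz, Dxy, Dxz, Dyz
    return np.array([[d[0], d[3], d[4]],
                     [d[3], d[1], d[5]],
                     [d[4], d[5], d[2]]])

# invented example: six classic gradient directions and a known tensor
dirs = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1],
                 [1, -1, 0], [1, 0, -1], [0, 1, -1]], dtype=float)
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
D_true = np.array([[1.0, 0.1, 0.0],
                   [0.1, 0.7, 0.1],
                   [0.0, 0.1, 0.4]]) * 1e-3
bval, s0 = 1000.0, 1.0
signals = s0 * np.exp(-bval * np.einsum('ij,jk,ik->i', dirs, D_true, dirs))
D_hat = fit_tensor(signals, s0, dirs, bval)
```

With exactly six directions the system is determined; the paper's point is that fitting each voxel in isolation like this ignores spatial smoothness.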

    Space-Varying Coefficient Models for Brain Imaging

    The methodological development and the application in this paper originate from diffusion tensor imaging (DTI), a powerful nuclear magnetic resonance technique enabling diagnosis and monitoring of several diseases as well as reconstruction of neural pathways. We reformulate the current analysis framework of separate voxelwise regressions as a 3d space-varying coefficient model (VCM) for the entire set of DTI images recorded on a 3d grid of voxels. By borrowing strength from spatially adjacent voxels, smoothing noisy observations, and estimating diffusion tensors at any location within the brain, the VCM simultaneously overcomes the three-step cascade of standard data processing. We conceptualize two VCM variants based on B-spline basis functions: a full tensor product approach and a sequential approximation, rendering the VCM numerically and computationally feasible even for the huge dimension of the joint model in a realistic setup. A simulation study shows that both approaches outperform the standard method of voxelwise regressions with subsequent regularization. Owing to its greater efficiency, we apply the sequential method to a clinical DTI data set and demonstrate the model's inherent ability to increase the rigid grid resolution by evaluating the incorporated basis functions at intermediate points. In conclusion, the suggested fitting methods clearly improve the current state of the art, but amelioration of local adaptivity remains desirable.

    Twenty years of P-splines

    P-splines first appeared in the limelight twenty years ago. Since then they have become popular in applications and in theoretical work. The combination of a rich B-spline basis and a simple difference penalty lends itself well to a variety of generalizations, because it is based on regression. In effect, P-splines allow the building of a “backbone” for the “mixing and matching” of a variety of additive smooth structure components, while inviting all sorts of extensions: varying-coefficient effects, signal (functional) regressors, two-dimensional surfaces, non-normal responses, quantile (expectile) modelling, among others. Strong connections with mixed models and Bayesian analysis have been established. We give an overview of many of the central developments during the first two decades of P-splines.
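The B-spline-basis-plus-difference-penalty recipe surveyed here can be sketched in a few lines. This is an illustrative example, not code from the review: the knot layout, segment count, and penalty weight are arbitrary assumptions, and the basis is built by the standard Cox-de Boor recursion.

```python
import numpy as np

def bspline_basis(x, knots, degree):
    """Design matrix of B-spline basis functions via Cox-de Boor recursion."""
    # degree-0 basis: indicator of each knot interval
    B = np.array([(x >= knots[i]) & (x < knots[i + 1])
                  for i in range(len(knots) - 1)], dtype=float).T
    for d in range(1, degree + 1):
        Bn = np.zeros((len(x), B.shape[1] - 1))
        for i in range(B.shape[1] - 1):
            left = (x - knots[i]) / (knots[i + d] - knots[i])
            right = (knots[i + d + 1] - x) / (knots[i + d + 1] - knots[i + 1])
            Bn[:, i] = left * B[:, i] + right * B[:, i + 1]
        B = Bn
    return B

# invented example: smooth a noisy sine with cubic B-splines and a
# 2nd-order difference penalty on adjacent coefficients (the P-spline recipe)
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.2 * rng.normal(size=len(x))

degree, n_seg = 3, 20
knots = np.linspace(-degree / n_seg, 1 + degree / n_seg, n_seg + 2 * degree + 1)
B = bspline_basis(x, knots, degree)
D = np.diff(np.eye(B.shape[1]), n=2, axis=0)   # 2nd-order difference matrix
lam = 1.0                                       # smoothing parameter
coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
fit = B @ coef
```

Because the penalty acts on coefficients rather than on a derivative integral, the whole fit stays within ordinary penalized regression, which is exactly what makes the extensions listed above straightforward.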

    Computer-based and paper-and-pencil tests: A study in calculus for STEM majors

    Computer-based testing is an expanding use of technology that offers advantages to teachers and students. We studied Calculus II classes for STEM majors using different testing modes. Three sections with 324 students employed paper-and-pencil testing, computer-based testing, or both. Computer-based tests gave immediate feedback and allowed multiple submissions and pooling. Paper-and-pencil tests required written work and explanation, allowing inspection of high-cognitive-demand tasks. Each testing mode played to its strengths. Students were given the same lectures by the same instructor on the same days, with the same homework assignments and due dates. The design is quasi-experimental, but students were not aware of the testing mode at registration. Two basic questions were examined: (1) Do paper-and-pencil and computer-based tests measure knowledge and skill in STEM Calculus II in a consistent manner? (2) How does the knowledge and skill gained by students in a fully computer-based Calculus II class compare to that of students in a class requiring paper-and-pencil tests, and hence some paper-and-pencil work? The results indicate that computer-based tests are as consistent with paper-and-pencil tests as computer-based tests are with themselves. Results are also consistent with classes using paper-and-pencil tests having slightly better outcomes than fully computer-based classes using only computer assessments.

    Transforms of pseudo-Boolean random variables

    As in earlier works, we consider {0, 1}n as a sample space with a probability measure on it, thus making pseudo-Boolean functions into random variables. Under the assumption that the coordinate random variables are independent, we show it is very easy to give an orthonormal basis for the space of pseudo-Boolean random variables of degree at most k. We use this orthonormal basis to find the transform of a given pseudo-Boolean random variable and to answer various least squares minimization questions. © 2009 Elsevier B.V. All rights reserved.
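A minimal sketch of this construction under the stated independence assumption (the function and probabilities below are invented for illustration): for each subset S of coordinates, phi_S(x) = prod_{i in S} (x_i - p_i)/sqrt(p_i(1 - p_i)) is orthonormal under the product measure, the transform of f collects f_S = E[f phi_S], and truncating to |S| <= k yields the degree-k least-squares approximation.

```python
import numpy as np
from itertools import combinations, product

def transform(f, p):
    """Transform of a pseudo-Boolean random variable on {0,1}^n,
    assuming independent coordinates with P(x_i = 1) = p[i]."""
    p = np.asarray(p, dtype=float)
    n = len(p)
    pts = np.array(list(product([0, 1], repeat=n)), dtype=float)
    w = np.prod(np.where(pts == 1, p, 1 - p), axis=1)    # product measure
    vals = np.array([f(x) for x in pts])
    coeffs = {}
    for size in range(n + 1):
        for S in combinations(range(n), size):
            phi = np.ones(len(pts))
            for i in S:
                phi *= (pts[:, i] - p[i]) / np.sqrt(p[i] * (1 - p[i]))
            coeffs[S] = float(np.sum(w * vals * phi))    # E[f * phi_S]
    return coeffs

# invented example: XOR of two fair coins
c = transform(lambda x: float(int(x[0]) ^ int(x[1])), [0.5, 0.5])
# dropping the |S| = 2 term gives the best degree-1 (linear) approximation
```

For uniform p = 1/2 this reduces to the familiar Walsh-Fourier transform with basis functions 2x_i - 1.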

    Characteristics of Injuries in the Logging Industry of Louisiana, USA: 1986 to 1998

    Characterizing injuries and their trends will allow safety managers to concentrate their resources on the areas of safety that will be most effective in the workplace. Injuries reported to the Louisiana Office of Workers' Compensation Administration for 1986 to 1998 were characterized according to the part of the body affected, the nature of the injury, the source of the injury, and the type of accident for the timber harvesting industry. Many of the injuries in the logging sector were sprains/strains to the knees. Injuries resulting from falling onto structures and surfaces were common and rising. Although the number of accidents in each category is generally decreasing, some trends should be of concern. There was no significant linear trend in overall accident rates since 1991. While the proportion of cuts and lacerations declined, the proportion of fractures increased. This coincided with a time period when logging operations in Louisiana experienced rapid mechanization and insurance companies started enforcing the use of personal protective equipment. The proportion of transportation accidents rose more than any other category. Some suggestions on focusing and improving current safety programs are given. The need for continued and improved training of managers and employees seems to be most critical.

    Myo-Inositol and phytate are toxic to Formosan subterranean termites (Isoptera: Rhinotermitidae)

    © 2014 Entomological Society of America. Several rare and common monosaccharides were screened for toxic effects on the Formosan subterranean termite, Coptotermes formosanus Shiraki, with the aim of identifying environmentally friendly termiticides. myo-Inositol and phytic acid, which are nontoxic to mammals, were identified as potential termite control compounds. Feeding bioassays with termite workers, where both compounds were supplied on filter paper in concentrations from 160.2 to 1,281.7 μg/mm3, showed concentration-dependent toxicity within 2 wk. Interestingly, myo-inositol was nontoxic when administered to termites in agar (40 mg/ml) in the absence of a cellulosic food source, an unexplained phenomenon. In addition, decreased populations of termite hindgut protozoa were observed upon feeding on myo-inositol-spiked but not phytate-spiked filter paper. Radiotracer feeding studies using myo-inositol-[2-3H] with worker termites showed no metabolism after ingestion over a 2-d feeding period, ruling out metabolites as responsible for the selective toxicity.

    Inequalities' Impacts: State of the Art Review

    By way of introduction: this report provides the firm foundation for anchoring the research that will be performed by the GINI project. It subsequently considers the fields covered by each of the main work packages: ● inequalities of income, wealth and education, ● social impacts, ● political and cultural impacts, and ● policy effects on and of inequality. Though extensive, this review does not pretend to be exhaustive. The review may be “light” in some respects and can be expanded as the analysis evolves. In each of the four fields a significant number of discussion papers will be produced, in total well over 100. These will add to the state of the art while also covering new ground and generating results that will be incorporated in the Analysis Reports to be prepared for the work packages. In that sense, the current review provides the starting point. At the same time, the existing body of knowledge is broader or deeper depending on the particular field and its tradition of research. The very motivation of GINI’s focused study of the impacts of inequalities is that a systematic study is lacking and relatively little is known about those impacts. This also holds for the complex collection of effects that inequality can have on policy making, and the contributions that policies can make to mitigating inequalities but also to enhancing them. By contrast, analyses of inequality itself are many, not least because there is a wide array of inequalities; inequalities have become more easily studied comparatively, and much of that analysis has a significant descriptive flavour that includes an extensive discussion of measurement issues. GINI hopes to go beyond that and cover the impacts of inequalities at the same time.