    A Quantile Variant of the EM Algorithm and Its Applications to Parameter Estimation with Interval Data

    The expectation-maximization (EM) algorithm is a powerful computational technique for finding the maximum likelihood estimates for parametric models when the data are not fully observed. The EM algorithm is best suited for situations where the expectation in each E-step and the maximization in each M-step are straightforward. A difficulty with the implementation of the EM algorithm is that each E-step requires the integration of the log-likelihood function in closed form. The explicit integration can be avoided by using what is known as the Monte Carlo EM (MCEM) algorithm. The MCEM uses a random sample to estimate the integral at each E-step. However, the problem with the MCEM is that it often converges to the integral quite slowly and the convergence behavior can also be unstable, which causes a computational burden. In this paper, we propose what we refer to as the quantile variant of the EM (QEM) algorithm. We prove that the proposed QEM method has an accuracy of O(1/K^2), while the MCEM method has an accuracy of O_p(1/√K). Thus, the proposed QEM method possesses faster and more stable convergence properties when compared with the MCEM algorithm. The improved performance is illustrated through numerical studies. Several practical examples illustrating its use in interval-censored data problems are also provided.
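
    As a rough illustration of the contrast drawn above, the sketch below approximates a one-dimensional E-step expectation with K random draws (MCEM-style) and with K deterministic midpoint quantiles (QEM-style). The normal latent variable, the function g, and the quantile grid are illustrative assumptions, not the paper's construction.

```python
# Hypothetical one-dimensional illustration: approximating an E-step expectation
# E[g(Z)] for a latent Z ~ N(mu, sigma^2) by random sampling (MCEM-style) versus
# deterministic midpoint quantiles (QEM-style).  This sketches the general idea
# only; it is not the paper's QEM algorithm.
import random
from statistics import NormalDist

def mcem_estimate(g, mu, sigma, K, seed=0):
    """Average g over K random draws; Monte Carlo error is O_p(1/sqrt(K))."""
    rng = random.Random(seed)
    return sum(g(rng.gauss(mu, sigma)) for _ in range(K)) / K

def qem_estimate(g, mu, sigma, K):
    """Average g over K midpoint quantiles; deterministic, no sampling noise."""
    dist = NormalDist(mu, sigma)
    return sum(g(dist.inv_cdf((k + 0.5) / K)) for k in range(K)) / K

# Example: E[Z^2] with Z ~ N(0, 1) is exactly 1.
g = lambda z: z * z
print(mcem_estimate(g, 0.0, 1.0, K=200))  # fluctuates around 1 across seeds
print(qem_estimate(g, 0.0, 1.0, K=200))   # deterministic and close to 1
```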

    A reference relative time-scale as an alternative to chronological age for cohorts with long follow-up

    Background: Epidemiologists have debated the appropriate time-scale for cohort survival studies; chronological age and time-on-study are two such time-scales. Importantly, the assessment of risk factors may depend on the choice of time-scale. Recently, chronological or attained age has gained support, but a case can be made for a ‘reference relative time-scale’ as an alternative that circumvents difficulties arising with this and other scales. The reference relative time of an individual participant is the integral of a reference population hazard function between the time of entry and the time of exit of the individual. The objective here is to describe the reference relative time-scale, illustrate its use, make a comparison with attained age by simulation, and explain its relationship to modern and traditional epidemiologic methods.
    Results: A comparison was made between two models: a stratified Cox model with age as the time-scale versus an un-stratified Cox model using the reference relative time-scale. The illustrative comparison used a UK cohort of cotton workers, with differing ages at entry to the study, accrual over a time period, and long follow-up. Additionally, exponential and Weibull models were fitted, since the reference relative time-scale analysis need not be restricted to the Cox model. A simulation study showed that analysis using the reference relative time-scale and analysis using chronological age had very similar power to detect a significant risk factor, and both were unbiased. Further, the analysis using the reference relative time-scale supported fully parametric survival modelling and allowed percentile predictions and mortality curves to be constructed.
    Conclusions: The reference relative time-scale was a viable alternative to chronological age, led to simplification of the modelling process, and possessed the features of a good time-scale as defined in reliability theory. The reference relative time-scale has several interpretations and provides a unifying concept that links contemporary approaches in survival and reliability analysis to the traditional epidemiologic methods of Poisson regression and standardised mortality ratios. The community of practitioners has not previously made this connection.
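
    To make the quantity defined above concrete, the sketch below integrates a reference-population hazard between an individual's entry and exit ages. The Gompertz hazard and its parameters are illustrative assumptions, not values from the cotton-workers cohort.

```python
# Minimal sketch of the time-scale described above: the reference relative time
# of a participant is the integral of a reference-population hazard between entry
# and exit.  The Gompertz hazard and its parameters are illustrative assumptions.
import math

def reference_hazard(age, a=1e-5, b=0.085):
    """Hypothetical Gompertz reference-population hazard at a given age (years)."""
    return a * math.exp(b * age)

def reference_relative_time(entry_age, exit_age, step=0.01):
    """Integrate the reference hazard from entry to exit (trapezoidal rule)."""
    n = max(1, int(round((exit_age - entry_age) / step)))
    h = (exit_age - entry_age) / n
    total = 0.5 * (reference_hazard(entry_age) + reference_hazard(exit_age))
    total += sum(reference_hazard(entry_age + i * h) for i in range(1, n))
    return total * h

# A participant entering at age 40 and exiting at age 65 accrues this much
# reference relative time, which can then replace age as the survival time-scale.
print(reference_relative_time(40.0, 65.0))
```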

    Integrating knowledge of multitasking and interruptions across different perspectives and research methods

    Multitasking and interruptions have been studied using a variety of methods in multiple fields (e.g., HCI, cognitive science, computer science, and social sciences). This diversity brings many complementary insights. However, it also challenges researchers to understand how seemingly disparate ideas can best be integrated to further theory and to inform the design of interactive systems. There is therefore a need for a platform to discuss how different approaches to understanding multitasking and interruptions can be combined to provide insights that are more than the sum of their parts. In this article, we argue for the necessity of an integrative approach. As part of this argument, we provide an overview of the articles in this special issue on multitasking and interruptions. These articles showcase the variety of methods currently used to study multitasking and interruptions. It is clear that there are many challenges to studying multitasking and interruptions from different perspectives and using different techniques. We advance a six-point research agenda for the future of multi-method research on this important and timely topic.

    The legal imperative for treating rare disorders

    Unsupervised Bayesian linear unmixing of gene expression microarrays

    Background: This paper introduces a new constrained model and the corresponding algorithm, called unsupervised Bayesian linear unmixing (uBLU), to identify biological signatures from high-dimensional assays such as gene expression microarrays. The basis for uBLU is a Bayesian model in which the data samples are represented as an additive mixture of random positive gene signatures, called factors, with random positive mixing coefficients, called factor scores, that specify the relative contribution of each signature to a specific sample. A distinctive feature of the proposed method is that uBLU constrains the factor loadings to be non-negative and the factor scores to be probability distributions over the factors. Furthermore, it also provides estimates of the number of factors. A Gibbs sampling strategy is adopted here to generate random samples according to the posterior distribution of the factors, factor scores, and number of factors. These samples are then used to estimate all the unknown parameters.
    Results: Firstly, the proposed uBLU method is applied to several simulated datasets with known ground truth and compared with previous factor decomposition methods, such as principal component analysis (PCA), non-negative matrix factorization (NMF), Bayesian factor regression modeling (BFRM), and the gradient-based algorithm for general matrix factorization (GB-GMF). Secondly, we illustrate the application of uBLU on a real, time-evolving gene expression dataset from a recent viral challenge study in which individuals were inoculated with influenza A/H3N2/Wisconsin. We show that the uBLU method significantly outperforms the other methods on the simulated and real data sets considered here.
    Conclusions: The results obtained on synthetic and real data illustrate the accuracy of the proposed uBLU method when compared to other factor decomposition methods from the literature (PCA, NMF, BFRM, and GB-GMF). The uBLU method identifies an inflammatory component closely associated with clinical symptom scores collected during the study. Using a constrained model allows recovery of all the inflammatory genes in a single factor.
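
    The constraint structure described above can be written down in a few lines. The sketch below only generates synthetic data under those constraints (non-negative signatures, factor scores on the simplex); it does not reproduce the paper's Gibbs sampler, and the dimensions and the exponential/Dirichlet choices are illustrative assumptions.

```python
# Sketch of the constrained mixing model described above: each sample is an
# additive combination of non-negative factor signatures, with factor scores that
# are non-negative and sum to one (a probability distribution over the factors),
# plus noise.  Sizes and distributions are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
G, N, R = 500, 30, 4                           # genes, samples, factors

M = rng.exponential(scale=1.0, size=(G, R))    # non-negative factor signatures
A = rng.dirichlet(alpha=np.ones(R), size=N).T  # factor scores: columns sum to one
noise = rng.normal(scale=0.05, size=(G, N))

Y = M @ A + noise                              # observed expression (genes x samples)

assert np.all(M >= 0)                          # non-negativity of the factors
assert np.allclose(A.sum(axis=0), 1.0)         # simplex constraint on factor scores
```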

    Critical animal and media studies: Expanding the understanding of oppression in communication research

    Critical and communication studies have traditionally neglected the oppression inflicted by humans on other animals. However, our (mis)treatment of other animals is the result of public consent supported by a morally speciesist-anthropocentric system of values. Speciesism or anthroparchy, as much as any other mainstream ideology, feeds the media and is at the same time perpetuated by them. The goal of this article is to remedy this neglect by introducing the subdiscipline of Critical Animal and Media Studies. Critical Animal and Media Studies takes inspiration both from critical animal studies – which is so far the most consolidated critical field of research in the social sciences addressing our exploitation of other animals – and from the normative-moral stance rooted in the cornerstones of traditional critical media studies. The authors argue that the Critical Animal and Media Studies approach is an unavoidable step forward for critical media and communication studies to engage with the expanded circle of concerns of contemporary ethical thinking.

    Temperature variability implies greater economic damages from climate change

    A number of influential assessments of the economic cost of climate change rely on just a small number of coupled climate–economy models. A central feature of these assessments is their accounting of the economic cost of epistemic uncertainty—the part of our uncertainty stemming from our inability to precisely estimate key model parameters, such as the equilibrium climate sensitivity. However, these models fail to account for the cost of aleatory uncertainty—the irreducible uncertainty that remains even when the true parameter values are known. We show how to account for this second source of uncertainty in a physically well-founded and tractable way, and we demonstrate that even modest temperature variability implies trillions of dollars of previously unaccounted-for economic damages.
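
    A minimal sketch of why variability can matter, under assumptions that are not the paper's model: with a convex damage function, expected damages under year-to-year temperature variability exceed the damages evaluated at the mean temperature (Jensen's inequality). The quadratic damage curve and the numbers below are purely illustrative.

```python
# Illustrative sketch only: convex damages plus temperature variability imply
# higher expected damages than evaluating damages at the mean temperature.
# The quadratic damage function and all numbers are assumptions for illustration.
import random

def damage_fraction(temp_anomaly, k=0.002):
    """Hypothetical convex damages as a fraction of GDP, roughly k * T^2."""
    return k * temp_anomaly ** 2

rng = random.Random(1)
mean_warming, variability_sd = 3.0, 0.6        # degrees C, illustrative values
draws = [rng.gauss(mean_warming, variability_sd) for _ in range(100_000)]

expected_with_variability = sum(damage_fraction(t) for t in draws) / len(draws)
at_the_mean = damage_fraction(mean_warming)

print(at_the_mean)                # damages if variability is ignored
print(expected_with_variability)  # strictly larger on average
```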