
    A Framework for Quality-Driven Delivery in Distributed Multimedia Systems

    In this paper, we propose a framework for Quality-Driven Delivery (QDD) in distributed multimedia environments. Quality-driven delivery refers to the capacity of a system to deliver documents, or more generally objects, while taking into account users' expectations in terms of non-functional requirements. For this QDD framework, we propose a model-driven approach in which we focus on QoS information modeling and transformation. QoS information models and meta-models are used during different QoS activities: mapping requirements to system constraints, exchanging QoS information, checking compatibility between QoS information and, more generally, making QoS decisions. We also investigate which model transformation operators must be implemented to support QoS activities such as QoS mapping.
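The core QoS-mapping activity described above can be illustrated with a minimal sketch: user-level expectations are transformed into system-level constraints and then checked for compatibility against available capacity. All dimension names, levels, and thresholds below are illustrative assumptions, not the paper's actual models.

```python
# Hypothetical QoS mapping sketch: user-level requirements are translated
# into system constraints, which are then checked against system capacity.
# Dimension names, levels, and numeric thresholds are illustrative only.
USER_TO_SYSTEM = {
    "video_quality": {
        "high": {"min_bandwidth_kbps": 4000, "max_latency_ms": 100},
        "low":  {"min_bandwidth_kbps": 500,  "max_latency_ms": 400},
    },
}

def map_requirements(reqs):
    """Transform user-level QoS requirements into system-level constraints."""
    constraints = {}
    for dimension, level in reqs.items():
        constraints.update(USER_TO_SYSTEM[dimension][level])
    return constraints

def compatible(constraints, capacity):
    """Check whether the available capacity can satisfy the mapped constraints."""
    return (capacity["bandwidth_kbps"] >= constraints["min_bandwidth_kbps"]
            and capacity["latency_ms"] <= constraints["max_latency_ms"])

c = map_requirements({"video_quality": "high"})
print(compatible(c, {"bandwidth_kbps": 6000, "latency_ms": 80}))  # True
```

In a full model-driven setting these dictionaries would be instances of QoS meta-models and the mapping would be a model transformation, but the shape of the operation is the same.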

    Interpretable Subgroup Discovery in Treatment Effect Estimation with Application to Opioid Prescribing Guidelines

    The dearth of prescribing guidelines for physicians is one key driver of the current opioid epidemic in the United States. In this work, we analyze medical and pharmaceutical claims data to draw insights on characteristics of patients who are more prone to adverse outcomes after an initial synthetic opioid prescription. Toward this end, we propose a generative model that allows discovery from observational data of subgroups that demonstrate an enhanced or diminished causal effect due to treatment. Our approach models these sub-populations as a mixture distribution, using sparsity to enhance interpretability, while jointly learning nonlinear predictors of the potential outcomes to better adjust for confounding. The approach leads to human-interpretable insights on discovered subgroups, improving the practical utility for decision support.
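The idea of subgroups with enhanced or diminished causal effects can be illustrated with a toy potential-outcomes simulation. This is a minimal sketch, not the paper's generative mixture model: the covariate, the subgroup split, and the effect sizes are invented for illustration, and the "discovery" step is replaced by a known threshold.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
x = rng.uniform(0, 1, n)      # hypothetical patient covariate
t = rng.integers(0, 2, n)     # randomized treatment indicator

# Assumed ground truth: the effect is enhanced for x > 0.5, near zero otherwise.
tau = np.where(x > 0.5, 2.0, 0.0)
y = 1.0 + tau * t + rng.normal(0.0, 1.0, n)   # observed outcome

def cate(mask):
    """Difference-of-means treatment effect estimate within a subgroup."""
    return y[mask & (t == 1)].mean() - y[mask & (t == 0)].mean()

print("enhanced subgroup effect:", cate(x > 0.5))    # close to 2.0
print("diminished subgroup effect:", cate(x <= 0.5)) # close to 0.0
```

The paper's contribution is learning such subgroups from observational data (with confounding adjustment and sparsity for interpretability) rather than assuming the split is known, as this sketch does.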

    Dynamic Energy Management for Chip Multi-processors under Performance Constraints

    We introduce a novel algorithm for dynamic energy management (DEM) under performance constraints in chip multi-processors (CMPs). Using the novel concept of delayed instructions count, performance loss estimates are calculated at the end of each control period for each core. In addition, a Kalman filtering-based approach is employed to predict the workload in the next control period, for which voltage-frequency pairs must be selected. This selection is done with a novel dynamic voltage and frequency scaling (DVFS) algorithm whose objective is to reduce energy consumption without degrading performance beyond a user-set threshold. Using our customized Sniper-based CMP system simulation framework, we demonstrate the effectiveness of the proposed algorithm on a variety of benchmarks for 16-core and 64-core network-on-chip-based CMP architectures. Simulation results show consistent energy savings across the board. We present our work as an investigation of the tradeoff between the energy reduction achievable via DVFS and the performance penalty threshold when predictions are made using the Kalman filter.
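The per-core control loop described above can be sketched in two pieces: a scalar Kalman filter that predicts next-period workload, and a V-F selection step that picks the lowest-power pair whose estimated performance loss stays under the user-set threshold. This is a hedged illustration, not the paper's algorithm: the V-F table, the random-walk workload model, and the utilization-based loss estimate are all assumptions.

```python
# Hypothetical V-F table: (frequency in GHz, relative power), sorted by power.
VF_PAIRS = [(1.0, 0.3), (1.5, 0.55), (2.0, 1.0)]

class Kalman1D:
    """Scalar Kalman filter tracking per-core workload (demand in GHz),
    using a random-walk state model with assumed noise variances q and r."""
    def __init__(self, q=1e-3, r=1e-2):
        self.x, self.p, self.q, self.r = 0.0, 1.0, q, r

    def step(self, z):
        self.p += self.q                # predict: state variance grows
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (z - self.x)      # correct with measurement z
        self.p *= (1.0 - k)
        return self.x                   # predicted workload for next period

def select_vf(demand_ghz, loss_threshold=0.05):
    """Pick the lowest-power V-F pair whose estimated performance loss
    (unserved fraction of demand) stays under the threshold."""
    for freq, power in VF_PAIRS:
        loss = max(0.0, 1.0 - freq / demand_ghz)
        if loss <= loss_threshold:
            return freq, power
    return VF_PAIRS[-1]                 # fall back to the highest frequency

kf = Kalman1D()
for _ in range(100):
    predicted = kf.step(1.4)            # noisy-free demo: constant 1.4 GHz demand
print(select_vf(predicted))
```

In the paper the loss estimate comes from the delayed-instructions count rather than this simple utilization ratio, but the structure of the control period (measure, predict, select V-F pair) is the same.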

    Tensions and paradoxes in electronic patient record research: a systematic literature review using the meta-narrative method

    Background: The extensive and rapidly expanding research literature on electronic patient records (EPRs) presents challenges to systematic reviewers. This literature is heterogeneous and at times conflicting, not least because it covers multiple research traditions with different underlying philosophical assumptions and methodological approaches. Aim: To map, interpret and critique the range of concepts, theories, methods and empirical findings on EPRs, with a particular emphasis on the implementation and use of EPR systems. Method: Using the meta-narrative method of systematic review, and applying search strategies that took us beyond the Medline-indexed literature, we identified over 500 full-text sources. We used 'conflicting' findings to address higher-order questions about how the EPR and its implementation were differently conceptualised and studied by different communities of researchers. Main findings: Our final synthesis included 24 previous systematic reviews and 94 additional primary studies, most of the latter from outside the biomedical literature. A number of tensions were evident, particularly in relation to: [1] the EPR ('container' or 'itinerary'); [2] the EPR user ('information-processor' or 'member of socio-technical network'); [3] organizational context ('the setting within which the EPR is implemented' or 'the EPR-in-use'); [4] clinical work ('decision-making' or 'situated practice'); [5] the process of change ('the logic of determinism' or 'the logic of opposition'); [6] implementation success ('objectively defined' or 'socially negotiated'); and [7] complexity and scale ('the bigger the better' or 'small is beautiful'). Findings suggest that integration of EPRs will always require human work to re-contextualize knowledge for different uses; that whilst secondary work (audit, research, billing) may be made more efficient by the EPR, primary clinical work may be made less efficient; that paper, far from being technologically obsolete, currently offers greater ecological flexibility than most forms of electronic record; and that smaller systems may sometimes be more efficient and effective than larger ones. Conclusions: The tensions and paradoxes revealed in this study extend and challenge previous reviews and suggest that the evidence base for some EPR programs is more limited than is often assumed. We offer this paper as a preliminary contribution to a much-needed debate on this evidence and its implications, and suggest avenues for new research.

    Stochastic Yield Catastrophes and Robustness in Self-Assembly

    A guiding principle in self-assembly is that, for high production yield, nucleation of structures must be significantly slower than their growth. However, details of the mechanism that impedes nucleation are broadly considered irrelevant. Here, we analyze self-assembly into finite-sized target structures employing mathematical modeling. We investigate two key scenarios to delay nucleation: (i) introducing a slow activation step for the assembling constituents and (ii) decreasing the dimerization rate. These scenarios have widely different characteristics. While the dimerization scenario exhibits robust behavior, the activation scenario is highly sensitive to demographic fluctuations. These demographic fluctuations ultimately disfavor growth compared to nucleation and can suppress yield completely. The occurrence of this stochastic yield catastrophe does not depend on model details but is generic as soon as number fluctuations between constituents are taken into account. From a broader perspective, our results reveal that stochasticity is an important limiting factor for self-assembly and that the specific implementation of the nucleation process plays a significant role in determining the yield.
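The slow-dimerization scenario can be illustrated with mean-field rate equations for assembly of a small target: monomers nucleate (dimerize) at rate mu and structures grow by monomer addition at rate nu, so lowering mu lets fewer nuclei form and more of them complete. This is a deterministic sketch with assumed rates and a forward-Euler integrator; it cannot, by construction, show the stochastic yield catastrophe, which requires number fluctuations.

```python
def yield_after(mu, nu=1.0, size=5, dt=0.01, t_end=2000.0):
    """Mean-field rate equations for assembly of a size-`size` target:
    dimerization (nucleation) at rate mu, monomer addition (growth) at rate nu.
    Returns the fraction of initial monomers bound in completed structures.
    Rates, target size, and integrator settings are illustrative assumptions."""
    c = [0.0] * (size + 1)   # c[i] = concentration of i-mers (c[0] unused)
    c[1] = 1.0               # start from monomers only
    for _ in range(int(t_end / dt)):
        m = c[1]
        nucl = mu * m * m                                  # 2 monomers -> dimer
        growth = [nu * m * c[i] for i in range(2, size)]   # i-mer -> (i+1)-mer
        c[1] += dt * (-2.0 * nucl - nu * m * sum(c[2:size]))
        c[2] += dt * (nucl - growth[0])
        for i in range(3, size):
            c[i] += dt * (growth[i - 3] - growth[i - 2])
        c[size] += dt * growth[-1]
        if c[1] < 0.0:
            c[1] = 0.0       # guard against Euler overshoot
    return size * c[size]

# Slower dimerization nucleates fewer structures, so more of them finish:
print("slow nucleation yield:", yield_after(mu=0.01))
print("fast nucleation yield:", yield_after(mu=10.0))
```

With fast dimerization the monomer pool is consumed by excess nuclei that then starve, leaving mass stranded in incomplete intermediates; with slow dimerization nearly all mass ends up in completed structures.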

    A minimal model for congestion phenomena on complex networks

    We study a minimal model of traffic flows in complex networks, simple enough to yield analytical results yet with a very rich phenomenology, presenting continuous, discontinuous as well as hybrid phase transitions between a free-flow phase and a congested phase, critical points, and different scaling behaviors in the system size. It consists of random walkers on a queueing network with one-range repulsion, where particles can be destroyed only if they can move. We focus on the dependence on the topology as well as on the level of traffic control. We are able to obtain transition curves and phase diagrams at the analytical level for the ensemble of uncorrelated networks and numerically for single instances. We find that traffic control improves global performance, enlarging the free-flow region in parameter space, only in heterogeneous networks. Traffic control introduces non-linear effects and, beyond a critical strength, may trigger the appearance of a congested phase in a discontinuous manner. The model also reproduces the cross-over in the scaling of traffic fluctuations empirically observed in the Internet; moreover, a conserved version can reproduce qualitatively some stylized facts of traffic in transportation networks.
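The free-flow/congested transition in queueing-network traffic models can be illustrated with an even simpler toy than the paper's (no one-range repulsion, plain random-walk routing): each node injects walkers at rate p, serves at most one queued walker per step, and moved walkers are absorbed with probability mu. The natural order parameter is the net growth rate of queued walkers, which is near zero in free flow and positive in the congested phase (here the transition sits near p = mu). Topology, rates, and the absorption rule are all assumptions of this sketch.

```python
import random

def congestion_order_parameter(n_nodes, k, p, steps=2000, seed=0):
    """Toy congestion model (an illustration, not the paper's exact dynamics).
    Each step, every node injects a walker with probability p, serves at most
    one queued walker, and each served walker is absorbed with probability mu
    or else hops to a random neighbour. Returns the net growth rate of queued
    walkers per node per step, measured over the second half of the run."""
    rng = random.Random(seed)
    mu = 0.2   # absorption probability per hop (assumed)
    # crude random topology: each node links out to k random other nodes
    nbrs = [rng.sample([j for j in range(n_nodes) if j != i], k)
            for i in range(n_nodes)]
    queue = [0] * n_nodes
    total_mid = 0
    for t in range(steps):
        if t == steps // 2:
            total_mid = sum(queue)      # snapshot after transient
        nxt = queue[:]
        for i in range(n_nodes):
            if rng.random() < p:
                nxt[i] += 1             # inject a new walker
            if queue[i] > 0:
                nxt[i] -= 1             # serve one walker this step
                if rng.random() >= mu:  # survives: hops to a neighbour
                    nxt[rng.choice(nbrs[i])] += 1
        queue = nxt
    growth = (sum(queue) - total_mid) / (n_nodes * (steps - steps // 2))
    return max(0.0, growth)

print("free flow :", congestion_order_parameter(100, 4, p=0.05))
print("congested :", congestion_order_parameter(100, 4, p=0.5))
```

Each walker survives 1/mu hops on average, so the offered load per node is roughly p/mu services per step; queues stay finite while this is below 1 and grow linearly above it, which is the hallmark of the congested phase.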