808 research outputs found

    A Patient-Centered Framework for Evaluating Digital Maturity of Health Services: A Systematic Review

    © Kelsey Flott, Ryan Callahan, Ara Darzi, Erik Mayer. Background: Digital maturity is the extent to which digital technologies are used as enablers to deliver a high-quality health service. Extensive literature exists about how to assess the components of digital maturity, but it has not been used to design a comprehensive framework for evaluation. Consequently, the measurement systems that do exist are limited to evaluating digital programs within one service or care setting, meaning that digital maturity evaluation is not accounting for the needs of patients across their care pathways. Objective: The objective of our study was to identify the best methods and metrics for evaluating digital maturity and to create a novel, evidence-based tool for evaluating digital maturity across patient care pathways. Methods: We systematically reviewed the literature to find the best methods and metrics for evaluating digital maturity. We searched the PubMed database for all papers relevant to digital maturity evaluation. Papers were selected if they provided insight into how to appraise digital systems within the health service and if they indicated the factors that constitute or facilitate digital maturity. Papers were analyzed to identify methodology for evaluating digital maturity and indicators of digitally mature systems. We then used the resulting information about methodology to design an evaluation framework. Following that, the indicators of digital maturity were extracted, grouped into increasing levels of maturity, and operationalized as metrics within the evaluation framework. Results: We identified 28 papers as relevant to evaluating digital maturity, from which we derived 5 themes. The first theme concerned general evaluation methodology for constructing the framework (7 papers). The following 4 themes were the increasing levels of digital maturity: resources and ability (6 papers), usage (7 papers), interoperability (3 papers), and impact (5 papers). The framework includes metrics for each of these levels at each stage of the typical patient care pathway. Conclusions: The framework uses a patient-centric model that departs from traditional service-specific measurements and allows for novel insights into how digital programs benefit patients across the health system.

    Future broadband access network challenges

    Copyright © 2010 IEEE. The convergence of optical and wireless communication systems will activate the potential capacity of photonic technology to deliver the expected growth in interactive video, voice, and data traffic services in a cost-effective and green way. Growth of the broadband internet over the last decade projects that the number of active users will exceed 2 billion globally by the end of 2014. Unlocking the untapped capacity of photonic signal processing is a promising solution for seamless transport of future consumer traffic demand. In this paper, the future traffic growth of the internet, worldwide wireless subscribers, and end-users during the last and next decades is investigated. The challenges of traditional access networks and the Radio over Fiber solution are presented.

    SeaWiFS Technical Report Series. Volume 7: Cloud screening for polar orbiting visible and infrared (IR) satellite sensors

    Methods for detecting and screening cloud contamination from satellite-derived visible and infrared data are reviewed in this document. The methods are applicable to past, present, and future polar orbiting satellite radiometers. Such instruments include the Coastal Zone Color Scanner (CZCS), operational from 1978 through 1986; the Advanced Very High Resolution Radiometer (AVHRR); the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), scheduled for launch in August 1993; and the Moderate Resolution Imaging Spectrometer (MODIS). Constant threshold methods are the least demanding computationally, and often provide adequate results. An improvement to these methods is to determine the thresholds dynamically by adjusting them according to the areal and temporal distributions of the surrounding pixels. Spatial coherence methods set thresholds based on the expected spatial variability of the data. Other statistically derived methods and various combinations of basic methods are also reviewed. The complexity of the methods is ultimately limited by the computing resources. Finally, some criteria for evaluating cloud screening methods are discussed.
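    The constant-threshold screening described above can be sketched in a few lines. This is a hedged illustration only: the function name, band inputs, and threshold values are assumptions for demonstration, not values from the report.

    ```python
    import numpy as np

    def constant_threshold_mask(reflectance, ir_temp, refl_max=0.3, temp_min=270.0):
        """Flag a pixel as cloudy when its visible reflectance is high or its
        IR brightness temperature (kelvin) is low: clouds are bright and cold.
        The threshold values here are illustrative, not from the report."""
        return (reflectance > refl_max) | (ir_temp < temp_min)

    # A dynamic-threshold variant, as the review notes, would replace the
    # fixed refl_max with a per-pixel value derived from the statistics of
    # a surrounding window of pixels.
    ```

    Applied to 2-D reflectance and brightness-temperature arrays, this returns a boolean cloud mask of the same shape.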

    Community detection and role identification in directed networks: understanding the Twitter network of the care.data debate

    With the rise of social media as an important channel for the debate and discussion of public affairs, online social networks such as Twitter have become important platforms for public information and engagement by policy makers. To communicate effectively through Twitter, policy makers need to understand how influence and interest propagate within its network of users. In this chapter we use graph-theoretic methods to analyse the Twitter debate surrounding NHS England's controversial care.data scheme. Directionality is a crucial feature of the Twitter social graph - information flows from the followed to the followers - but is often ignored in social network analyses; our methods are based on the behaviour of dynamic processes on the network and can be applied naturally to directed networks. We uncover robust communities of users and show that these communities reflect how information flows through the Twitter network. We are also able to classify users by their differing roles in directing the flow of information through the network. Our methods and results will be useful to policy makers who would like to use Twitter effectively as a communication medium.
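    One standard dynamic process on a directed graph of the kind the chapter analyses is a random walk with teleportation (PageRank-style). The sketch below is illustrative of that general idea, not necessarily the authors' exact method; the edge convention follows the abstract (information flows from the followed node to its followers).

    ```python
    import numpy as np

    def transition_matrix(adj, alpha=0.85):
        """Row-stochastic random-walk matrix with teleportation for a directed
        graph; adj[i, j] = 1 means an edge i -> j (user j follows user i, so
        information flows i -> j). Dangling nodes teleport uniformly."""
        n = adj.shape[0]
        out_deg = adj.sum(axis=1, keepdims=True)
        P = np.where(out_deg > 0, adj / np.maximum(out_deg, 1), 1.0 / n)
        return alpha * P + (1 - alpha) / n

    def stationary_distribution(adj, alpha=0.85, iters=200):
        """Power iteration for the walk's stationary distribution, a simple
        proxy for where information flow concentrates in the network."""
        n = adj.shape[0]
        pi = np.full(n, 1.0 / n)
        P = transition_matrix(adj, alpha)
        for _ in range(iters):
            pi = pi @ P
        return pi
    ```

    On a directed graph this distribution is generally asymmetric, which is exactly the information an undirected analysis would throw away.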

    PC-SEAPAK user's guide, version 4.0

    PC-SEAPAK is designed to provide a complete and affordable capability for processing and analysis of NOAA Advanced Very High Resolution Radiometer (AVHRR) and Nimbus-7 Coastal Zone Color Scanner (CZCS) data. Since the release of version 3.0 over a year ago, significant revisions were made to the AVHRR and CZCS programs and to the statistical data analysis module, and a number of new programs were added. This new version has 114 procedures listed in its menus. The package continues to emphasize user-friendliness and interactive data analysis. Additionally, because the scientific goals of the ocean color research being conducted have shifted to larger space and time scales, batch processing capabilities were enhanced, allowing large quantities of data to be easily ingested and analyzed. The development of PC-SEAPAK was paralleled by two other activities that were influential and assistive: the global CZCS processing effort at GSFC and the continued development of VAX-SEAPAK. SEAPAK incorporates the instrument calibration and supports all levels of data available from the CZCS archive.

    The shifted Jacobi polynomial integral operational matrix for solving Riccati differential equation of fractional order

    In this article, we have applied shifted Jacobi polynomials to solve the Riccati differential equation of fractional order. To do so, we have presented a general formula for the Jacobi operational matrix of the fractional integral operator. Using the Tau method, the solution of this problem reduces to the solution of a system of algebraic equations. The numerical results for the examples presented in this paper demonstrate the efficiency of the present method.
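    For reference, the problem class and the role of the operational matrix can be written out as follows. This is a standard formulation of the fractional Riccati problem and of operational-matrix methods; the notation is illustrative and not copied from the paper.

    ```latex
    % Fractional Riccati initial-value problem, 0 < \alpha \le 1,
    % with D^{\alpha} a fractional derivative (commonly Caputo):
    D^{\alpha} y(t) = a(t) + b(t)\,y(t) + c(t)\,y^{2}(t), \qquad y(0) = y_{0}.
    % The operational matrix P^{(\alpha)} represents fractional integration
    % I^{\alpha} on the vector \Phi(t) of shifted Jacobi polynomials:
    I^{\alpha} \Phi(t) \approx P^{(\alpha)} \Phi(t).
    % Expanding y(t) \approx C^{T} \Phi(t) and applying the Tau method then
    % turns the differential equation into algebraic equations for C.
    ```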

    Analysis of stopping criteria for the EM algorithm in the context of patient grouping according to length of stay

    The expectation maximisation (EM) algorithm is an iterative maximum likelihood procedure often used for estimating the parameters of a mixture model. Theoretically, increases in the likelihood function are guaranteed as the algorithm iteratively improves upon previously derived parameter estimates. The algorithm is considered to converge when all parameter estimates become stable and no further improvements can be made to the likelihood value. However, to reduce computational time, it is common practice for the algorithm to be stopped before complete convergence using heuristic approaches. In this paper, we consider various stopping criteria and evaluate their effect on fitting Gaussian mixture models (GMMs) to patient length of stay (LOS) data. Although the GMM can be successfully fitted to positively skewed data such as LOS, the fitting procedure often requires many iterations of the EM algorithm. To our knowledge, no previous study has evaluated the effect of different stopping criteria on fitting GMMs to skewed distributions. Hence, the aim of this paper is to evaluate the effect of various stopping criteria in order to select and justify their use within a patient spell classification methodology. Results illustrate that criteria based on the difference in the likelihood value and on the GMM parameters may not always be good indicators for stopping the algorithm. In fact, we show that the difference in the variance parameters should be used instead, as these parameters are the last to stabilise. In addition, we also specify threshold values for the other stopping criteria.
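    A minimal 1-D sketch of EM for a GMM with the kinds of stopping criterion discussed above. This is a hedged illustration, not the authors' implementation: the initialisation, tolerance names, and threshold values are assumptions.

    ```python
    import numpy as np

    def em_gmm(x, k=2, tol_ll=1e-6, tol_var=1e-6, max_iter=500):
        """EM for a 1-D Gaussian mixture, stopped when BOTH the change in
        log-likelihood and the change in the variance parameters fall below
        their tolerances (the variances are typically the last to stabilise)."""
        n = x.size
        w = np.full(k, 1.0 / k)                        # mixing weights
        mu = np.quantile(x, (np.arange(k) + 0.5) / k)  # spread initial means over the data
        var = np.full(k, x.var())                      # shared initial variance
        prev_ll, prev_var = -np.inf, var.copy()
        for it in range(max_iter):
            # E-step: component densities and responsibilities
            dens = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
            total = dens.sum(axis=1)
            ll = np.log(total).sum()
            r = dens / total[:, None]
            # M-step: update weights, means, variances
            nk = r.sum(axis=0)
            w = nk / n
            mu = (r * x[:, None]).sum(axis=0) / nk
            var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
            # Heuristic stop: likelihood change AND variance change both small
            if ll - prev_ll < tol_ll and np.abs(var - prev_var).max() < tol_var:
                break
            prev_ll, prev_var = ll, var.copy()
        return w, mu, var, it + 1
    ```

    Dropping the variance condition stops the loop earlier but, per the abstract's finding, risks declaring convergence while the variances are still moving.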