14,594 research outputs found

    Data standards for access to and utilization of PGR

    Patient Risk and Data Standards in Healthcare Supply Chain

    Patient safety is one of the most important health care challenges: it is a major concern, since 1 in every 10 patients around the world is affected by healthcare errors. The focus of this study is on preventable adverse events, those caused by errors or system flaws that could have been avoided. In this study, simulation models are developed using Arena to evaluate the impact of GS1 data standards on patient risk in the healthcare supply chain. The focus is on provider hospital supply chain operations, where inventory discrepancies and performance deficiencies in recall, return, and outdate management can directly affect patient safety. Simulation models are developed for various systems and scenarios to compare performance measures and analyze the impact of GS1. The results indicate that the closer the validation points are to the point of use, the more significantly the number of recalled or outdated products administered to patients is reduced, so checking at the bedside or PAR location is critical. However, validating only at these points can cause problems such as stock-outs; therefore, validation at other locations is also needed.
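
    The study itself used Arena, a commercial discrete-event simulation package. As a rough illustration of the modeled effect only, the following standalone Monte Carlo sketch, with invented probabilities rather than values from the paper, shows why a bedside (point-of-use) check catches flagged products that an upstream check alone misses:

```python
import random

# Toy model: each item has a small chance of being recalled/outdated
# ("flagged"); each validation point independently intercepts a flagged
# item with some probability. All rates below are illustrative assumptions,
# not values from the study.
def administered_flagged(n_items=100_000, p_flagged=0.01, catch_rates=None):
    """Count flagged items that slip past every check and reach a patient."""
    catch_rates = catch_rates or {}
    reached = 0
    for _ in range(n_items):
        if random.random() >= p_flagged:
            continue                      # item was never flagged; no risk
        # a flagged item reaches the patient only if every check misses it
        if all(random.random() > rate for rate in catch_rates.values()):
            reached += 1
    return reached

random.seed(42)
print(administered_flagged(catch_rates={"warehouse": 0.6}))
print(administered_flagged(catch_rates={"warehouse": 0.6, "bedside": 0.9}))
```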

    Flow cytometry data standards

    Background: Flow cytometry is a widely used analytical technique for examining microscopic particles, such as cells. The Flow Cytometry Standard (FCS) was developed in 1984 for storing flow data and it is supported by all instrument and third party software vendors. However, FCS does not capture the full scope of flow cytometry (FCM)-related data and metadata, and data standards have recently been developed to address this shortcoming. Findings: The Data Standards Task Force (DSTF) of the International Society for the Advancement of Cytometry (ISAC) has developed several data standards to complement the raw data encoded in FCS files. Efforts started with the Minimum Information about a Flow Cytometry Experiment (MIFlowCyt), a minimal data reporting standard of the details necessary to include when publishing FCM experiments to facilitate third party understanding. MIFlowCyt is now being recommended to authors by publishers as part of manuscript submission, and manuscripts are being checked by reviewers and editors for compliance. Gating-ML was then introduced to capture gating descriptions - an essential part of FCM data analysis describing the selection of cell populations of interest. The Classification Results File Format was developed to accommodate results of the gating process, mostly within the context of automated clustering. Additionally, the Archival Cytometry Standard bundles data with all the other components describing experiments. Here, we introduce these recent standards and provide the very first example of how they can be used to report FCM data, including analysis and results, in a standardized, computationally exchangeable form. Conclusions: Reporting standards and open file formats are essential for scientific collaboration and independent validation. The recently developed FCM data standards are now being incorporated into third party software tools and data repositories, which will ultimately facilitate understanding and data reuse. © 2011 Brinkman et al; licensee BioMed Central Ltd.
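
    As a concrete companion to the abstract, here is a minimal sketch of reading the raw container these standards complement: an FCS file begins with a fixed ASCII header (a version string plus 8-byte offset fields) followed by a delimited TEXT segment of keyword/value pairs. The file name is a placeholder, and this reads only the header and TEXT segment, not the event data:

```python
# Offsets follow the published FCS 3.x header layout; "sample.fcs" is a
# placeholder path, not a file from the paper.
def read_fcs_text(path):
    with open(path, "rb") as f:
        header = f.read(58)
        version = header[0:6].decode("ascii")   # e.g. "FCS3.1"
        text_begin = int(header[10:18])         # space-padded ASCII integers
        text_end = int(header[18:26])
        f.seek(text_begin)
        raw = f.read(text_end - text_begin + 1).decode("utf-8", "replace")
    delim = raw[0]                              # first byte is the delimiter
    fields = raw[1:].split(delim)
    return version, dict(zip(fields[0::2], fields[1::2]))

version, text = read_fcs_text("sample.fcs")
print(version, text.get("$PAR"), text.get("$TOT"))  # parameter, event counts
```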

    The Cardiology Audit and Registration Data Standards (CARDS), European data standards for clinical cardiology practice

    AIMS: Systematic registration of data from clinical practice is important for clinical care; for local, national, and international registries; and for audit. Data collected for these different purposes should be harmonized. Therefore, during Ireland's Presidency of the European Union (EU) (January to June 2004), the Department of Health and Children worked with the European Society of Cardiology, the Irish Cardiac Society, and the European Commission to develop data standards for clinical cardiology. The Cardiology Audit and Registration Data Standards (CARDS) Project aimed to agree standards for three modules of cardiovascular health information systems: acute coronary syndromes (ACS), percutaneous coronary interventions (PCI), and clinical electrophysiology (pacemakers, implantable cardioverter defibrillators, and ablation procedures). METHODS AND RESULTS: Data items from existing registries and surveys were reviewed to derive draft data standards (variables, coding, and definitions). Variables common to the three modules include demographics, risk factors, medication, and discharge and follow-up data. Modules about a procedure contain variables on the l…
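
    To make "variables, coding, and definitions" concrete, here is a hypothetical sketch, with invented field names and codes not taken from the CARDS documents, of what one harmonized data-standard item amounts to in practice:

```python
from dataclasses import dataclass

@dataclass
class DataStandardItem:
    variable: str      # harmonized variable name
    definition: str    # agreed clinical definition
    coding: dict       # permitted codes and their meanings

# Invented example; real CARDS items and codes differ.
smoking = DataStandardItem(
    variable="smoking_status",
    definition="Patient's tobacco use status at admission",
    coding={0: "never", 1: "former", 2: "current", 9: "unknown"},
)
print(smoking.coding[2])  # registries sharing this coding can pool their data
```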

    Recent developments in optical interferometry data standards

    A working group on interferometry data standards has been established within IAU Commission 54 (Optical/Infrared Interferometry). The working group includes members representing the major optical interferometry projects worldwide, and aims to enhance existing standards and develop new ones to satisfy the broad interests of the optical interferometry community. We present the initial work of the group to enhance the OIFITS data exchange standard, and outline the software packages and libraries now available which implement the standard.
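
    Since OIFITS is ordinary FITS with standardized binary-table extensions (OI_TARGET, OI_WAVELENGTH, OI_VIS2, and so on), any FITS reader can inspect it. A minimal sketch using astropy, assuming it is installed and using a placeholder file name:

```python
from astropy.io import fits

with fits.open("example.oifits") as hdul:
    for hdu in hdul[1:]:                             # skip the primary HDU
        print(hdu.name, hdu.header.get("OI_REVN"))   # table name and revision
    vis2 = hdul["OI_VIS2"].data                      # squared-visibility table
    print(vis2["VIS2DATA"][:3], vis2["UCOORD"][:3])  # measurements, u coords
```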

    Open Data Standards for Administrative Data Processing

    Adoption of non-traditional data sources to augment or replace traditional survey vehicles can reduce respondent burden, provide more timely information for policy makers, and yield insights into society that might otherwise be hidden or missed by traditional survey vehicles. The use of non-traditional data sources poses several technological challenges due to the volume, velocity, and quality of the data, and the lack of an applied, industry-standard data format is a limiting factor affecting the reception, processing, and analysis of these sources. Adopting a standardized, cross-language, in-memory data format that is organized for efficient analytic operations on modern hardware as the system of record for all administrative data sources has several implications: it enables the efficient use of computational resources related to I/O, processing, and storage; improves data sharing, management, and governance capabilities; and increases analyst access to tools, technologies, and methods. Statistics Canada developed a framework for selecting computing architecture models for efficient data processing, based on benchmark data pipelines representative of common administrative data processes. The data pipelines demonstrate the benefits of a standardized data format for data management and the efficient use of computational resources, and they define the preprocessing requirements, data ingestion, data conversion, and metadata modeling needed for integration into a common computing architecture. The integration of a standardized data format into a distributed data processing framework based on container technologies is discussed as a general technique for processing large volumes of administrative data.
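
    The abstract does not name the format, but Apache Arrow is the best-known standardized, cross-language, in-memory columnar format, so the sketch below uses it as a stand-in; the file and column names are placeholders:

```python
import pyarrow.compute as pc
import pyarrow.csv as pacsv
import pyarrow.parquet as pq

table = pacsv.read_csv("claims.csv")                # ingest into Arrow memory
table = table.filter(pc.is_valid(table["amount"]))  # vectorized cleanup
pq.write_table(table, "claims.parquet")             # columnar system of record
# The same Arrow buffers can be shared with R, Java, or Spark without
# re-serialization, which is the cross-language benefit described above.
```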

    All-Payer Claims Database Development Manual: Establishing a Foundation for Health Care Transparency and Informed Decision Making

    With support from the Gary and Mary West Health Policy Center, the APCD Council has developed a manual for states establishing all-payer claims databases. Titled All-Payer Claims Database Development Manual: Establishing a Foundation for Health Care Transparency and Informed Decision Making, the manual is a first-of-its-kind resource that gives states detailed guidance on the common data standards, collection, aggregation, and analysis involved in establishing these databases.