    Quality Control Measures over 30 Years in a Multicenter Clinical Study: Results from the Diabetes Control and Complications Trial / Epidemiology of Diabetes Interventions and Complications (DCCT/EDIC) Study.

    Implementation of multicenter and/or longitudinal studies requires an effective quality assurance program to identify trends, data inconsistencies, and process variability of results over time. The Diabetes Control and Complications Trial (DCCT) and the follow-up Epidemiology of Diabetes Interventions and Complications (EDIC) study represent over 30 years of data collection among a cohort of participants across 27 clinical centers. The quality assurance plan is overseen by the Data Coordinating Center and is implemented across the clinical centers and central reading units. Each central unit incorporates specific DCCT/EDIC quality monitoring activities into its routine quality assurance plan. The results are reviewed by a data quality assurance committee whose function is to identify variances in quality that may impact study results from the central units as well as within and across clinical centers, and to recommend corrective procedures when necessary. Over the 30-year period, changes to methods, equipment, and clinical procedures have been required to keep procedures current and ensure continued collection of scientifically valid and clinically relevant results. Pilot testing to compare historic processes with contemporary alternatives is performed, and comparability is validated before new procedures are incorporated into the study. Details of the quality assurance plan across and within the clinical and central reading units are described, and quality outcomes for core measures analyzed by the central reading units (e.g., biochemical samples, fundus photographs, ECGs) are presented.
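    As an illustration only: a minimal sketch of how such a method-comparison pilot might be analyzed, assuming paired measurements of the same specimens from the historic assay and its proposed replacement. The function, the data, and the HbA1c framing are hypothetical, not taken from the DCCT/EDIC protocol.

        # Hypothetical Bland-Altman-style comparability check between a historic
        # assay and a proposed replacement, using paired pilot measurements.
        import numpy as np

        def limits_of_agreement(old, new):
            """Return mean bias and 95% limits of agreement for paired methods."""
            old, new = np.asarray(old, float), np.asarray(new, float)
            diff = new - old
            bias = diff.mean()
            spread = 1.96 * diff.std(ddof=1)
            return bias, (bias - spread, bias + spread)

        # Hypothetical paired HbA1c results (%) from a pilot comparison.
        historic = [7.1, 8.4, 6.9, 9.2, 7.8, 8.0, 6.5, 7.4]
        replacement = [7.2, 8.3, 7.0, 9.1, 7.9, 8.2, 6.6, 7.5]

        bias, (lo, hi) = limits_of_agreement(historic, replacement)
        print(f"bias={bias:+.3f}, 95% limits of agreement=({lo:+.3f}, {hi:+.3f})")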

    Metadata Quality for Digital Libraries

    The quality of metadata in a digital library is an important factor in ensuring access for end users. Several studies have tried to define quality frameworks and assess metadata, but there is little user feedback about these in the literature. As collections grow in size, maintaining quality through manual methods becomes increasingly difficult for repository managers. This research presents the design and implementation of a web-based metadata analysis tool for digital repositories. The tool is built as an extension to the Greenstone3 digital library software. We present examples of the tool in use on real-world data and provide feedback from repository managers. The evidence from our studies shows that automated quality analysis tools are a useful and valued service for digital libraries.
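    A minimal sketch of the kind of automated check such a tool might run, assuming simple Dublin Core-style records held as dictionaries; the required fields and the scoring rule are illustrative assumptions, not Greenstone3's API.

        # Illustrative metadata-completeness check; fields and scoring are assumed.
        REQUIRED = ("title", "creator", "date", "identifier")

        def completeness(record: dict) -> float:
            """Fraction of required Dublin Core-style fields that are non-empty."""
            present = sum(1 for f in REQUIRED if str(record.get(f, "")).strip())
            return present / len(REQUIRED)

        records = [
            {"title": "Metadata Quality", "creator": "Smith, J.", "date": "2010"},
            {"title": "Digital Libraries", "creator": "", "date": "2011", "identifier": "oai:x:1"},
        ]
        for r in records:
            print(f"{r['title']!r}: completeness={completeness(r):.2f}")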

    Mississippi Canyon 252 Incident NRDA Tier 1 for Deepwater Communities

    The northern Gulf of Mexico (GOM) is a geologically diverse basin, described as the most complex continental slope region in the world. Regional topography of the slope consists of basins, knolls, ridges, and mounds derived from the dynamic adjustments of salt and the introduction of large volumes of sediment over long time scales. More than 99% of the sea floor in the GOM consists of soft sediment made up of various mixtures of primarily silt and clay. These widespread soft-bottom communities are described in reports from major MMS studies by Gallaway et al. (1998) and Rowe and Kennicutt (2009). Hard bottoms and their associated communities are comparatively uncommon but are notable for their high biodiversity and complexity.

    Archival Quality and Long-term Preservation: A Research Framework for Validating the Usefulness of Digital Surrogates

    Digital archives accept and preserve digital content for long-term use. Increasingly, stakeholders are creating large-scale digital repositories to ingest surrogates of archival resources or digitized books whose intellectual value as surrogates may exceed that of the original sources themselves. Although digital repository developers have expended significant effort to establish the trustworthiness of repository procedures and infrastructures, relatively little attention has been paid to the quality and usefulness of the preserved content itself. In situations where digital content has been created by third-party firms, content quality (or its absence in the form of unacceptable error) may directly influence repository trustworthiness. This article establishes a conceptual foundation for the association of archival quality and information quality research. It outlines a research project designed to develop and test measures of quality for digital content preserved in HathiTrust, a large-scale preservation repository. The research establishes methods of measuring error in digitized books at the data, page, and volume levels and applies the measures to statistically valid samples of digitized books, adjusting for inter-coder inconsistencies and the effects of sampling strategies. The research findings are then validated with users who conform to one of four use-case scenarios: reading online, printing on demand, data mining, and print collection management. The paper concludes with comments on the implications of assessing archival quality within a digital preservation context.
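    For illustration, a minimal sketch of one way a page-level error rate could be estimated from a simple random sample of digitized pages, using a normal-approximation confidence interval; the figures and the sampling design are assumptions, not the study's actual procedure.

        # Hypothetical page-level error-rate estimate from a random page sample.
        import math

        def error_rate_ci(errors: int, sampled: int, z: float = 1.96):
            """Point estimate and ~95% CI for the proportion of pages with errors."""
            p = errors / sampled
            half = z * math.sqrt(p * (1 - p) / sampled)
            return p, (max(0.0, p - half), min(1.0, p + half))

        # Hypothetical audit: 14 defective pages found in a sample of 400.
        p, (lo, hi) = error_rate_ci(14, 400)
        print(f"estimated page error rate: {p:.3%} (95% CI {lo:.3%}-{hi:.3%})")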

    Web Archive Services Framework for Tighter Integration Between the Past and Present Web

    Web archives have contained the cultural history of the web for many years, but they still offer limited capabilities for access. Most web archiving research has focused on crawling and preservation activities, with little focus on delivery methods. The current access methods are tightly coupled with web archive infrastructure, hard to replicate or integrate with other web archives, and do not cover all of users' needs. In this dissertation, we focus on access methods for archived web data that enable users, third-party developers, researchers, and others to gain knowledge from web archives. We build ArcSys, a new service framework that extracts, preserves, and exposes APIs for the web archive corpus. The dissertation introduces a novel categorization technique to divide the archived corpus into four levels. For each level, we propose suitable services and APIs that enable both users and third-party developers to build new interfaces. The first level is the content level, which extracts the content from the archived web data. We develop ArcContent to expose web archive content processed through various filters. The second level is the metadata level; we extract the metadata from the archived web data and make it available to users. We implement two services: ArcLink for the temporal web graph and ArcThumb for optimizing thumbnail creation in web archives. The third level is the URI level, which uses the URI's HTTP redirection status to enhance user queries. Finally, the highest level in the web archiving service framework pyramid is the archive level, at which we define a web archive by the characteristics of its corpus and build Web Archive Profiles. The profiles are used by the Memento Aggregator for query optimization.
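    As a sketch of the kind of programmatic access such a framework targets, the snippet below lists archived captures of a URI via a Memento-style TimeMap. The aggregator endpoint pattern is an assumption based on the public Memento service, and the link-format parsing is a naive heuristic for illustration; this is not an ArcSys API.

        # Illustrative Memento TimeMap lookup; endpoint pattern is an assumption.
        from urllib.request import urlopen

        def list_mementos(uri: str, limit: int = 5):
            """Print the first few memento entries from a link-format TimeMap."""
            timemap = f"http://timetravel.mementoweb.org/timemap/link/{uri}"
            with urlopen(timemap, timeout=30) as resp:
                body = resp.read().decode("utf-8", errors="replace")
            # Naive split of link-format entries; adequate for a demonstration.
            entries = [e for e in body.split(",\n") if 'rel="memento' in e]
            for entry in entries[:limit]:
                print(entry.strip())

        list_mementos("http://example.com/")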

    BlogForever D5.1: Design and Specification of Case Studies

    This document presents the specification and design of six case studies for testing the BlogForever platform implementation process. The report explains the data collection plan, in which users of the repository provide usability feedback through questionnaires, as well as details of the scalability analysis to be carried out through the creation of specific log-file analytics. The case studies will investigate whether the platform is sustainable, meets potential users' needs, and has an important long-term impact.
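    A minimal sketch of the sort of log-file analytics a scalability case study might compute, assuming a simple "timestamp endpoint latency-ms" record format; the format and field names are illustrative, not the BlogForever specification.

        # Illustrative per-endpoint latency aggregation over assumed log records.
        from collections import defaultdict

        def latency_by_endpoint(lines):
            """Average latency per endpoint from 'timestamp endpoint ms' records."""
            totals = defaultdict(lambda: [0.0, 0])
            for line in lines:
                _, endpoint, ms = line.split()
                totals[endpoint][0] += float(ms)
                totals[endpoint][1] += 1
            return {ep: s / n for ep, (s, n) in totals.items()}

        sample = [
            "2013-05-01T10:00:00 /search 120",
            "2013-05-01T10:00:01 /search 180",
            "2013-05-01T10:00:02 /record 45",
        ]
        print(latency_by_endpoint(sample))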

    Developing a Proactive Framework for E-Discovery Compliance

    The purpose of this document is to make Information Systems Management aware of a compliance risk associated with the management of electronic data. The 2006 changes to the Federal Rules of Civil Procedure make electronic data discoverable as evidence in civil court cases, introducing the need for proactive management of end-user data beyond what any particular piece of legislation may require. Leveraging existing forensic data collection processes and raising awareness of the problem and its risk to the organization will provide a level of assurance for compliance should the data be requested in a civil trial. This project analyzed the current state of affairs for businesses and organizations, the actual risk and the precedents that have been set, and the current state of awareness and readiness that businesses have for this problem. The project then offers a solution that will help reduce the risk and hardship an organization could face when electronic data is requested. Finally, this project presents the results of testing the proposed solution in a real-world business enterprise.

    Overview of bladder heating technology: matching capabilities with clinical requirements.

    Moderate temperature hyperthermia (40-45°C for 1 h) is emerging as an effective treatment to enhance the best available chemotherapy strategies for bladder cancer. A rapidly increasing number of clinical trials have investigated the feasibility and efficacy of treating bladder cancer with combined intravesical chemotherapy and moderate temperature hyperthermia. To date, most studies have concerned treatment of non-muscle-invasive bladder cancer (NMIBC) limited to the interior wall of the bladder. Following the promising results of initial clinical trials, investigators are now considering protocols for treatment of muscle-invasive bladder cancer (MIBC). This paper provides a brief overview of the devices and techniques used for heating bladder cancer. Systems are described for thermal-conduction heating of the bladder wall via circulation of hot fluid, intravesical microwave antenna heating, capacitively coupled radiofrequency current heating, and radiofrequency phased-array deep regional heating of the pelvis. Relative heating characteristics of the available technologies are compared based on published feasibility studies, and the systems are correlated with clinical requirements for effective treatment of MIBC and NMIBC.