
    The effect of data cleaning on record linkage quality

    Background: Within the field of record linkage, numerous data cleaning and standardisation techniques are employed to ensure the highest quality of links. While these facilities are common in record linkage software packages and are regularly deployed across record linkage units, little work has been published demonstrating the impact of data cleaning on linkage quality. Methods: A range of cleaning techniques was applied to both a synthetically generated dataset and a large administrative dataset previously linked to a high standard. The effect of these changes on linkage quality was investigated using the pairwise F-measure to determine quality. Results: Data cleaning made little difference to overall linkage quality, with heavy cleaning leading to a decrease in quality. Further examination showed that these decreases occurred because cleaning techniques typically reduce variability in the data: although correct records became more likely to match, incorrect records also became more likely to match, and the additional incorrect matches outweighed the additional correct matches, reducing quality overall. Conclusions: Data cleaning techniques have minimal effect on linkage quality. Care should be taken during the data cleaning process
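
    The pairwise F-measure referred to above combines pairwise precision and recall against a known truth set. The following is a minimal sketch of that calculation; the function name and example pairs are illustrative, not the study's code.

```python
# Minimal sketch: pairwise F-measure for record linkage against a known truth set.
# The example pairs below are illustrative; they are not the study's data or code.

def pairwise_f_measure(predicted_links, true_links):
    """Compute precision, recall and F-measure over record pairs."""
    predicted = set(predicted_links)
    truth = set(true_links)
    true_positives = len(predicted & truth)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(truth) if truth else 0.0
    if precision + recall == 0:
        return precision, recall, 0.0
    f_measure = 2 * precision * recall / (precision + recall)
    return precision, recall, f_measure

# Example: each pair is a (record_id_a, record_id_b) tuple.
predicted = [(1, 101), (2, 102), (3, 104)]
truth = [(1, 101), (2, 102), (3, 103)]
print(pairwise_f_measure(predicted, truth))  # roughly (0.667, 0.667, 0.667)
```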

    The Association of Community-Based Palliative Care With Reduced Emergency Department Visits in the Last Year of Life Varies by Patient Factors

    Study objective: Palliative care has been shown to reduce end-of-life emergency department (ED) use. Our objective was to determine how the association of community-based palliative care with reduced ED visits in the last year of life varied by patient factors. Methods: This was a retrospective cohort study of 11,875 decedents who died with neoplasms, heart failure, renal failure, chronic obstructive pulmonary disease, or liver failure in Western Australia in 2009 to 2010. Outcome measures were adjusted hazard ratios (HRs) and daily (hazard) rates of ED visits. Results: The adjusted average rate of ED visits for the cohort was reduced 50% (HR 0.50; 95% confidence interval [CI] 0.47 to 0.53) during periods of receipt of community-based palliative care. This relative reduction in ED visits varied by patient factors, ranging from 43% (HR 0.57; 95% CI 0.45 to 0.74) for decedents aged 60 years and younger up to 71% (HR 0.29; 95% CI 0.18 to 0.46) for people aged 90 years and older. Decedents living in the most disadvantaged areas had a 44% (HR 0.56; 95% CI 0.44 to 0.72) reduction in ED visits when receiving community-based palliative care compared with a 60% (HR 0.40; 95% CI 0.31 to 0.53) reduction for decedents who lived in the least disadvantaged areas and received this care. The ED visit rates while patients were receiving palliative care also varied by ED visit history, partner status, and region of residence. Conclusion: Receipt of community-based palliative care in the last year of life was associated with a reduced rate of ED visits. The magnitude of this association was modified by patient health, as well as social and demographic factors. © 2016 American College of Emergency Physicians
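
    The percentage reductions quoted above follow directly from the adjusted hazard ratios: an HR of 0.50 corresponds to a 50% lower daily rate of ED visits while receiving community-based palliative care. A small sketch of that conversion, using the figures reported in the abstract:

```python
# Convert reported hazard ratios into "percent reduction in ED visit rate".
# HRs and confidence intervals are those quoted in the abstract; reduction = 1 - HR.

def hr_to_reduction(hr, ci_low, ci_high):
    """Return the percent reduction and its confidence interval implied by a hazard ratio."""
    return (1 - hr) * 100, (1 - ci_high) * 100, (1 - ci_low) * 100

for label, hr, ci_low, ci_high in [
    ("whole cohort",     0.50, 0.47, 0.53),
    ("aged <= 60 years", 0.57, 0.45, 0.74),
    ("aged >= 90 years", 0.29, 0.18, 0.46),
]:
    reduction, lower, upper = hr_to_reduction(hr, ci_low, ci_high)
    print(f"{label}: {reduction:.0f}% reduction (95% CI {lower:.0f}% to {upper:.0f}%)")
```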

    Use of graph theory measures to identify errors in record linkage

    Ensuring high linkage quality is important in many record linkage applications. Current methods for ensuring quality are manual and resource intensive. This paper seeks to determine the effectiveness of graph theory techniques in identifying record linkage errors. A range of graph theory techniques was applied to two linked datasets, with known truth sets. The ability of graph theory techniques to identify groups containing errors was compared to a widely used threshold setting technique. This methodology shows promise; however, further investigations into graph theory techniques are required. The development of more efficient and effective methods of improving linkage quality will result in higher quality datasets that can be delivered to researchers in shorter timeframes
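
    One way to apply graph measures to linked data is to treat records as nodes and accepted links as edges, then compute properties of each linked group (connected component) that may indicate errors, such as low density or the presence of bridge links. The sketch below illustrates this idea with networkx; the specific measures and the flagging rule are illustrative assumptions, not necessarily the techniques evaluated in the paper.

```python
# Sketch: group-level graph measures over a set of accepted links.
# Nodes are records, edges are links; measures and thresholds are illustrative.
import networkx as nx

links = [("a1", "b1"), ("b1", "c1"), ("a1", "c1"),   # densely connected group
         ("a2", "b2"), ("b2", "c2")]                  # chain-like group

graph = nx.Graph(links)

for component in nx.connected_components(graph):
    sub = graph.subgraph(component)
    density = nx.density(sub)          # 1.0 when every record links to every other
    bridges = list(nx.bridges(sub))    # links whose removal would split the group
    suspicious = density < 1.0 or bool(bridges)
    print(sorted(component), f"density={density:.2f}", f"bridges={len(bridges)}",
          "review" if suspicious else "ok")
```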

    Cross-border hospital use: analysis using data linkage across four Australian states

    Objective: To determine the quality and effectiveness of national data linkage capacity by performing a proof-of-concept project investigating cross-border hospital use and hospital-related deaths. Design, participants and setting: Analysis of person-level linked hospital separation and death registration data of all public and private hospital patients in New South Wales, Queensland and Western Australia and of public hospital patients in South Australia, totalling 7.7 million hospital patients from 1 July 2004 to 30 June 2009. Main outcome measures: Counts and proportions of hospital stays and patient movement patterns. Results: 223 262 patients (3.0%) travelled across a state border to attend hospitals, in particular, far northern and western NSW patients travelling to Queensland and SA hospitals, respectively. A further 48 575 patients (0.6%) moved their place of residence interstate between hospital visits, particularly to and from areas associated with major mining and tourism industries. Over 11 000 cross-border hospital transfers were also identified. Of patients who travelled across a state border to hospital, 2800 (1.3%) died in that hospital. An additional 496 deaths recorded in one jurisdiction occurred within 30 days of hospital separation from another jurisdiction. Conclusions: Access to person-level data linked across jurisdictions identified geographical hot spots of cross-border hospital use and hospital-related deaths in Australia. This has implications for planning of health service delivery and for longitudinal follow-up studies, particularly those involving mobile populations
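
    At its simplest, identifying cross-border hospital use from linked data amounts to comparing each patient's recorded state of residence with the state of the treating hospital, and checking whether the state of residence changes between visits. A sketch of that logic follows; the column names and data are hypothetical, not the study's variables.

```python
# Illustrative sketch: flagging cross-border hospital use in linked separations.
# Column names (person_id, residence_state, hospital_state) are hypothetical.
import pandas as pd

separations = pd.DataFrame({
    "person_id":       [1, 1, 2, 3, 3],
    "residence_state": ["NSW", "NSW", "WA", "SA", "QLD"],
    "hospital_state":  ["NSW", "QLD", "WA", "SA", "QLD"],
})

# A stay is cross-border when the hospital is outside the person's state of residence.
separations["cross_border"] = (
    separations["residence_state"] != separations["hospital_state"]
)

# A person moved interstate if their recorded state of residence changes between visits.
moved = separations.groupby("person_id")["residence_state"].nunique().gt(1)

print(separations["cross_border"].mean())   # proportion of cross-border stays
print(moved.sum())                          # number of people who moved interstate
```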

    A simple sampling method for estimating the accuracy of large scale record linkage projects

    Background: Record linkage techniques allow different data collections to be brought together to provide a wider picture of the health status of individuals. Ensuring high linkage quality is important to guarantee the quality and integrity of research. Current methods for measuring linkage quality typically focus on precision (the proportion of accepted links that are correct), given the difficulty of measuring the proportion of false negatives. Objectives: The aim of this work is to introduce and evaluate a sampling-based method to estimate both precision and recall following record linkage. Methods: In the sampling-based method, record-pairs from each threshold (including those below the identified cut-off for acceptance) are sampled and clerically reviewed. These results are then applied to the entire set of record-pairs, providing estimates of false positives and false negatives. This method was evaluated on a synthetically generated dataset, where the true match status (which records belonged to the same person) was known. Results: The sampled estimates of linkage quality were relatively close to the actual linkage quality metrics calculated for the whole synthetic dataset. The precision and recall measures for seven reviewers were very consistent, with little variation in the clerical assessment results (overall agreement using the Fleiss kappa statistic was 0.601). Conclusions: This method offers a possible means of accurately estimating matching quality and refining linkages in population-level linkage studies. The sampling approach is especially important for large linkage projects, where the number of record pairs produced may be very large, often running into millions
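
    The core of the sampling approach is to group record-pairs by comparison score, clerically review a sample from each group (including groups below the acceptance cut-off), and scale the reviewed match rates up to the full set of pairs to estimate false positives and false negatives. The sketch below captures that idea under simplified assumptions; the band structure, sample size and review function are placeholders rather than the paper's protocol.

```python
# Sketch: estimate precision and recall by sampling record-pairs from score bands
# and extrapolating clerical review results. Bands, sample sizes and the cut-off
# are illustrative; a synthetic truth set stands in for clerical review.
import random

def estimate_quality(pairs_by_band, cutoff, sample_size, review):
    """pairs_by_band: {band_score: [pair, ...]}; review(pair) -> True if a real match."""
    est_tp = est_fp = est_fn = 0.0
    for band, pairs in pairs_by_band.items():
        sample = random.sample(pairs, min(sample_size, len(pairs)))
        match_rate = sum(review(p) for p in sample) / len(sample)
        if band >= cutoff:                     # accepted links
            est_tp += match_rate * len(pairs)
            est_fp += (1 - match_rate) * len(pairs)
        else:                                  # rejected pairs: missed matches
            est_fn += match_rate * len(pairs)
    precision = est_tp / (est_tp + est_fp)
    recall = est_tp / (est_tp + est_fn)
    return precision, recall

truth = {("r1", "r2"), ("r3", "r4")}
bands = {0.9: [("r1", "r2"), ("r5", "r6")], 0.4: [("r3", "r4"), ("r7", "r8")]}
print(estimate_quality(bands, cutoff=0.5, sample_size=2, review=lambda p: p in truth))
```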

    An evaluation framework for comparing geocoding systems

    BACKGROUND: Geocoding, the process of converting textual information describing a location into one or more digital geographic representations, is a routine task performed at large organizations and government agencies across the globe. In a health context, this task is often a fundamental first step performed prior to all operations that take place in a spatially-based health study. As such, the quality of the geocoding system used within these agencies is of paramount concern to the agency (the producer) and researchers or policy-makers who wish to use these data (consumers). However, geocoding systems are continually evolving, with new products regularly coming on the market. Agencies must develop and apply criteria across a number of axes when faced with decisions about building, buying, or maintaining any particular geocoding system. To date, published criteria have focused on one or more aspects of geocode quality without taking a holistic view of a geocoding system’s role within a large organization. The primary purpose of this study is to develop and test an evaluation framework to assist a large organization in determining which geocoding systems will meet its operational needs. METHODS: A geocoding platform evaluation framework is derived through an examination of prior literature on geocoding accuracy. The framework extends commonly used geocoding metrics to take into account the specific concerns of large organizations for which geocoding is a fundamental operational capability tightly integrated into their core mission of processing health data records. A case study is performed to evaluate the strengths and weaknesses of five geocoding platforms currently available in the Australian geospatial marketplace. RESULTS: The evaluation framework developed in this research proved successful in differentiating between key capabilities of geocoding systems that are important in the context of a large organization with significant investments in geocoding resources. Results from the proposed methodology highlight important differences across all axes of geocoding system comparison, including spatial data output accuracy, reference data coverage, system flexibility, the potential for tight integration, and the need for specialized staff and/or development time and funding. Such results can empower decision-makers within large organizations as they make decisions about, and investments in, geocoding systems
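
    An evaluation framework of this kind can be operationalised as a weighted scoring matrix over the comparison axes listed above. The sketch below is a generic illustration of that structure; the weights and scores are invented placeholders, and the paper's framework defines its own criteria and scoring.

```python
# Sketch: weighted scoring matrix for comparing geocoding platforms across
# evaluation axes. Axes follow those named in the abstract; weights and scores
# are invented placeholders, not results from the case study.
weights = {
    "spatial_accuracy": 0.30,
    "reference_data_coverage": 0.25,
    "flexibility": 0.15,
    "integration": 0.15,
    "staffing_and_cost": 0.15,
}

platforms = {
    "Platform A": {"spatial_accuracy": 4, "reference_data_coverage": 5,
                   "flexibility": 3, "integration": 2, "staffing_and_cost": 3},
    "Platform B": {"spatial_accuracy": 3, "reference_data_coverage": 4,
                   "flexibility": 4, "integration": 5, "staffing_and_cost": 4},
}

for name, scores in platforms.items():
    total = sum(weights[axis] * scores[axis] for axis in weights)
    print(f"{name}: weighted score {total:.2f} out of 5")
```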

    Quasi-extinction risk and population targets for the Eastern, migratory population of monarch butterflies (Danaus plexippus)

    The Eastern, migratory population of monarch butterflies (Danaus plexippus), an iconic North American insect, has declined by ~80% over the last decade. The monarch’s multi-generational migration between overwintering grounds in central Mexico and the summer breeding grounds in the northern U.S. and southern Canada is celebrated in all three countries and creates shared management responsibilities across North America. Here we present a novel Bayesian multivariate auto-regressive state-space model to assess quasi-extinction risk and aid in the establishment of a target population size for monarch conservation planning. We find that, given a range of plausible quasi-extinction thresholds, the population has a substantial probability of quasi-extinction, from 11–57% over 20 years, although uncertainty in these estimates is large. Exceptionally high population stochasticity, declining numbers, and a small current population size act in concert to drive this risk. An approximately 5-fold increase of the monarch population size (relative to the winter of 2014–15) is necessary to halve the current risk of quasi-extinction across all thresholds considered. Conserving the monarch migration thus requires active management to reverse population declines, and the establishment of an ambitious target population size goal to buffer against future environmentally driven variability
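
    Quasi-extinction risk of this kind is typically estimated by simulating many population trajectories and counting those that fall below a threshold within the time horizon. The sketch below uses a simple random walk with drift on the log scale as a stripped-down stand-in for the paper's Bayesian multivariate state-space model; all parameter values are invented for illustration.

```python
# Simplified illustration of quasi-extinction risk: simulate log-abundance as a
# random walk with drift and count trajectories that drop below a threshold
# within 20 years. Parameter values are invented, not the paper's estimates.
import numpy as np

rng = np.random.default_rng(0)
n_sims, years = 10_000, 20
log_n0 = np.log(100.0)      # current population index
drift = -0.02               # mean annual change on the log scale (declining)
sigma = 0.5                 # high year-to-year (process) variability
threshold = np.log(10.0)    # quasi-extinction threshold

steps = rng.normal(drift, sigma, size=(n_sims, years))
trajectories = log_n0 + np.cumsum(steps, axis=1)
quasi_extinct = (trajectories < threshold).any(axis=1)
print(f"P(quasi-extinction within {years} years) = {quasi_extinct.mean():.2f}")
```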

    A Programmatic Approach to Patient Blood Management – reducing transfusions and improving patient outcomes

    In July 2008, the Western Australia (WA) Department of Health embarked on a landmark 5-year project to implement a sustainable, comprehensive, health-system-wide Patient Blood Management Program. Fundamentally, it was a quality and safety initiative, which also had profound resource and economic implications. Unsustainable, escalating direct and indirect costs of blood, potentially severe blood shortages due to changing population dynamics, donor deferrals, loss of altruism, wide variations in transfusion practice, and growing knowledge of transfusion limitations and adverse outcomes necessitate a paradigm shift in the management of anemia and blood loss. The concept of patient-focused blood management is proving to be an effective force for change. This approach has now evolved to embrace comprehensive, hospital-wide Patient Blood Management Programs. These programs have shown significant reductions in blood utilisation and costs while achieving similar or improved patient outcomes. The WA Program is achieving these outcomes across a health jurisdiction in a sustained manner

    Prevalence of blindness in Western Australia: a population study using capture and recapture techniques

    Aim: To determine the prevalence of blinding eye disease in Western Australia using a capture and recapture methodology. Methods: Three independent lists of residents of Western Australia who were also legally blind were collated during the capture periods in 2008–9. The first list was obtained from the state-wide blind register. A second list comprised patients routinely attending hospital outpatient eye clinics over a 6-month period in 2008. The third list was patients attending ophthalmologists' routine clinical appointments over a 6-week period in 2009. Lists were compared to identify those individuals who were captured on each list and those who were recaptured by subsequent lists. Log-linear models were used to calculate the best fit and estimate the prevalence of blindness in the Western Australian population, which was then extrapolated to a national prevalence of blindness for Australia. Results: 1771 legally blind people were identified across the three lists. The best estimate of the prevalence of blindness in Western Australia was 3384 people (95% CI 2947 to 3983), or 0.15% of the population of 2.25 million. Extrapolating to the national population (21.87 million) gave a prevalence of legal blindness of approximately 32 892 people, or 0.15%. Conclusion: Capture–recapture techniques can be used to determine the prevalence of blindness in whole populations. The calculated prevalence of blindness suggested that up to 30% of legally blind people may not be receiving available financial support and up to 60% were not accessing rehabilitation services
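
    The capture-recapture principle behind the study can be illustrated with the simpler two-list (Lincoln-Petersen) estimator, shown below with Chapman's bias correction; the paper itself fits log-linear models to three lists, and the counts here are invented for illustration.

```python
# Two-list (Lincoln-Petersen) capture-recapture estimate with Chapman's correction.
# The counts are invented for illustration; they are not the study's data.

def lincoln_petersen(n_list1, n_list2, n_both):
    """Chapman's bias-corrected estimate of the total population size."""
    return (n_list1 + 1) * (n_list2 + 1) / (n_both + 1) - 1

# e.g. 900 people on a blind register, 700 seen in eye clinics, 300 appearing on both lists
estimate = lincoln_petersen(900, 700, 300)
print(f"estimated number of legally blind residents: {estimate:.0f}")
```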