
    Perceived usefulness of a distributed community-based syndromic surveillance system: a pilot qualitative evaluation study

    Background: We conducted a pilot utility evaluation and information needs assessment of the Distribute Project at the 2010 Washington State Public Health Association (WSPHA) Joint Conference. Distribute is a distributed, community-based syndromic surveillance system and network for the detection of influenza-like illness (ILI). Using qualitative methods, we assessed the perceived usefulness of the Distribute system and explored areas for improvement. Nine state and local public health professionals participated in a focus group (n = 6) and in semi-structured interviews (n = 3). Field notes were taken, summarized and analyzed. Findings: Several emergent themes contributing to the perceived usefulness of system data and of the Distribute system were identified: 1) Standardization: a common ILI syndrome definition; 2) Regional comparability: views that support county-by-county comparisons of syndromic surveillance data; 3) Completeness: all expected data present at a given time; 4) Coverage: data coverage of all jurisdictions in Washington State; 5) Context: metadata incorporated into the views to provide context for graphed data; 6) Trusted data: verification that information is valid and timely; and 7) Customization: the ability to customize views as necessary. As a result of the focus group, a new county-level health jurisdiction expressed interest in contributing data to the Distribute system. Conclusion: The resulting themes from this study can be used to guide future information design efforts for the Distribute system and other syndromic surveillance systems. In addition, this study demonstrates the benefits of conducting a low-cost qualitative evaluation at a professional conference.

    Radiomic signatures of posterior fossa ependymoma: Molecular subgroups and risk profiles

    BACKGROUND: The risk profile for posterior fossa ependymoma (EP) depends on surgical and molecular status [Group A (PFA) versus Group B (PFB)]. While subtotal tumor resection is known to confer a worse prognosis, MRI-based EP risk-profiling is unexplored. We aimed to apply machine learning strategies to identify MRI-based biomarkers of high-risk EP and to distinguish PFA from PFB. METHODS: We extracted 1800 quantitative features from presurgical T2-weighted (T2-MRI) and gadolinium-enhanced T1-weighted (T1-MRI) imaging of 157 EP patients. We implemented nested cross-validation to identify features for risk score calculation and applied a Cox model for survival analysis. We conducted additional feature selection for PFA versus PFB and examined performance across three candidate classifiers. RESULTS: For all EP patients with gross total resection (GTR), we identified four T2-MRI-based features and stratified patients into high- and low-risk groups, with 5-year overall survival rates of 62% and 100%, respectively (p < 0.0001). Among presumed PFA patients with GTR, four T1-MRI and five T2-MRI features predicted divergence of high- and low-risk groups, with 5-year overall survival rates of 62.7% and 96.7%, respectively (p = 0.002). T1-MRI-based features showed the best performance in distinguishing PFA from PFB, with an AUC of 0.86. CONCLUSIONS: We present machine learning strategies to identify MRI phenotypes that distinguish PFA from PFB, as well as high- and low-risk PFA. We also describe quantitative image predictors of aggressive EP tumors that might assist risk-profiling after surgery. Future studies could examine translating radiomics into an adjunct to EP risk assessment when considering therapy strategies or trial candidacy.
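
    The pipeline described here, feature selection nested inside cross-validation and feeding a Cox proportional-hazards model, can be sketched in a few lines. The following is an illustrative reconstruction only, on synthetic data with the lifelines and scikit-learn libraries; the feature count, follow-up times, and the univariate p-value selection rule are placeholders, not the paper's actual procedure.

```python
# Illustrative sketch (synthetic data, not the study's code): nested
# cross-validation wraps a simple univariate feature-selection step
# around a Cox proportional-hazards model, as in the risk-scoring
# pipeline described above. Requires numpy, pandas, scikit-learn, lifelines.
import numpy as np
import pandas as pd
from sklearn.model_selection import KFold
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(0)
n, p = 157, 20  # 157 patients; 20 synthetic stand-ins for the 1800 features
df = pd.DataFrame(rng.normal(size=(n, p)), columns=[f"f{i}" for i in range(p)])
df["time"] = rng.exponential(60.0, n)       # follow-up in months (synthetic)
df["event"] = rng.integers(0, 2, n)         # 1 = event observed

outer = KFold(n_splits=5, shuffle=True, random_state=0)
c_indices = []
for train_idx, test_idx in outer.split(df):
    train, test = df.iloc[train_idx], df.iloc[test_idx]
    # Inner selection: keep features univariately associated with survival
    # (a placeholder for the paper's actual feature-selection procedure).
    kept = []
    for col in (c for c in train.columns if c.startswith("f")):
        fit = CoxPHFitter().fit(train[[col, "time", "event"]], "time", "event")
        if fit.summary.loc[col, "p"] < 0.10:
            kept.append(col)
    if not kept:
        continue
    model = CoxPHFitter().fit(train[kept + ["time", "event"]], "time", "event")
    risk = model.predict_partial_hazard(test[kept])  # higher = higher risk
    c_indices.append(concordance_index(test["time"], -risk, test["event"]))

print("mean outer-fold concordance index:", np.mean(c_indices))
```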

    Prevalence of Disorders Recorded in Dogs Attending Primary-Care Veterinary Practices in England

    Purebred dog health is thought to be compromised by an increasing occurrence of inherited diseases, but inadequate prevalence data on common disorders have hampered efforts to prioritise health reforms. Analysis of primary veterinary practice clinical data has been proposed for reliable estimation of disorder prevalence in dogs. Electronic patient record (EPR) data were collected on 148,741 dogs attending 93 clinics across central and south-eastern England. Detailed analysis of a random sample of EPRs relating to 3,884 dogs from 89 clinics identified the most frequently recorded disorders as otitis externa (prevalence 10.2%, 95% CI: 9.1-11.3), periodontal disease (9.3%, 95% CI: 8.3-10.3) and anal sac impaction (7.1%, 95% CI: 6.1-8.1). Using syndromic classification, the most prevalent body location affected was the head-and-neck (32.8%, 95% CI: 30.7-34.9), the most prevalent organ system affected was the integument (36.3%, 95% CI: 33.9-38.6) and the most prevalent pathophysiologic process diagnosed was inflammation (32.1%, 95% CI: 29.8-34.3). Among the twenty most frequently recorded disorders, purebred dogs had a significantly higher prevalence than crossbreds for three: otitis externa (P = 0.001), obesity (P = 0.006) and skin mass lesion (P = 0.033), and popular breeds differed significantly from each other in prevalence for five: periodontal disease (P = 0.002), overgrown nails (P = 0.004), degenerative joint disease (P = 0.005), obesity (P = 0.001) and lipoma (P = 0.003). These results fill a crucial data gap in disorder prevalence information and assist with disorder prioritisation. They suggest that, for maximal impact, breeding reforms should target commonly diagnosed complex disorders that are amenable to genetic improvement and should place special focus on at-risk breeds. Future studies evaluating disorder severity and duration will augment the usefulness of the disorder prevalence information reported herein.
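
    The prevalence figures above are simple proportions with confidence intervals, and the arithmetic is easy to reproduce. The sketch below uses a normal-approximation (Wald) interval and a case count of 396 back-calculated from the reported 10.2% of 3,884 dogs; both are illustrative assumptions, and the slightly wider interval the paper reports suggests its method differed (e.g. by adjusting for clinic-level clustering).

```python
# Worked example of the prevalence-with-95%-CI arithmetic reported above.
# 396 cases is a back-calculated (assumed) count: 10.2% of 3,884 dogs.
import math

def prevalence_ci(cases: int, n: int, z: float = 1.96):
    """Point prevalence with a normal-approximation (Wald) 95% CI."""
    p = cases / n
    se = math.sqrt(p * (1.0 - p) / n)
    return p, p - z * se, p + z * se

p, lo, hi = prevalence_ci(396, 3884)
print(f"otitis externa prevalence {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
# -> roughly 10.2% (9.2%-11.1%); the paper reports 9.1-11.3, so its
#    interval method evidently differed (e.g. clustering adjustment).
```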

    Using Ontario's "Telehealth" health telephone helpline as an early-warning system: a study protocol

    BACKGROUND: The science of syndromic surveillance is still very much in its infancy. While a number of syndromic surveillance systems are being evaluated in the US, very few have had success thus far in predicting an infectious disease event. Furthermore, to date, the majority of syndromic surveillance systems have been based primarily in emergency department settings, with varying levels of enhancement from other data sources. While research has been done on the value of telephone helplines for health care use and patient satisfaction, very few projects have looked at using a telephone helpline as a source of data for syndromic surveillance, and none have been attempted in Canada. The notable exception is the UK, where research using the national NHS Direct system as a syndromic surveillance tool has been conducted. METHODS/DESIGN: The purpose of our proposed study is to evaluate the effectiveness of Ontario's telephone nursing helpline system as a real-time syndromic surveillance system, and to assess how its implementation, if successful, would affect outbreak event detection in Ontario. Using data collected retrospectively, all "reasons for call" and assigned algorithms will be linked to a syndrome category. Using different analytic methods, normal thresholds for the different syndromes will be ascertained. This will allow for the evaluation of the system's sensitivity, specificity and positive predictive value. The next step will include the prospective monitoring of syndromic activity, both temporally and spatially. DISCUSSION: As this is a study protocol, there are currently no results to report. However, this study has been granted ethical approval and is now being implemented. It is our hope that this syndromic surveillance system will display high sensitivity and specificity in detecting true outbreaks within Ontario, before they are detected by conventional surveillance systems. Future results will be published in peer-reviewed journals so as to contribute to the growing body of evidence on syndromic surveillance, while also providing a non-US-centric perspective.
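
    The protocol's evaluation metrics are the standard ones for alarm systems. Purely as an illustrative sketch (synthetic call volumes and an invented threshold rule, not the study's analytic methods), the following shows how sensitivity, specificity and positive predictive value would be computed once daily syndrome counts and known outbreak days are available.

```python
# Illustrative sketch (not the study's analysis): score a syndromic alarm
# threshold against known outbreak days, yielding sensitivity, specificity
# and PPV as the protocol proposes to estimate.
import numpy as np

rng = np.random.default_rng(1)
days = 365
baseline = rng.poisson(40, days)            # daily "fever" calls (synthetic)
outbreak = np.zeros(days, dtype=bool)
outbreak[200:214] = True                    # one simulated two-week outbreak
calls = baseline + np.where(outbreak, rng.poisson(25, days), 0)

# Alarm rule: mean + 2 SD of a trailing 28-day window (one simple choice
# of "normal threshold"; the protocol compares several analytic methods).
alarms = np.zeros(days, dtype=bool)
for t in range(28, days):
    window = calls[t - 28:t]
    alarms[t] = calls[t] > window.mean() + 2 * window.std()

tp = np.sum(alarms & outbreak)
fp = np.sum(alarms & ~outbreak)
fn = np.sum(~alarms & outbreak)
tn = np.sum(~alarms & ~outbreak)
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("PPV:", tp / (tp + fp) if tp + fp else float("nan"))
```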

    Lessons Learned From the Design and Implementation of Myocardial Infarction Adjudication Tailored for HIV Clinical Cohorts

    We developed, implemented, and evaluated a myocardial infarction (MI) adjudication protocol for cohort research on human immunodeficiency virus (HIV). Potential events were identified through the centralized Centers for AIDS Research Network of Integrated Clinical Systems data repository using MI diagnoses and/or cardiac enzyme laboratory results (1995–2012). Sites assembled de-identified packets, including physician notes and results from electrocardiograms, procedures, and laboratory tests. Information pertaining to the specific antiretroviral medications used was redacted for blinded review. Two experts reviewed each packet, and a third review was conducted if discrepancies occurred. Reviewers categorized probable/definite MIs as primary or secondary and identified secondary causes of MIs. The positive predictive value and sensitivity for each identification/ascertainment method were calculated. Of the 1,119 potential events that were adjudicated, 294 (26%) were definite/probable MIs. Almost as many secondary (48%) as primary (52%) MIs occurred, often as the result of sepsis or cocaine use. Of the patients with adjudicated definite/probable MIs, 78% had elevated troponin concentrations (positive predictive value = 57%, 95% confidence interval: 52, 62); however, only 44% had clinical diagnoses of MI (positive predictive value = 45%, 95% confidence interval: 39, 51). We found that central adjudication is crucial and that clinical diagnoses alone are insufficient for ascertainment of MI: over half of the events ultimately determined to be MIs were not identified by clinical diagnoses. Adjudication protocols used in traditional cardiovascular disease cohorts facilitate cross-cohort comparisons but do not address issues such as identifying secondary MIs, which may be common in persons with HIV.
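
    The positive-predictive-value arithmetic is straightforward to reconstruct from the percentages quoted. The counts below are back-calculations, not the paper's raw data: roughly 229 of the 294 adjudicated MIs (78%) had elevated troponin, and a flagged-event total near 403 reproduces the reported 57% PPV.

```python
# Back-of-the-envelope reconstruction of the PPV/sensitivity figures above.
# Counts are assumed back-calculations, not the paper's raw data.
def ppv(true_positives: int, flagged: int) -> float:
    """Adjudicated MIs among all events flagged by the method."""
    return true_positives / flagged

def sensitivity(true_positives: int, all_true: int) -> float:
    """Adjudicated MIs the method caught, out of all adjudicated MIs."""
    return true_positives / all_true

troponin_tp = round(0.78 * 294)  # 78% of 294 adjudicated MIs -> ~229
print(f"troponin sensitivity ~ {sensitivity(troponin_tp, 294):.0%}")  # ~78%
print(f"troponin PPV ~ {ppv(troponin_tp, 403):.0%}")                  # ~57%
```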

    Patients' perceived needs of osteoarthritis health information: A systematic scoping review

    Background: Optimal management of osteoarthritis (OA) requires active patient participation. Understanding patients' perceived health information needs is important in order to optimize health service delivery and health outcomes in OA. We aimed to review the existing literature regarding patients' perceived health information needs for OA. Methods: A systematic scoping review was performed of publications in MEDLINE, EMBASE, CINAHL and PsycINFO (1990–2016). Descriptive data regarding study design and methodology were extracted and risk of bias assessed. Aggregates of patients' perceived needs for OA health information were categorized. Results: Of 2,876 publications identified, 30 studies were included: 16 qualitative, 11 quantitative and 3 mixed-methods studies. Three areas of perceived need emerged: (1) Need for clear communication: terms used were misunderstood or had unintended connotations, and patients wanted clear explanations. (2) Need for information from various sources: patients wanted accessible health professionals with specialist knowledge of arthritis. The Internet, whilst a source of information, was acknowledged to have dubious reliability. Print media, television, support groups, family and friends were utilised to fulfil diverse information needs. (3) Need for specific information content: patients desired more information about diagnosis, prognosis, management and prevention. Conclusions: Patients desire more information regarding the diagnosis of OA, its impact on daily life and its long-term prognosis. They want more information not only about pharmacological management options but also about non-pharmacological options to help them manage their symptoms, and they want this information delivered clearly from multiple sources. To address these gaps, more effective communication strategies are required. The use of a variety of sources and modes of delivery may enable the provision of complementary material to convey information more successfully, resulting in better patient adherence to guidelines and improved health outcomes.

    Reviewing the integration of patient data: how systems are evolving in practice to meet patient needs

    Background: The integration of Information Systems (IS) is essential to support shared care and to provide consistent, patient-centred care to individuals. This paper identifies, appraises and summarises studies examining different approaches to integrating patient data from heterogeneous IS. Methods: The literature from 1995–2005 was systematically reviewed to identify articles mentioning patient records, computers and data integration or sharing. Results: Of 3124 articles, 84 were included, describing 56 distinct projects. Most of the projects were on a regional scale. Integration was most commonly accomplished by messaging with pre-defined templates and middleware solutions, with HL7 the most widely used messaging standard. Direct database access and web services were the most common communication methods, and the user interface for most systems was a Web browser. Regarding the type of medical data shared, 77% of projects integrated diagnoses and problems, 67% medical images and 65% lab results. More recently, significantly more IS have extended to primary care and integrated referral letters. Conclusion: Information Systems are evolving to meet people's needs by implementing regional networks, allowing patient access and integrating ever more items of patient data. Many distinct technological solutions coexist to integrate patient data, using differing standards and data architectures, which may hinder further interoperability.
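
    To make the "messaging with pre-defined templates" finding concrete, here is a minimal sketch of an HL7 v2-style lab-result message and the kind of trivial segment handling an integration middleware performs. The segment layout follows standard HL7 v2 conventions; all identifiers and values are invented, and real middleware would use a proper HL7 parser rather than string splits.

```python
# Minimal illustration of template-based HL7 v2 messaging (all values
# invented; 718-7 is the LOINC code for hemoglobin).
msg = (
    "MSH|^~\\&|LAB_SYS|LAB|EPR_HUB|REGION|202401011200||ORU^R01|0001|P|2.5\r"
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19700101|F\r"
    "OBX|1|NM|718-7^Hemoglobin^LN||13.2|g/dL|12.0-16.0|N"
)

for segment in msg.split("\r"):          # HL7 v2 separates segments with <CR>
    fields = segment.split("|")
    print(fields[0], "->", fields[1:6])  # segment type and first few fields
```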

    Secure and scalable deduplication of horizontally partitioned health data for privacy-preserving distributed statistical computation

    Background: Techniques have been developed to compute statistics on distributed datasets without revealing private information beyond the statistical results. However, duplicate records in a distributed dataset may lead to incorrect statistical results, so secure deduplication is an important preprocessing step for accurate statistical analysis. Methods: We designed a secure protocol for the deduplication of horizontally partitioned datasets using deterministic record linkage algorithms, and we provided a formal security analysis of the protocol in the presence of semi-honest adversaries. The protocol was implemented and deployed across three microbiology laboratories located in Norway, and we ran experiments on datasets in which the number of records per laboratory varied. Experiments were also performed on simulated microbiology datasets and data custodians connected through a local area network. Results: The security analysis demonstrated that the protocol protects the privacy of individuals and data custodians under a semi-honest adversarial model; more precisely, it remains secure with the collusion of up to N − 2 corrupt data custodians. The total runtime scales linearly with the addition of data custodians and records: one million simulated records distributed across 20 data custodians were deduplicated within 45 s. The experimental results showed that the protocol is more efficient and scalable than previous protocols for the same problem. Conclusions: The proposed deduplication protocol is efficient and scalable for practical use while protecting the privacy of patients and data custodians.
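
    The full protocol relies on secure multi-party techniques to achieve its semi-honest guarantees, but the deterministic record-linkage core can be sketched simply: each custodian derives a keyed pseudonym from normalised quasi-identifiers, so duplicates can be matched without exchanging raw records. The sketch below is a toy (a shared HMAC key alone does not provide the paper's collusion resistance); the field names and key-agreement step are assumptions.

```python
# Toy sketch of deterministic record linkage for deduplication: keyed
# hashing of normalised quasi-identifiers, so duplicate records match
# without raw data leaving a custodian. NOT the paper's secure protocol.
import hashlib
import hmac

SHARED_KEY = b"agreed-out-of-band"  # assumption: custodians pre-share a key

def pseudonym(record: dict) -> str:
    """Deterministic linkage key: HMAC over normalised quasi-identifiers."""
    basis = "|".join([record["name"].lower(), record["dob"], record["nid"]])
    return hmac.new(SHARED_KEY, basis.encode(), hashlib.sha256).hexdigest()

lab_a = [{"name": "Kari Nordmann", "dob": "1980-02-01", "nid": "010280-123"}]
lab_b = [{"name": "kari nordmann", "dob": "1980-02-01", "nid": "010280-123"}]

seen, duplicates = set(), 0
for rec in lab_a + lab_b:
    h = pseudonym(rec)       # same person at both labs -> same pseudonym
    duplicates += h in seen
    seen.add(h)
print("duplicates found:", duplicates)
```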