
    The Standard Model of Particle Physics

    Particle physics has evolved a coherent model that characterizes forces and particles at the most elementary level. This Standard Model, built from many theoretical and experimental studies, is in excellent accord with almost all current data. However, there are many hints that it is but an approximation to a yet more fundamental theory. We trace the development of the Standard Model and indicate the reasons for believing that it is incomplete.
    Comment: 25 pages, 17 figures; accepted for publication in Rev. Mod. Phys. (APS centenary issue)

    An Evaluation of Two Methods for Generating Synthetic HL7 Segments Reflecting Real-World Health Information Exchange Transactions

    Motivated by the need for readily available data for testing an open-source health information exchange platform, we developed and evaluated two methods for generating synthetic messages. The methods used HL7 version 2 messages obtained from the Indiana Network for Patient Care. Data from both methods were analyzed to assess how effectively the output reflected the original 'real-world' data. The Markov Chain method (MCM) used an algorithm based on a transition probability matrix, while the Music Box model (MBM) randomly selected messages of a particular trigger type from the original data to generate new messages. The MBM was faster, generated shorter messages, and exhibited less variation in message length; the MCM required more computational power and generated longer messages with more length variability. Both methods exhibited adequate coverage, producing a high proportion of messages consistent with the original messages, and both yielded similar rates of valid messages.
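The Markov Chain method described above can be illustrated with a short sketch: estimate a transition probability matrix over HL7 v2 segment types from observed sequences, then sample synthetic message skeletons from it. This is not the authors' implementation; the toy corpus and segment sequences below are hypothetical, chosen only to show the technique.

```python
import random
from collections import defaultdict

def build_transition_matrix(messages):
    """Count segment-to-segment transitions (with START/END markers)
    and normalize each row into a probability distribution."""
    counts = defaultdict(lambda: defaultdict(int))
    for segments in messages:
        chain = ["START"] + segments + ["END"]
        for a, b in zip(chain, chain[1:]):
            counts[a][b] += 1
    return {
        a: {b: n / sum(nexts.values()) for b, n in nexts.items()}
        for a, nexts in counts.items()
    }

def generate_message(matrix, max_len=50):
    """Random walk through the chain from START until END (or max_len)."""
    segments, state = [], "START"
    while len(segments) < max_len:
        nxts = matrix[state]
        state = random.choices(list(nxts), weights=list(nxts.values()))[0]
        if state == "END":
            break
        segments.append(state)
    return segments

# Toy corpus of segment-type sequences (hypothetical, for illustration only)
corpus = [["MSH", "PID", "PV1", "OBX"],
          ["MSH", "PID", "OBX", "OBX"],
          ["MSH", "PID", "PV1", "OBX", "OBX"]]
matrix = build_transition_matrix(corpus)
msg = generate_message(matrix)
```

Because every observed message opens with an MSH segment, every synthetic message does too; message length varies with the sampled path, mirroring the length variability the abstract reports for the MCM.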

    Evaluating the Completeness of Data Elements of Provider Reporting on Indiana's Communicable Disease Reports

    Poster abstract
    Objective: To examine the completeness of data elements required for notifiable disease surveillance from official, provider-based reports submitted to a local health department.
    Introduction: Completeness of public health information is essential for the accurate assessment of community health progress and disease surveillance. Yet challenges persist with respect to the level of completeness in reports that public health agencies receive from health care providers. Missing and incomplete data can jeopardize information reliability and quality, resulting in inaccurate disease evaluation and management (1). Incomplete data can also prolong the time disease investigators need to complete their work on a reported case. It is therefore important to determine where information is lacking in order to characterize provider reporting.
    Methods: Data from 1,195 unique patient cases across 7 notifiable diseases were abstracted from official reporting forms (2) submitted to a local health department serving the Indianapolis metropolitan area. The selected diseases were chlamydia, gonorrhea, syphilis, salmonella, histoplasmosis, hepatitis B (acute), and hepatitis C (chronic). Table 1 presents the duration and collection period for each of the selected diseases. Diseases were purposely chosen to represent the broad range managed by local health departments. A set of data elements consisting of patient, clinical, and provider information was then evaluated for completeness. The level of completeness was determined using a classification method similar to that used by Dixon et al. (3). Fields were considered complete if they contained a value; the recorded value was not validated for accuracy.
    Results: Table 2 depicts the level of completeness for the selected data elements across the target diseases. Completeness levels and percentages varied by disease and data element, with completeness being higher for patient demographics (e.g., name, address) than for provider demographics (e.g., name, clinic address). The majority of data elements for patient demographics were categorized as mostly to always complete.
    Conclusion: It is essential that provider reports are completed in a thorough and timely manner. To increase documentation of provider information, analyses of provider characteristics such as workflow patterns, organizational constraints, and information needs are needed to understand the completeness of provider information reporting. This will allow us to develop strategies to increase completeness of reporting across all data elements necessary to assess and investigate notifiable diseases.
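The completeness measure described in the methods (a field counts as complete if it contains any value, with no accuracy validation) can be sketched in a few lines. The field names and the sample reports below are illustrative assumptions, not the study's actual data elements.

```python
# A field is "complete" if it holds any non-empty value; the value itself
# is not validated for accuracy, matching the classification rule above.

def completeness(reports, fields):
    """Return the percent of reports with a non-empty value for each field."""
    rates = {}
    for f in fields:
        filled = sum(1 for r in reports if str(r.get(f, "") or "").strip())
        rates[f] = 100.0 * filled / len(reports)
    return rates

# Hypothetical reports: patient demographics tend to be filled in,
# provider fields less so, as the results above describe.
reports = [
    {"patient_name": "A", "patient_address": "X St", "provider_name": ""},
    {"patient_name": "B", "patient_address": "", "provider_name": None},
    {"patient_name": "C", "patient_address": "Y Ave", "provider_name": "Dr. Z"},
]
rates = completeness(reports, ["patient_name", "patient_address", "provider_name"])
```

On this toy data the patient name field is 100% complete while the provider name field is one-third complete, the same qualitative pattern the results report.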

    Automating Provider Reporting of Communicable Disease Cases using Health Information Technology

    Poster abstract
    Introduction: Disease surveillance is a core public health (PH) function that enables PH authorities to monitor disease outbreaks and develop programs and policies to reduce disease burden. To manage and adjudicate cases of suspected communicable disease, PH workers gather data elements about persons, clinical care, and providers from various clinical sources, including providers and laboratories. Current processes are paper-based and often yield incomplete and untimely reporting across different diseases, requiring time-consuming follow-up by PH authorities to obtain needed information. Health information technology (HIT) refers to a wide range of technologies used in health care settings, including electronic health records and laboratory information systems. Health information exchange (HIE) involves electronic sharing of data and information between HIT systems, including those used in PH. Previous research has shown that using HIE to electronically report laboratory results to PH can improve surveillance practice, yet there has been little utilization of HIE for improving provider-based disease reporting [1].
    Methods: Our study uses an intervention that electronically pre-populates provider-based communicable disease case reporting forms with existing clinical, laboratory, and patient data available through one of the largest and oldest HIE infrastructures in the U.S., the Indiana Network for Patient Care. The intervention will be evaluated using mixed methods in a concurrent design framework in which qualitative methods are embedded within the quantitative methods. Quantitative data will include reporting rates, timeliness, reporting burden, and report completeness and accuracy, analyzed using interrupted time-series and other pre-post comparisons. Qualitative data on pre-post provider perceptions of report completeness, accuracy, and timeliness, reporting burden, data quality, benefits, utility, adoption, utilization, and impact on reporting workflow will be collected using semi-structured interviews and open-ended survey items. Data will be triangulated to find convergence or agreement by cross-validating results, producing a contextualized portrayal of the facilitators of and barriers to implementation and use of the intervention.
    Results: The intervention has been implemented in seven primary care clinics in the metropolitan Indianapolis area plus one rural clinic in Edinburgh. Analysis of baseline data shows that provider-based reports vary in their completeness, yet they contain critical information not available from laboratory information systems [2]. Furthermore, PH workers access a range of sources to gather the data they need to investigate disease cases [3].
    Discussion and Conclusion: By applying mixed research methods and measuring context, facilitators and barriers, and the individual, organizational, and data quality factors that may affect adoption and utilization of the intervention, we will document whether and how the intervention streamlines provider-based manual reporting workflows, lowers barriers to reporting, increases data completeness, improves reporting timeliness, and captures a greater portion of communicable disease burden in the community. Early results are promising, and continued evaluation will be completed over the next 24 months.

    Cosmic ray tests of the D0 preshower detector

    The D0 preshower detector consists of scintillator strips with embedded wavelength-shifting fibers, and a readout using Visible Light Photon Counters. The response to minimum ionizing particles has been tested with cosmic ray muons. We report results on the gain calibration and light-yield distributions. The spatial resolution is investigated taking into account the light sharing between strips, the effects of multiple scattering, and various systematic uncertainties. The detection efficiency and noise contamination are also investigated.
    Comment: 27 pages, 24 figures, submitted to NIM

    Statement from the American Linear Collider Committee to the P5 subpanel

    This statement from the American Linear Collider Committee to the P5 subpanel has three purposes. It presents a brief summary of the case for an $e^+e^-$ Higgs factory that has emerged from Snowmass 2021. It highlights the special virtues of the ILC that are shared with other linear colliders but not with circular colliders. Finally, it calls attention to the resources available in the ILC White Paper for Snowmass (arXiv:2203.07622). The ALCC urges P5 to move the Higgs factory forward as a global project by assigning the idea of an $e^+e^-$ Higgs factory high priority, initiating a global discussion of the technology choice and cost sharing, and offering the option of siting the Higgs factory in the U.S.
    Comment: 6 pages

    Syndromic surveillance: STL for modeling, visualizing, and monitoring disease counts

    Background: Public health surveillance is the monitoring of data to detect and quantify unusual health events. Monitoring pre-diagnostic data, such as emergency department (ED) patient chief complaints, enables rapid detection of disease outbreaks. There are many sources of variation in such data; statistical methods need to model them accurately as a basis for timely and accurate outbreak detection methods.
    Methods: Our new methods for modeling daily chief complaint counts are based on a seasonal-trend decomposition procedure based on loess (STL) and were developed using data from the 76 EDs of the Indiana surveillance program from 2004 to 2008. Square-root counts are decomposed into inter-annual, yearly-seasonal, day-of-the-week, and random-error components. Using this decomposition, we develop a new synoptic-scale (days to weeks) outbreak detection method and carry out a simulation study comparing its detection performance to four well-known methods across nine outbreak scenarios.
    Results: The components of the STL decomposition reveal insights into the variability of the Indiana ED data. Day-of-the-week components tend to peak Sunday or Monday, fall steadily to a minimum Thursday or Friday, and then rise back to the peak. Yearly-seasonal components show seasonal influenza, some with bimodal peaks. Some inter-annual components increase slightly due to growing patient populations. The new outbreak detection method based on the decomposition modeling performs well with 90 days or more of data. Control limits were set empirically so that all methods had a specificity of 97%; STL had the largest sensitivity in all nine outbreak scenarios, and it also exhibited a well-behaved false positive rate when run on the data with no outbreaks injected.
    Conclusion: The STL decomposition method for chief complaint counts leads to a rapid and accurate detection method for disease outbreaks, and requires only 90 days of historical data to be put into operation. The visualization tools that accompany the decomposition and outbreak methods provide much insight into patterns in the data, which is useful for surveillance operations.
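The decomposition pipeline described above can be sketched in miniature: square-root transform the daily counts, estimate a smooth trend, extract a day-of-the-week component, and keep the remainder as noise. This is a simplified stand-in only; the paper uses loess-based STL, whereas the sketch below substitutes a centered moving average for loess and day-of-week means for the seasonal fit, and the synthetic counts are invented for illustration.

```python
import math

def decompose(counts, window=7):
    """Split sqrt-transformed daily counts into trend, day-of-week,
    and remainder components (a crude stand-in for STL)."""
    y = [math.sqrt(c) for c in counts]
    # Centered moving-average trend; edges reuse the nearest full window.
    half = window // 2
    trend = []
    for i in range(len(y)):
        lo = max(0, min(i - half, len(y) - window))
        trend.append(sum(y[lo:lo + window]) / window)
    detrended = [a - b for a, b in zip(y, trend)]
    # Day-of-week component: mean of the detrended series per weekday.
    dow = [sum(detrended[d::7]) / len(detrended[d::7]) for d in range(7)]
    seasonal = [dow[i % 7] for i in range(len(y))]
    remainder = [d - s for d, s in zip(detrended, seasonal)]
    return trend, seasonal, remainder

# Four weeks of synthetic ED chief-complaint counts with a weekly cycle
counts = [30, 20, 18, 17, 19, 22, 28] * 4
trend, seasonal, remainder = decompose(counts)
```

By construction the three components sum back to the square-root counts, and the day-of-week component repeats with period 7, mirroring the weekly pattern the results describe.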

Measurement of the W Boson Mass

    A measurement of the mass of the $W$ boson is presented based on a sample of 5982 $W \rightarrow e\nu$ decays observed in $p\bar{p}$ collisions at $\sqrt{s} = 1.8$ TeV with the D0 detector during the 1992--1993 run. From a fit to the transverse mass spectrum, combined with measurements of the $Z$ boson mass, the $W$ boson mass is measured to be $M_W = 80.350 \pm 0.140\,\mathrm{(stat.)} \pm 0.165\,\mathrm{(syst.)} \pm 0.160\,\mathrm{(scale)}~\mathrm{GeV}/c^2$.
    Comment: 12 pages, LaTeX, RevTeX style, including 3 PostScript figures (submitted to PRL)
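For readers unfamiliar with the fit variable, the transverse mass used in such $W \rightarrow e\nu$ analyses is the standard quantity built from the electron transverse momentum, the neutrino (missing) transverse momentum, and their azimuthal opening angle:

```latex
m_T = \sqrt{\,2\, p_T^{e}\, p_T^{\nu} \left(1 - \cos\Delta\phi_{e\nu}\right)}
```

The $m_T$ distribution has a Jacobian edge near $M_W$, which is what makes a fit to this spectrum sensitive to the $W$ boson mass.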

Search for $\widetilde{W}_1\widetilde{Z}_2$ Production via Trilepton Final States in $p\bar{p}$ Collisions at $\sqrt{s} = 1.8$ TeV

    We have searched for associated production of the lightest chargino, $\widetilde{W}_1$, and next-to-lightest neutralino, $\widetilde{Z}_2$, of the Minimal Supersymmetric Standard Model in $p\bar{p}$ collisions at $\sqrt{s} = 1.8$ TeV using the D0 detector at the Fermilab Tevatron collider. Data corresponding to an integrated luminosity of $12.5 \pm 0.7$ pb$^{-1}$ were examined for events containing three isolated leptons. No evidence for $\widetilde{W}_1\widetilde{Z}_2$ pair production was found. Limits on $\sigma(\widetilde{W}_1\widetilde{Z}_2) \cdot Br(\widetilde{W}_1 \to l\nu\widetilde{Z}_1) \cdot Br(\widetilde{Z}_2 \to l\bar{l}\widetilde{Z}_1)$ are presented.
    Comment: 17 pages (13 + 1 page table + 3 pages figures). 3 PostScript figures will follow in a UUEncoded, gzip'd tar file. Text in LaTeX format. Submitted to Physical Review Letters. Replace comments: had to resubmit version with EPSF directive