
    Renormalization of Tamm-Dancoff Integral Equations

    During the last few years, interest has arisen in using light-front Tamm-Dancoff field theory to describe relativistic bound states for theories such as QCD. Unfortunately, difficult renormalization problems stand in the way. We introduce a general, non-perturbative approach to renormalization that is well suited for the ultraviolet and, presumably, the infrared divergences found in these systems. We reexpress the renormalization problem in terms of a set of coupled inhomogeneous integral equations, the "counterterm equation." The solution of this equation provides a kernel for the Tamm-Dancoff integral equations which generates states that are independent of any cutoffs. We also introduce a Rayleigh-Ritz approach to numerical solution of the counterterm equation. Using our approach to renormalization, we examine several ultraviolet divergent models. Finally, we use the Rayleigh-Ritz approach to find the counterterms in terms of allowed operators of a theory. Comment: 19 pages, OHSTPY-HEP-T-92-01

    Emerging contaminant exposure to aquatic systems in the Southern African Development Community

    The growing production and use of chemicals and the resultant increase in environmental exposure is of particular concern in developing countries where there is rapid industrialization and population growth but limited information on the occurrence of emerging contaminants. Advances in analytical techniques now allow for the monitoring of emerging contaminants at very low concentrations with the potential to cause harmful ecotoxicological effects. Therefore, we provide the first critical assessment of the current state of knowledge about chemical exposure in waters of the Southern African Development Community (SADC). We achieved this through a comprehensive literature review and the creation of a database of chemical monitoring data. Of the 59 articles reviewed, most (n = 36; 61.0%) were from South Africa, and the rest were from Botswana (n = 6; 10.2%), Zimbabwe (n = 6; 10.2%), Malawi (n = 3; 5.1%), Mozambique (n = 3; 5.1%), Zambia (n = 2; 3.4%), Angola (n = 1; 1.7%), Madagascar (n = 1; 1.7%), and Tanzania (n = 1; 1.7%). No publications were found from the remaining seven SADC countries. Emerging contaminants have only been studied in South Africa and Botswana. The antiretroviral drug ritonavir (64.52 µg/L) was detected at the highest average concentration, and ibuprofen (17 times) was detected most frequently. Despite being the primary water source in the region, groundwater was understudied (only 13 studies). High emerging contaminant concentrations in surface waters indicate the presence of secondary sources of pollution such as sewage leakage. We identify research gaps and propose actions to assess and reduce chemical pollution to enable the SADC to address the Sustainable Development Goals, particularly Goal 3.9, to reduce the deaths and illnesses from hazardous chemicals and contamination. Environ Toxicol Chem 2022;41:382–395

    The Reproducibility of Lists of Differentially Expressed Genes in Microarray Studies

    Reproducibility is a fundamental requirement in scientific experiments and clinical contexts. Recent publications raise concerns about the reliability of microarray technology because of the apparent lack of agreement between lists of differentially expressed genes (DEGs). In this study we demonstrate that (1) such discordance may stem from ranking and selecting DEGs solely by statistical significance (P) derived from widely used simple t-tests; (2) when fold change (FC) is used as the ranking criterion, the lists become much more reproducible, especially when fewer genes are selected; and (3) the instability of short DEG lists based on P cutoffs is an expected mathematical consequence of the high variability of the t-values. We recommend the use of FC ranking plus a non-stringent P cutoff as a baseline practice in order to generate more reproducible DEG lists. The FC criterion enhances reproducibility, while the P criterion balances sensitivity and specificity.
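    The recommended selection procedure can be sketched in a few lines: filter genes by a non-stringent P cutoff, then rank the survivors by fold-change magnitude. The gene names, fold changes, and P-values below are illustrative placeholders, not data from the study:

    ```python
    # Sketch of FC-ranking plus a non-stringent P cutoff for DEG selection.
    # All gene names and numbers here are hypothetical.

    def select_degs(genes, n_top, p_cutoff=0.05):
        """genes: list of (name, log2_fold_change, p_value) tuples."""
        passed = [g for g in genes if g[2] < p_cutoff]            # non-stringent P filter
        ranked = sorted(passed, key=lambda g: abs(g[1]), reverse=True)  # rank by |FC|
        return [name for name, _, _ in ranked[:n_top]]

    genes = [
        ("GENE_A", 2.5, 0.010),
        ("GENE_B", 0.3, 0.001),   # highly significant, but tiny fold change
        ("GENE_C", -1.8, 0.040),
        ("GENE_D", 3.1, 0.200),   # large fold change, but fails the P filter
    ]
    print(select_degs(genes, n_top=2))  # → ['GENE_A', 'GENE_C']
    ```

    Ranking by |FC| rather than P is what stabilises short lists here: small P-values from a simple t-test can be driven by underestimated variance, whereas a large fold change must persist across replicates.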

    Transverse lattice calculation of the pion light-cone wavefunctions

    We calculate the light-cone wavefunctions of the pion by solving the meson boundstate problem in a coarse transverse lattice gauge theory using DLCQ. A large-N_c approximation is made and the light-cone Hamiltonian expanded in massive dynamical fields at fixed lattice spacing. In contrast to earlier calculations, we include contributions from states containing many gluonic link-fields between the quarks. The Hamiltonian is renormalised by a combination of covariance conditions on boundstates and fitting the physical masses M_rho and M_pi, the decay constant f_pi, and the string tension sigma. Good covariance is obtained for the lightest 0^{-+} state, which we identify with the pion. Many observables can be deduced from its light-cone wavefunctions. After perturbative evolution, the quark valence structure function is found to be consistent with the experimental structure function deduced from Drell-Yan pi-nucleon data in the valence region x > 0.5. In addition, the pion distribution amplitude is consistent with the experimental distribution deduced from the pi gamma^* gamma transition form factor and diffractive dissociation. A new observable we calculate is the probability for quark helicity correlation. We find a 45% probability that the valence-quark helicities are aligned in the pion. Comment: 27 pages, 9 figures

    The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe

    The preponderance of matter over antimatter in the early Universe, the dynamics of the supernova bursts that produced the heavy elements necessary for life, and whether protons eventually decay --- these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our Universe, its current state and its eventual fate. The Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed plan for a world-class experiment dedicated to addressing these questions. LBNE is conceived around three central components: (1) a new, high-intensity neutrino source generated from a megawatt-class proton accelerator at Fermi National Accelerator Laboratory, (2) a near neutrino detector just downstream of the source, and (3) a massive liquid argon time-projection chamber deployed as a far detector deep underground at the Sanford Underground Research Facility. This facility, located at the site of the former Homestake Mine in Lead, South Dakota, is approximately 1,300 km from the neutrino source at Fermilab -- a distance (baseline) that delivers optimal sensitivity to neutrino charge-parity symmetry violation and mass ordering effects. This ambitious yet cost-effective design incorporates scalability and flexibility and can accommodate a variety of upgrades and contributions. With its exceptional combination of experimental configuration, technical capabilities, and potential for transformative discoveries, LBNE promises to be a vital facility for the field of particle physics worldwide, providing physicists from around the globe with opportunities to collaborate in a twenty to thirty year program of exciting science. In this document we provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess. Comment: Major update of previous version. This is the reference document for the LBNE science program and current status. Chapters 1, 3, and 9 provide a comprehensive overview of LBNE's scientific objectives, its place in the landscape of neutrino physics worldwide, the technologies it will incorporate and the capabilities it will possess. 288 pages, 116 figures

    The balance of reproducibility, sensitivity, and specificity of lists of differentially expressed genes in microarray studies

    Background: Reproducibility is a fundamental requirement in scientific experiments. Some recent publications have claimed that microarrays are unreliable because lists of differentially expressed genes (DEGs) are not reproducible in similar experiments. Meanwhile, new statistical methods for identifying DEGs continue to appear in the scientific literature. The resultant variety of existing and emerging methods exacerbates confusion and continuing debate in the microarray community on the appropriate choice of methods for identifying reliable DEG lists. Results: Using the data sets generated by the MicroArray Quality Control (MAQC) project, we investigated the impact of a few widely used gene selection procedures on the reproducibility of DEG lists. We present comprehensive results from inter-site comparisons using the same microarray platform, cross-platform comparisons using multiple microarray platforms, and comparisons between microarray results and those from TaqMan, the widely regarded "standard" gene expression platform. Our results demonstrate that (1) previously reported discordance between DEG lists could simply result from ranking and selecting DEGs solely by statistical significance (P) derived from widely used simple t-tests; (2) when fold change (FC) is used as the ranking criterion with a non-stringent P-value cutoff for filtering, the DEG lists become much more reproducible, especially when fewer genes are selected as differentially expressed, as is the case in most microarray studies; and (3) the instability of short DEG lists based solely on P-value ranking is an expected mathematical consequence of the high variability of the t-values; the more stringent the P-value threshold, the less reproducible the DEG list is. These observations are also consistent with results from extensive simulation calculations. Conclusion: We recommend the use of FC ranking plus a non-stringent P cutoff as a straightforward baseline practice in order to generate more reproducible DEG lists. Specifically, the P-value cutoff should not be stringent (too small) and the FC should be as large as possible. Our results provide practical guidance for choosing the appropriate FC and P-value cutoffs when selecting a given number of DEGs. The FC criterion enhances reproducibility, whereas the P criterion balances sensitivity and specificity.
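    The inter-site agreement this study measures can be quantified as the fraction of genes two equal-length DEG lists have in common (often called the percentage of overlapping genes, POG). The sketch below uses made-up gene identifiers to show the computation; it is not the study's exact metric implementation:

    ```python
    # Hypothetical reproducibility check: fraction of genes shared between
    # two equal-length DEG lists produced at different sites.

    def pog(list_a, list_b):
        """Percentage of overlapping genes between two equal-length DEG lists."""
        if not list_a or len(list_a) != len(list_b):
            raise ValueError("lists must be non-empty and of equal length")
        return len(set(list_a) & set(list_b)) / len(list_a)

    site1 = ["G1", "G2", "G3", "G4", "G5"]  # top-5 DEGs at site 1 (illustrative)
    site2 = ["G1", "G3", "G6", "G4", "G7"]  # top-5 DEGs at site 2 (illustrative)
    print(pog(site1, site2))  # → 0.6
    ```

    Under P-only ranking the abstract's point is that this fraction drops sharply for short lists, whereas FC ranking with a loose P filter keeps it high.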

    Effectiveness of a national quality improvement programme to improve survival after emergency abdominal surgery (EPOCH): a stepped-wedge cluster-randomised trial

    Background: Emergency abdominal surgery is associated with poor patient outcomes. We studied the effectiveness of a national quality improvement (QI) programme to implement a care pathway to improve survival for these patients. Methods: We did a stepped-wedge cluster-randomised trial of patients aged 40 years or older undergoing emergency open major abdominal surgery. Eligible UK National Health Service (NHS) hospitals (those that had an emergency general surgical service, a substantial volume of emergency abdominal surgery cases, and contributed data to the National Emergency Laparotomy Audit) were organised into 15 geographical clusters and commenced the QI programme in a random order, based on a computer-generated random sequence, over an 85-week period, with one geographical cluster commencing the intervention every 5 weeks from the second to the 16th time period. Patients were masked to the study group, but it was not possible to mask hospital staff or investigators. The primary outcome measure was mortality within 90 days of surgery. Analyses were done on an intention-to-treat basis. This study is registered with the ISRCTN registry, number ISRCTN80682973. Findings: Treatment took place between March 3, 2014, and Oct 19, 2015. 22 754 patients were assessed for eligibility. Of 15 873 eligible patients from 93 NHS hospitals, primary outcome data were analysed for 8482 patients in the usual care group and 7374 in the QI group. Eight patients in the usual care group and nine patients in the QI group were not included in the analysis because of missing primary outcome data. The primary outcome of 90-day mortality occurred in 1210 (16%) patients in the QI group compared with 1393 (16%) patients in the usual care group (HR 1·11, 0·96–1·28). Interpretation: No survival benefit was observed from this QI programme to implement a care pathway for patients undergoing emergency abdominal surgery.
Future QI programmes should ensure that teams have both the time and resources needed to improve patient care. Funding: National Institute for Health Research Health Services and Delivery Research Programme
