
    Spontaneous cytokine production in children according to biological characteristics and environmental exposures.

    BACKGROUND: Environmental factors are likely to have profound effects on the development of host immune responses, with serious implications for infectious diseases and for inflammatory disorders such as asthma. OBJECTIVE: This study was designed to investigate the effects of environmental exposures on the cytokine profile of children. METHODS: We measured the T helper (Th) 1 cytokine interferon-gamma, the Th2 cytokines interleukin (IL)-5 and IL-13, and the regulatory cytokine IL-10 in unstimulated peripheral blood leukocytes from 1,376 children 4-11 years of age living in a poor urban area of the tropics. We also assessed the impact of environmental exposures, in addition to biological characteristics, recorded at the time of blood collection and earlier in childhood (0-3 years before blood collection). RESULTS: The proportion of children producing IL-10 was greater among those without access to drinking water [p < 0.05, chi-square test; odds ratio (OR) = 1.67]. The proportion of children producing IL-5 and IL-10 (OR = 10.76) was significantly greater in households that had never had a sewage system (p < 0.05, trend test). CONCLUSIONS: These data provide evidence for the profound effects of environmental exposures in early life on immune homeostasis in later childhood. Decreased hygiene (lack of access to clean drinking water and sanitation) in the first 3 years of life is associated with higher spontaneous IL-10 production up to 8 years later in life.
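    As a toy illustration of the association statistics quoted above, the sketch below derives an odds ratio and a chi-square p-value from a 2x2 exposure-by-response table. The counts are hypothetical placeholders, not the study's data; they are chosen only so that the cross-product ratio comes out near the reported OR of 1.67.

```python
# Hypothetical 2x2 table: rows are water access (no / yes), columns are
# spontaneous IL-10 production (producer / non-producer). Counts are
# illustrative placeholders, not the study's actual data.
from scipy.stats import chi2_contingency

table = [[60, 140],   # no access to drinking water
         [90, 350]]   # access to drinking water

(a, b), (c, d) = table
odds_ratio = (a * d) / (b * c)  # cross-product odds ratio
chi2, p, dof, expected = chi2_contingency(table)
print(f"OR = {odds_ratio:.2f}, chi-square p = {p:.4f}")  # OR ~ 1.67, p < 0.05
```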

    Applied immuno-epidemiological research: an approach for integrating existing knowledge into the statistical analysis of multiple immune markers.

    BACKGROUND: Immunologists often measure several correlated immunological markers, such as concentrations of different cytokines produced by different immune cells and/or measured under different conditions, to draw insights into complex immunological mechanisms. Although there have been recent methodological efforts to improve the statistical analysis of immunological data, a framework is still needed for the simultaneous analysis of multiple, often correlated, immune markers. Such a framework would allow immunologists' hypotheses about the underlying biological mechanisms to be integrated. RESULTS: We present an analytical approach for the statistical analysis of correlated immune markers, such as those commonly collected in modern immuno-epidemiological studies. We demonstrate i) how to deal with interdependencies among multiple measurements of the same immune marker, ii) how to analyse association patterns among different markers, iii) how to aggregate different measures and/or markers into immunological summary scores, iv) how to model the inter-relationships among these scores, and v) how to use these scores in epidemiological association analyses. We illustrate the application of our approach to multiple cytokine measurements from 818 children enrolled in a large immuno-epidemiological study (SCAALA Salvador), which aimed to quantify the major immunological mechanisms underlying atopic diseases and asthma. We demonstrate how to systematically aggregate the information captured in multiple cytokine measurements into immunological summary scores that reflect the presumed underlying immunological mechanisms (Th1/Th2 balance and the immune regulatory network). We show how these aggregated immune scores can be used as predictors in regression models of immunological outcomes (e.g. specific IgE) and compare the results to those obtained with a traditional multivariate regression approach. CONCLUSION: The proposed analytical approach may be especially useful for quantifying complex immune responses in immuno-epidemiological studies, where investigators examine the relationships among epidemiological patterns, immune responses, and disease outcomes.
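    A minimal sketch of the aggregation idea described above, assuming z-standardization and simple averaging within marker groups. The file name, column names, and outcome variable are illustrative assumptions, not the paper's exact specification.

```python
# Sketch: aggregate correlated cytokine measurements into summary scores
# and use the scores as regression predictors. Input layout (one row per
# child, columns il5, il13, ifng, il10, ige_positive) is hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("cytokines.csv")  # hypothetical input file

def zscore_mean(cols):
    """z-standardize each column, then average across columns."""
    sub = df[cols]
    return ((sub - sub.mean()) / sub.std()).mean(axis=1)

# Th2 score from IL-5 / IL-13, Th1 from IFN-gamma, regulatory from IL-10
df["th2_score"] = zscore_mean(["il5", "il13"])
df["th1_score"] = zscore_mean(["ifng"])
df["reg_score"] = zscore_mean(["il10"])

# Logistic regression of an outcome (e.g. specific IgE positivity) on scores
X = sm.add_constant(df[["th1_score", "th2_score", "reg_score"]])
model = sm.Logit(df["ige_positive"], X).fit()
print(model.summary())
```

    Modelling the inter-relationships among the scores themselves (step iv above) would go beyond this sketch, e.g. via graphical or structural equation models.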

    Data production models for the CDF experiment

    The data production for the CDF experiment is conducted on a large Linux PC farm designed to meet the needs of data collection at a maximum rate of 40 MByte/sec. We present two data production models that exploit advances in computing and communication technology. The first production farm is a centralized system that has achieved a stable data processing rate of approximately 2 TByte per day. The recently upgraded farm has been migrated to the SAM (Sequential Access to data via Metadata) data handling system. The software and hardware of the CDF production farms have been successful in providing large computing and data throughput capacity to the experiment.
    Comment: 8 pages, 9 figures; presented at HPC Asia2005, Beijing, China, Nov 30 - Dec 3, 2005

    30-Day Postoperative Morbidity of Emergency Surgery for Obstructive Right- and Left-Sided Colon Cancer in Obese Patients: A Multicenter Cohort Study of the French Surgical Association

    BACKGROUND: Emergency surgery impairs postoperative outcomes in colorectal cancer patients. No study has assessed the relationship between obesity and postoperative results in this setting. OBJECTIVE: To compare the results of emergency surgery for obstructive colon cancer (OCC) in an obese patient population with those in overweight and normal weight patient groups. METHODS: From 2000 to 2015, patients undergoing emergency surgery for OCC in French surgical centers that are members of the French National Surgical Association were included. Three groups were defined: normal weight (body mass index [BMI] < 25.0 kg/m2), overweight (BMI 25.0-29.9 kg/m2), and obese (BMI ≥ 30.0 kg/m2). RESULTS: Of 1,241 patients, 329 (26.5%) were overweight and 143 (11.5%) were obese. Obese patients had significantly higher American Society of Anesthesiologists scores, more cardiovascular comorbidity, and more hemodynamic instability at presentation. Overall postoperative mortality and morbidity were 8% and 51%, respectively, with no difference between the 3 groups. For obese patients with left-sided OCC, stoma-related complications were significantly increased (8% vs. 5% vs. 15%, p = 0.02). CONCLUSION: Compared with lower-BMI patients, obese patients with OCC had a more severe presentation at admission but similar surgical management. Obesity did not increase 30-day postoperative morbidity, except for stoma-related complications in those with left-sided OCC.
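    The three BMI groups follow the standard cut-offs quoted above; a minimal sketch of that classification (the example values are hypothetical):

```python
# Sketch: the BMI grouping used in the study, with thresholds taken
# from the abstract. Example weight/height values are hypothetical.
def bmi_group(weight_kg: float, height_m: float) -> str:
    bmi = weight_kg / height_m ** 2  # standard BMI formula
    if bmi >= 30.0:
        return "obese"
    elif bmi >= 25.0:
        return "overweight"
    return "normal weight"

print(bmi_group(95.0, 1.75))  # BMI ~31.0 -> "obese"
```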

    Data processing model for the CDF experiment

    The data processing model for the CDF experiment is described. Data processing reconstructs events from parallel data streams taken with different combinations of physics event triggers and further splits the events into specialized physics datasets. The design of the processing control system faces strict requirements on bookkeeping records, which trace the status of data files and event contents during processing and storage. The computing architecture was updated to meet the mass data flow of the Run II data collection, recently upgraded to a maximum rate of 40 MByte/sec. The data processing facility consists of a large cluster of Linux computers, with data movement managed by the CDF data handling system to a multi-petaByte Enstore tape library. The latest processing cycle has achieved a stable speed of 35 MByte/sec (3 TByte/day), and it can be readily scaled by increasing CPU and data-handling capacity as required.
    Comment: 12 pages, 10 figures, submitted to IEEE-TN
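    The quoted rates can be sanity-checked with simple unit arithmetic; a sketch, assuming decimal megabytes and an 86,400-second day:

```python
# Sketch: check that a 35 MByte/sec sustained rate corresponds to roughly
# 3 TByte/day, as the abstract states (decimal units assumed throughout).
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def mbytes_per_sec_to_tbytes_per_day(rate_mb_s: float) -> float:
    return rate_mb_s * SECONDS_PER_DAY / 1e6  # MByte -> TByte

print(mbytes_per_sec_to_tbytes_per_day(35.0))  # ~3.02 TByte/day sustained
print(mbytes_per_sec_to_tbytes_per_day(40.0))  # ~3.46 TByte/day at peak rate
```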

    Search for Squarks and Gluinos in Events Containing Jets and a Large Imbalance in Transverse Energy

    Using data corresponding to an integrated luminosity of 79 pb^-1, D0 has searched for events containing multiple jets and large missing transverse energy in pbar-p collisions at sqrt(s) = 1.8 TeV at the Fermilab Tevatron collider. Observing no significant excess beyond what is expected from the standard model, we set limits on the masses of squarks and gluinos and on the model parameters m_0 and m_1/2, in the framework of minimal low-energy supergravity models of supersymmetry. For tan(beta) = 2 and A_0 = 0, with mu < 0, we exclude all models with m_squark < 250 GeV/c^2. For models with equal squark and gluino masses, we exclude m < 260 GeV/c^2.
    Comment: 10 pages, 3 figures, submitted to PRL; fixed typo at the bottom of p. 6 (QCD multijet background is 35.4 events).

    A measurement of the W boson mass using large rapidity electrons

    We present a measurement of the W boson mass using data collected by the D0 experiment at the Fermilab Tevatron during 1994-1995. We identify W bosons by their decays to e-nu final states where the electron is detected in a forward calorimeter. We extract the W boson mass, Mw, by fitting the transverse mass and transverse electron and neutrino momentum spectra from a sample of 11,089 W -> e nu decay candidates. We use a sample of 1,687 dielectron events, mostly due to Z -> ee decays, to constrain our model of the detector response. Using the forward calorimeter data, we measure Mw = 80.691 +- 0.227 GeV. Combining the forward calorimeter measurements with our previously published central calorimeter results, we obtain Mw = 80.482 +- 0.091 GeV.
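    Combining two independent measurements is conventionally done by inverse-variance weighting; the sketch below assumes that convention and ignores correlated systematics. The forward-calorimeter value is from the abstract, but the central-calorimeter value used here is a hypothetical placeholder, since the abstract does not quote it.

```python
# Sketch: inverse-variance weighted combination of two measurements.
def combine(values, errors):
    weights = [1.0 / e**2 for e in errors]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    err = (1.0 / sum(weights)) ** 0.5
    return mean, err

forward = (80.691, 0.227)  # from the abstract
central = (80.45, 0.10)    # hypothetical placeholder, not the published value
m, e = combine([forward[0], central[0]], [forward[1], central[1]])
print(f"Mw = {m:.3f} +- {e:.3f} GeV")  # the better-measured input dominates
```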

    Sampling strategies to measure the prevalence of common recurrent infections in longitudinal studies

    BACKGROUND: Measuring recurrent infections such as diarrhoea or respiratory infections in epidemiological studies is a methodological challenge. Problems in measuring the incidence of recurrent infections include the episode definition, recall error, and the logistics of close follow-up. Longitudinal prevalence (LP), the proportion of time ill estimated by repeated prevalence measurements, is an alternative to incidence for recurrent infections. In contrast to incidence, which usually requires continuous sampling, LP can be measured at intervals. This study explored how many more participants are needed for infrequent sampling to achieve the same study power as frequent sampling. METHODS: We developed a set of four empirical simulation models representing low- and high-risk settings with short or long episode durations. The models were used to evaluate different sampling strategies under different assumptions on recall period and recall error. RESULTS: The model identified three major factors that influence sampling strategies: (1) the clustering of episodes in individuals; (2) the duration of episodes; (3) the positive correlation between an individual's disease incidence and episode duration. Intermittent sampling (e.g. 12 times per year) often requires only a slightly larger sample size than continuous sampling, especially in cluster-randomized trials. The collection of period prevalence data can lead to highly biased effect estimates if the exposure variable is associated with episode duration. To maximize study power, recall periods of 3 to 7 days may be preferable to shorter periods, even if this leads to inaccuracy in the prevalence estimates. CONCLUSION: Choosing the optimal approach to measure recurrent infections in epidemiological studies depends on the setting, the study objectives, study design, and budget constraints. Sampling at intervals can make epidemiological studies and trials more efficient, valid, and cost-effective.
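    A minimal sketch of the kind of simulation described above, comparing longitudinal prevalence estimated from continuous daily records against twelve single-day surveys per year. The episode rate and mean duration are illustrative assumptions, not the paper's fitted parameters, and the sketch omits the episode clustering and the incidence-duration correlation that the authors model.

```python
# Sketch: simulate recurrent-infection episodes for a cohort and compare
# continuous daily follow-up with intermittent prevalence sampling.
import random

DAYS, N_CHILDREN = 365, 1000

def simulate_child(episode_rate=0.02, mean_duration=5.0):
    """Return a daily 0/1 illness indicator for one child over a year."""
    ill = [0] * DAYS
    for day in range(DAYS):
        if random.random() < episode_rate:  # a new episode starts today
            duration = max(1, int(random.expovariate(1.0 / mean_duration)))
            for d in range(day, min(day + duration, DAYS)):
                ill[d] = 1
    return ill

children = [simulate_child() for _ in range(N_CHILDREN)]

# Longitudinal prevalence (proportion of time ill) from daily records
lp_daily = sum(sum(c) for c in children) / (N_CHILDREN * DAYS)

# The same quantity estimated from 12 evenly spaced single-day surveys
survey_days = [int(i * DAYS / 12) for i in range(12)]
lp_survey = sum(c[d] for c in children for d in survey_days) / (
    N_CHILDREN * len(survey_days))

print(f"daily LP = {lp_daily:.3f}, 12-survey LP = {lp_survey:.3f}")
```

    Repeating such runs over many simulated trials is what allows the required sample-size inflation for intermittent sampling to be quantified.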