
    SepA Enhances Shigella Invasion of Epithelial Cells by Degrading Alpha-1 Antitrypsin and Producing a Neutrophil Chemoattractant

    Shigella spp. are highly adapted pathogens that cause bacillary dysentery in humans and nonhuman primates. An unusual feature of Shigella pathogenesis is that this organism invades the colonic epithelium from the basolateral pole. Therefore, it has evolved the ability to disrupt the intestinal epithelial barrier to reach the basolateral surface. We have shown previously that the secreted serine protease A (SepA), which belongs to the family of serine protease autotransporters of Enterobacteriaceae, is responsible for the initial destabilization of the intestinal epithelial barrier that facilitates Shigella invasion. However, the mechanisms used by SepA to regulate this process remain unknown. To investigate the protein targets cleaved by SepA in the intestinal epithelium, we incubated a sample of homogenized human colon with purified SepA or with a catalytically inactive mutant of this protease. We discovered that SepA targets an array of 18 different proteins, including alpha-1 antitrypsin (AAT), a major circulating serine proteinase inhibitor in humans. In contrast to other serine proteases, SepA cleaved AAT without forming an inhibiting complex, which resulted in the generation of a neutrophil chemoattractant. We demonstrated that the products of the AAT-SepA reaction induce a mild but significant increase in neutrophil transepithelial migration in vitro. Moreover, the presence of AAT during Shigella infection stimulated neutrophil migration and dramatically enhanced the number of bacteria invading the intestinal epithelium in a SepA-dependent manner. We conclude that by cleaving AAT, SepA releases a chemoattractant that promotes neutrophil migration, which in turn disrupts the intestinal epithelial barrier to enable Shigella invasion. IMPORTANCE Shigella is the second leading cause of diarrheal death globally. In this study, we identified the host protein targets of SepA, Shigella's major protein secreted in culture. 
We demonstrated that by cleaving AAT, a serine protease inhibitor important for protecting surrounding tissue at inflammatory sites, SepA releases a neutrophil chemoattractant that enhances Shigella invasion. Moreover, SepA degraded AAT without becoming inhibited by the cleaved product, and SepA catalytic activity was enhanced at higher concentrations of AAT. Activation of SepA by an excess of AAT may be physiologically relevant at the early stages of Shigella infection, when the amount of synthesized SepA is very low compared to the concentration of AAT in the intestinal lumen. This observation may also help to explain Shigella's high infectivity at low doses, despite the requirement of reaching the basolateral side to invade and colonize the colonic epithelium.

    Risk of Brain Tumors in Children and Susceptibility to Organophosphorus Insecticides: The Potential Role of Paraoxonase (PON1)

    Prior research suggests that childhood brain tumors (CBTs) may be associated with exposure to pesticides. Organophosphorus insecticides (OPs) target the developing nervous system, and until recently, the most common residential insecticides were chlorpyrifos and diazinon, two OPs metabolized in the body through the cytochrome P450/paraoxonase 1 (PON1) pathway. To investigate whether two common PON1 polymorphisms, C-108T and Q192R, are associated with CBT occurrence, we conducted a population-based study of 66 cases and 236 controls using DNA from neonatal screening archive specimens in Washington State, linked to interview data. The risk of CBT was nonsignificantly increased in relation to the inefficient PON1 promoter allele [per PON1(-108T) allele, relative to PON1(-108CC): odds ratio (OR) = 1.4; 95% confidence interval (CI), 1.0–2.2; p-value for trend = 0.07]. Notably, this association was strongest and statistically significant among children whose mothers reported chemical treatment of the home for pests during pregnancy or childhood (per PON1(-108T) allele: among exposed, OR = 2.6; 95% CI, 1.2–5.5; among unexposed, OR = 0.9; 95% CI, 0.5–1.6) and for primitive neuroectodermal tumors (per PON1(-108T) allele: OR = 2.4; 95% CI, 1.1–5.4). The Q192R polymorphism, which alters the structure of PON1 and influences enzyme activity in a substrate-dependent manner, was not associated with CBT risk, nor was the PON1(C-108T/Q192R) haplotype. These results are consistent with an inverse association between PON1 levels and CBT occurrence, perhaps because of PON1’s ability to detoxify OPs common in children’s environments. Larger studies that measure plasma PON1 levels and incorporate more accurate estimates of pesticide exposure will be required to confirm these observations

    Birth characteristics and childhood carcinomas

    BACKGROUND: Carcinomas in children are rare and have not been well studied. METHODS: We conducted a population-based case–control study and examined associations between birth characteristics and childhood carcinomas diagnosed from 28 days to 14 years during 1980–2004 using pooled data from five states (NY, WA, MN, TX, and CA) that linked their birth and cancer registries. The pooled data set contained 57 966 controls and 475 carcinoma cases, including 159 thyroid and 126 malignant melanoma cases. We used unconditional logistic regression to calculate odds ratios (ORs) and 95% confidence intervals (CIs). RESULTS: White compared with ‘other’ race was positively associated with melanoma (OR=3.22, 95% CI 1.33–8.33). Older maternal age increased the risk for melanoma (OR(per 5-year age increase)=1.20, 95% CI 1.00–1.44), whereas paternal age increased the risk for any carcinoma (OR(per 5-year age increase)=1.10, 95% CI 1.01–1.20) and thyroid carcinoma (OR(per 5-year age increase)=1.16, 95% CI 1.01–1.33). Gestational age <37 vs 37–42 weeks increased the risk for thyroid carcinoma (OR=1.87, 95% CI 1.07–3.27). Plurality, birth weight, and birth order were not significantly associated with childhood carcinomas. CONCLUSION: This exploratory study indicates that some birth characteristics including older parental age and low gestational age may be related to childhood carcinoma aetiology.

    Reproductive outcomes in male childhood cancer survivors: a linked cancer-birth registry analysis

    OBJECTIVE: Compare the risk of reproductive and infant outcomes between male childhood cancer survivors and a population-based comparison group. DESIGN: Retrospective cohort study. SETTING: 4 U.S. regions. PARTICIPANTS: Cancer registries identified males <20 years old diagnosed with cancer 1973-2000. Linked birth certificates identified first subsequent live offspring (n=470). Comparison subjects were identified from remaining birth certificates, frequency-matched on year and age at fatherhood, and race/ethnicity (n=4150). MAIN EXPOSURE: Cancer diagnosis prior to age 20. OUTCOME MEASURES: Pregnancy and infant outcomes identified from birth certificates. RESULTS: Compared with infants born to unaffected males, offspring of cancer survivors had a borderline increased risk of birth weight <2500 g (RR 1.43, 95% CI 0.99-2.05), with risk associated most strongly with younger age at cancer diagnosis and exposure to any chemotherapy (RR 1.96, 95% CI 1.22-3.17) or radiotherapy (RR 1.95, 95% CI 1.14-3.35). However, they were not at increased risk of being born prematurely or small for gestational age, of having malformations, or of an altered male:female sex ratio. Overall, female partners of male survivors were not more likely to have maternal complications recorded on birth records versus the comparison group. However, preeclampsia was associated with some cancers, especially central nervous system tumors (RR 3.36, 95% CI 1.63-6.90). CONCLUSIONS: Most pregnancies resulting in live births among partners of male childhood cancer survivors were not at significantly greater risk of complications versus comparison subjects. The possibility that a paternal component affected by prior cancer history influences predisposition towards some adverse perinatal outcomes merits further investigation.

    Pregnancy outcomes in female childhood and adolescent cancer survivors: a linked cancer-birth registry analysis

    Objective: To compare birth outcomes among childhood and adolescent female cancer survivors who subsequently bear children, relative to those of women without cancer history. Design: Retrospective cohort study. Setting: 4 U.S. regions. Participants: Cancer registries identified girls <20 years, diagnosed with cancer 1973-2000. Linked birth records identified first live births after diagnosis (n=1898). Comparison subjects were selected from birth records (n=14278). Cervical/genital tract cancer cases were analyzed separately. Main Exposure: Cancer diagnosis <20 years. Outcome Measures: Infant low birth weight, preterm delivery, sex ratio, malformations, mortality, delivery method; maternal diabetes, anemia, preeclampsia. Results: Childhood cancer survivors’ infants were more likely to be preterm (relative risk [RR] 1.54, 95% CI 1.30-1.83) and weigh <2500 g (RR 1.31, 95% CI 1.10-1.57). For cervical/genital cancer patients’ offspring, estimates were 1.33 (95% CI 1.13-1.56) and 1.29 (95% CI 1.10-1.53), respectively. There were no increased risks of malformations, infant death, or altered sex ratio, suggesting no increased germ cell mutagenicity. In exploratory analysis, bone cancer survivors had an increased risk of diabetes (RR 4.92, 95% CI 1.60-15.13), and anemia was more common among brain tumor survivors (RR 3.05, 95% CI 1.16-7.98) and childhood cancer survivors with initial treatment of chemotherapy only (RR 2.45, 95% CI 1.16-5.17). Conclusions: Infants of female childhood and adolescent cancer patients were not at increased risk of malformations or death. The increased occurrence of preterm delivery and low birth weight suggests that close monitoring is warranted. The increased occurrence of diabetes and anemia among subgroups has not been reported previously, suggesting areas for further study.

    Application benchmark results for Big Red, an IBM e1350 BladeCenter Cluster

    The purpose of this report is to present the results of benchmark tests with Big Red, an IBM e1350 BladeCenter Cluster. The report focuses on describing the system architecture and test run results in sufficient detail to allow for analysis in other reports and comparison with other systems, rather than presenting such analysis here.

    Costs of early detection systems for epidemic malaria in highland areas of Kenya and Uganda

    BACKGROUND: Malaria epidemics cause substantial morbidity and mortality in highland areas of Africa. The costs of detecting and controlling these epidemics have not been explored adequately in the past. This study presents the costs of establishing and running an early detection system (EDS) for epidemic malaria in four districts in the highlands of Kenya and Uganda. METHODS: An economic costing was carried out from the health service provider's perspective in both countries. Staff time for data entry and processing, as well as supervising and coordinating EDS activities at district and national levels, was recorded and associated opportunity costs estimated. A threshold analysis was carried out to determine the number of DALYs or deaths that would need to be averted in order for the EDS to be considered cost-effective. RESULTS: The total costs of the EDS per district per year ranged between US$14,439 and US$15,512. Salaries were identified as major cost-drivers, although their relative contribution to overall costs varied by country. Costs of relaying surveillance data between facilities and district offices (typically by hand) were also substantial. Data from Uganda indicated that 4% or more of overall costs could potentially be saved by switching to data transfer via mobile phones. Based on commonly used thresholds, 96 DALYs in Uganda and 103 DALYs in Kenya would need to be averted annually in each district for the EDS to be considered cost-effective. CONCLUSION: Results from this analysis suggest that EDS are likely to be cost-effective. Further studies that include the costs and effects of the health systems' reaction prompted by EDS will need to be undertaken in order to obtain comprehensive cost-effectiveness estimates.
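The threshold arithmetic above can be sketched directly. Note that the US$150-per-DALY benchmark below is an illustrative assumption, chosen because it is roughly consistent with the reported figures; the abstract does not state the benchmark used.

```python
# Sketch of the EDS threshold analysis: how many DALYs must be averted
# per district per year for the system to be considered cost-effective.
# ASSUMPTION: a willingness-to-pay benchmark of US$150 per DALY averted,
# chosen only for illustration (it approximately reproduces the reported
# thresholds of 96 and 103 DALYs); the true benchmark is not given here.

def dalys_to_avert(annual_cost_usd: float, usd_per_daly: float = 150.0) -> int:
    """Minimum DALYs that must be averted annually to justify the cost."""
    return round(annual_cost_usd / usd_per_daly)

print(dalys_to_avert(14439))  # lower-bound annual cost -> 96
print(dalys_to_avert(15512))  # upper-bound annual cost -> 103
```

The required DALYs scale linearly with cost and inversely with the benchmark, which is why salaries and data-relay costs dominate the result.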

    JWST MIRI flight performance: The Medium-Resolution Spectrometer

    The Medium-Resolution Spectrometer (MRS) provides one of the four operating modes of the Mid-Infrared Instrument (MIRI) on board the James Webb Space Telescope (JWST). The MRS is an integral field spectrometer, measuring the spatial and spectral distributions of light across the 5–28 μm wavelength range with a spectral resolving power between 3700 and 1300. We present the MRS's optical, spectral, and spectro-photometric performance, as achieved in flight, and we report on the effects that limit the instrument's ultimate sensitivity. The MRS flight performance has been quantified using observations of stars, planetary nebulae, and planets in our Solar System. The precision and accuracy of this calibration were checked against celestial calibrators with well-known flux levels and spectral features. We find that the MRS geometric calibration has a distortion solution accuracy relative to the commanded position of 8 mas at 5 μm and 23 mas at 28 μm. The wavelength calibration is accurate to within 9 km/s at 5 μm and 27 km/s at 28 μm. The uncertainty in the absolute spectro-photometric calibration accuracy was estimated at 5.6 ± 0.7%. The MIRI calibration pipeline is able to suppress the amplitude of spectral fringes to below 1.5% for both extended and point sources across the entire wavelength range. The MRS point spread function (PSF) is 60% broader than the diffraction limit along its long axis at 5 μm and is 15% broader at 28 μm. The MRS flight performance is found to be better than prelaunch expectations. The MRS is one of the most subscribed observing modes of JWST and is yielding many high-profile publications. It is currently humanity's most powerful instrument for measuring the mid-infrared spectra of celestial sources and is expected to continue as such for many years to come.
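As a rough consistency check (not a calculation from the paper itself), the quoted wavelength-calibration accuracies can be compared against the velocity width of one spectral resolution element, Δv ≈ c/R, using the resolving powers stated in the abstract:

```python
# Velocity width of one MRS spectral resolution element, delta_v ~ c / R,
# using the resolving powers quoted in the abstract (R ~ 3700 near 5 um,
# R ~ 1300 near 28 um). This is a back-of-the-envelope check, not a
# calculation taken from the paper.

C_KM_S = 299_792.458  # speed of light in km/s

def resolution_element_km_s(resolving_power: float) -> float:
    """Velocity width of one resolution element, c / R, in km/s."""
    return C_KM_S / resolving_power

print(round(resolution_element_km_s(3700)))  # -> 81 km/s near 5 um
print(round(resolution_element_km_s(1300)))  # -> 231 km/s near 28 um
```

Against these widths, the quoted calibration accuracies (9 and 27 km/s) correspond to roughly a tenth of a resolution element at both ends of the band.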

    Determinants of the accuracy of rapid diagnostic tests in malaria case management: evidence from low and moderate transmission settings in the East African highlands

    BACKGROUND: The accuracy of malaria diagnosis has received renewed interest in recent years due to changes in treatment policies in favour of relatively high-cost artemisinin-based combination therapies. The use of rapid diagnostic tests (RDTs) based on histidine-rich protein 2 (HRP2) synthesized by Plasmodium falciparum has been widely advocated to save costs and to minimize inappropriate treatment of non-malarial febrile illnesses. HRP2-based RDTs are highly sensitive and stable; however, their specificity is a cause for concern, particularly in areas of intense malaria transmission due to persistence of HRP2 antigens from previous infections. METHODS: In this study, 78,454 clinically diagnosed malaria patients were tested using HRP2-based RDTs over a period of approximately four years in four highland sites in Kenya and Uganda representing hypoendemic to mesoendemic settings. In addition, the utility of the tests was evaluated in comparison with expert microscopy for disease management in 2,241 subjects in two sites with different endemicity levels over four months. RESULTS: RDT positivity rates varied by season and year, indicating temporal changes in accuracy of clinical diagnosis. Compared to expert microscopy, the sensitivity, specificity, positive predictive value and negative predictive value of the RDTs in a hypoendemic site were 90.0%, 99.9%, 90.0% and 99.9%, respectively. Corresponding measures at a mesoendemic site were 91.0%, 65.0%, 71.6% and 88.1%. Although sensitivities at the two sites were broadly comparable, levels of specificity varied considerably between the sites as well as according to month of test, age of patient, and presence or absence of fever during consultation. Specificity was relatively high in older age groups and increased towards the end of the transmission season, indicating the role played by anti-HRP2 antibodies. 
Patients with high parasite densities were more likely to test positive with RDTs than those with low-density infections. CONCLUSION: RDTs may be effective when used in low-endemicity settings, but high false positive error rates may occur in areas with moderately high transmission. Reports on specificity of RDTs and cost-effectiveness analyses on their use should be interpreted with caution as there may be wide variations in these measurements depending upon endemicity, season, and the age group of patients studied.
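The dependence of predictive values on endemicity described above follows directly from Bayes' rule. A minimal sketch, where the prevalence values are illustrative assumptions chosen to be consistent with the reported figures (the abstract does not report prevalence directly):

```python
# Positive predictive value (PPV) from sensitivity, specificity, and
# prevalence via Bayes' rule. Sensitivity and specificity are the values
# reported for the two sites; the prevalence values are ASSUMPTIONS
# chosen for illustration, not figures from the study.

def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Probability that a positive RDT result reflects a true infection."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Hypoendemic site: very high specificity keeps PPV high at low prevalence.
print(round(ppv(0.90, 0.999, 0.01), 2))  # ~0.90
# Mesoendemic site: lower specificity, but prevalence among tested patients
# is far higher, so PPV remains moderate (~0.71, near the reported 71.6%).
print(round(ppv(0.91, 0.65, 0.49), 2))
```

The same arithmetic explains why false positives dominate where specificity drops: each percentage point lost in specificity contributes false positives in proportion to the uninfected fraction of patients tested.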