
    Preliminary notes on invasion and proliferation of foodborne Listeria monocytogenes strains

    In this study, the virulence properties of L. monocytogenes strains isolated from food and food environments were evaluated. In particular, adhesion and invasion efficiencies were tested in a cell culture model (HeLa). Half of the isolates (9/18) exhibited a high invasion index; the strain isolated from smoked salmon had the highest. The remaining isolates showed an intermediate invasion index, and all environmental isolates belonged to this group. No isolate showed a low invasion index. Regarding intracellular growth, all tested isolates had a replication time between 2 and 6 hours and can therefore be considered virulent. Despite its capability to invade HeLa cells with a medium/high invasion index, a non-haemolytic rabbit isolate did not show any intracellular growth. In conclusion, differences in invasion efficiency and intracellular growth did not seem strictly related to the origin of the strains. Moreover, invasiveness is not the only requirement for establishing an infection: virulence of L. monocytogenes also depends on the ability to grow intracellularly and to spread from cell to cell. For these reasons, PCR detection of known virulence genes has the potential to provide additional insight into pathogenic potential. A comprehensive comparative virulence characterization of L. monocytogenes strains, combining tissue culture models with PCR detection of virulence genes, will be necessary to investigate differences in human-pathogenic potential among the subtypes of this bacterium.
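The invasion index and intracellular replication time reported above follow from simple exponential-growth arithmetic. A minimal sketch, with illustrative CFU counts (not the study's data), shows how both quantities are typically derived from a gentamicin-protection assay:

```python
import math

def doubling_time(cfu_start, cfu_end, hours_elapsed):
    """Intracellular doubling (replication) time in hours, assuming
    exponential growth between the two CFU counts."""
    return hours_elapsed * math.log(2) / math.log(cfu_end / cfu_start)

def invasion_index(cfu_recovered, inoculum_cfu):
    """Percentage of the inoculum recovered after gentamicin treatment,
    i.e. the fraction of bacteria that invaded the cells."""
    return 100.0 * cfu_recovered / inoculum_cfu

# Illustrative numbers (not from the study):
# 8-fold growth over 9 h corresponds to a 3 h doubling time.
print(round(doubling_time(1.0e4, 8.0e4, 9.0), 2))  # 3.0
print(invasion_index(2.0e3, 1.0e5))                # 2.0
```

A strain would fall in the "virulent" range described above if its computed doubling time lands between 2 and 6 hours.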

    The Experience in Nicaragua: Childhood Leukemia in Low Income Countries—The Main Cause of Late Diagnosis May Be “Medical Delay”

    Background. The event-free survival for pediatric leukemia in low-income countries is much lower than in high-income countries. Late diagnosis, which is regarded as a contributing factor, may be due to “parental” or “medical” delay. Procedures. The present study analyses determinants of lag time from first symptoms to diagnosis of leukemia, comparing pediatric (0–16 years old) patients in two referral centers, one in Nicaragua and one in Italy. An observational retrospective study was conducted to assess factors influencing the time to diagnosis. Results. 81 charts of children diagnosed with acute myeloid leukemia or lymphoblastic leukemia were analyzed from each centre. Median lag time to diagnosis was higher in Nicaragua than in Italy (29 versus 14 days, P < 0.001) and it was mainly due to “physician delay” (16.5 versus 7 days, P < 0.001), whereas “patient delay” from symptoms to first medical assessment was similar in the two centers (7 versus 5 days, P = 0.27). Moreover, median lag time from symptoms to diagnosis was decreased in Nicaraguan districts where a specific training program on childhood oncological diseases was carried out (20.5 versus 40 days, P = 0.0019). Conclusions. Our study shows that delay in diagnosis of childhood leukemia is mainly associated with “physician delay” and it may be overcome by programs of continuous medical education.
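The lag-time decomposition used above (total lag = "patient delay" from symptoms to first medical contact, plus "physician delay" from first contact to diagnosis) can be sketched as follows; the day counts are illustrative, not the study's raw data:

```python
from statistics import median

def delay_breakdown(symptom_to_first_visit, first_visit_to_diagnosis):
    """Split each child's diagnostic lag into 'patient delay'
    (symptoms -> first medical contact) and 'physician delay'
    (first contact -> confirmed diagnosis), then take medians."""
    total = [a + b for a, b in zip(symptom_to_first_visit,
                                   first_visit_to_diagnosis)]
    return {
        "median_patient_delay": median(symptom_to_first_visit),
        "median_physician_delay": median(first_visit_to_diagnosis),
        "median_total_lag": median(total),
    }

# Illustrative days for three hypothetical patients (not study data):
print(delay_breakdown([5, 7, 9], [14, 16, 20]))
```

Comparing the per-center medians produced this way is what the reported Mann-Whitney-style P-values test.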

    Oral Nutritional Supplementation in Children Treated for Cancer in Low- and Middle-Income Countries Is Feasible and Effective: the Experience of the Children's Hospital Manuel De Jesus Rivera "La Mascota" in Nicaragua

    Children with cancer are particularly vulnerable to malnutrition, which can affect their tolerance of chemotherapy and outcome. In Nicaragua approximately two-thirds of children diagnosed with cancer present with under-nutrition. A nutritional program for children with cancer has been developed at "La Mascota" Hospital. The results of this oral nutritional intervention, including difficulties, benefits, and relevance for children treated for cancer in low- and middle-income countries, are reported and discussed here.

    Dexamethasone vs prednisone in induction treatment of pediatric ALL: results of the randomized trial AIEOP-BFM ALL 2000

    Induction therapy for childhood acute lymphoblastic leukemia (ALL) traditionally includes prednisone; yet, dexamethasone may have higher antileukemic potency, leading to fewer relapses and improved survival. After a 7-day prednisone prephase, 3720 patients enrolled on trial Associazione Italiana di Ematologia e Oncologia Pediatrica and Berlin-Frankfurt-Münster (AIEOP-BFM) ALL 2000 were randomly assigned to receive either dexamethasone (10 mg/m(2) per day) or prednisone (60 mg/m(2) per day) for 3 weeks plus tapering in induction. The 5-year cumulative incidence of relapse (± standard error) was 10.8 ± 0.7% in the dexamethasone and 15.6 ± 0.8% in the prednisone group (P < .0001), showing the largest effect on extramedullary relapses. The benefit of dexamethasone was partially counterbalanced by a significantly higher induction-related death rate (2.5% vs 0.9%, P = .00013), resulting in 5-year event-free survival rates of 83.9 ± 0.9% for dexamethasone and 80.8 ± 0.9% for prednisone (P = .024). No difference was seen in 5-year overall survival (OS) in the total cohort (dexamethasone, 90.3 ± 0.7%; prednisone, 90.5 ± 0.7%). Retrospective analyses of predefined subgroups revealed a significant survival benefit from dexamethasone only for patients with T-cell ALL and good response to the prednisone prephase (prednisone good-response [PGR]) (dexamethasone, 91.4 ± 2.4%; prednisone, 82.6 ± 3.2%; P = .036). In patients with precursor B-cell ALL and PGR, survival after relapse was found to be significantly worse if patients were previously assigned to the dexamethasone arm. We conclude that, for patients with PGR in the large subgroup of precursor B-cell ALL, dexamethasone especially reduced the incidence of better salvageable relapses, resulting in inferior survival after relapse. This explains the lack of benefit from dexamethasone in overall survival that we observed in the total cohort except in the subset of T-cell ALL patients with PGR.
This trial was registered at www.clinicaltrials.gov (BFM: NCT00430118, AIEOP: NCT00613457).
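The 5-year event-free survival and cumulative incidence figures above come from time-to-event estimators that account for censored follow-up. A minimal Kaplan-Meier sketch on illustrative data (not trial data) shows how such survival probabilities are computed when some patients are censored before an event:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of S(t) at each distinct event time.
    `events` is 1 for an observed event (e.g. relapse), 0 for
    censoring at that time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t, d, n_t = data[i][0], 0, 0
        while i < len(data) and data[i][0] == t:  # group ties at time t
            d += data[i][1]
            n_t += 1
            i += 1
        if d:                        # survival drops only at event times
            s *= 1.0 - d / n_at_risk
            curve.append((t, s))
        n_at_risk -= n_t             # everyone at t leaves the risk set
    return curve

# Illustrative follow-up times in years (not trial data);
# patients at t = 2 and t = 4 are censored:
print(kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 0, 1]))
```

Event-free survival at a horizon such as 5 years is read off the last step of this curve at or before that time; cumulative incidence in a single-event setting is its complement.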

    Bone Marrow Stromal Cell Regeneration Profile in Treated B-Cell Precursor Acute Lymphoblastic Leukemia Patients: Association with MRD Status and Patient Outcome

    SIMPLE SUMMARY: For the last 20 years, measurable residual disease (MRD) has proven to be a strong prognostic factor in B-cell precursor acute lymphoblastic leukemia (BCP-ALL). However, the effects of therapy on the bone marrow (BM) microenvironment and their potential relationship with MRD and patient outcome still remain to be evaluated. Here, we show that mesenchymal stem cells (MSC) and endothelial cells (EC) are constantly present at relatively low frequencies in normal BM and in most follow-up BM samples from treated BCP-ALL patients. Of note, their levels are independent of the MRD status. From the prognostic point of view, an increased percentage of EC among stromal cells (EC plus MSC) at day +78 of therapy was associated with shorter disease free survival (DFS), independently of the MRD status both in childhood and in adult BCP-ALL. Thus, an abnormally high EC/MSC distribution at day +78 of therapy emerges as an adverse prognostic factor, independent of MRD in BCP-ALL. ABSTRACT: For the last two decades, measurable residual disease (MRD) has become one of the most powerful independent prognostic factors in B-cell precursor acute lymphoblastic leukemia (BCP-ALL). However, the effect of therapy on the bone marrow (BM) microenvironment and its potential relationship with the MRD status and disease free survival (DFS) still remain to be investigated. Here we analyzed the distribution of mesenchymal stem cells (MSC) and endothelial cells (EC) in the BM of treated BCP-ALL patients, and its relationship with the BM MRD status and patient outcome. For this purpose, the BM MRD status and EC/MSC regeneration profile were analyzed by multiparameter flow cytometry (MFC) in 16 control BM (10 children; 6 adults) and 1204 BM samples from 347 children and 100 adult BCP-ALL patients studied at diagnosis (129 children; 100 adults) and follow-up (824 childhood samples; 151 adult samples). 
Patients were grouped into a discovery cohort (116 pediatric BCP-ALL patients; 338 samples) and two validation cohorts (74 pediatric BCP-ALL patients, 211 samples; and 74 adult BCP-ALL patients, 134 samples). Stromal cells (i.e., EC and MSC) were detected at relatively low frequencies in all control BM (16/16; 100%) and in most BCP-ALL follow-up samples (874/975; 90%), while they were undetected in BCP-ALL BM at diagnosis. In control BM samples, the overall percentage of EC plus MSC was higher in children than adults (p = 0.011), but with a similar EC/MSC ratio in both groups. According to the MRD status, similar frequencies of both types of BM stromal cells were detected in BCP-ALL BM studied at different time points during the follow-up. Univariate analysis (including all relevant prognostic factors together with the percentage of stromal cells) performed in the discovery cohort was used to select covariates for a multivariate Cox regression model for predicting patient DFS. Of note, an increased percentage of EC (>32%) within the BCP-ALL BM stromal cell compartment at day +78 of therapy emerged as an independent unfavorable prognostic factor for DFS in childhood BCP-ALL in the discovery cohort—hazard ratio (95% confidence interval) of 2.50 (1–9.66); p = 0.05—together with the BM MRD status (p = 0.031). Further investigation of the predictive value of the combination of these two variables (%EC within stromal cells and MRD status at day +78) allowed classification of BCP-ALL into three risk groups with median DFS of 3.9, 3.1, and 1.1 years, respectively (p = 0.001). These results were confirmed in two validation cohorts of childhood BCP-ALL (n = 74) (p = 0.001) and adult BCP-ALL (n = 40) (p = 0.004) treated at different centers. In summary, our findings suggest that an imbalanced EC/MSC ratio in BM at day +78 of therapy is associated with a shorter DFS of BCP-ALL patients, independently of their MRD status.
Further prospective studies are needed to better understand the pathogenic mechanisms involved.
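The abstract does not spell out exactly how the three risk groups were built from the two day +78 variables; a plausible sketch, assuming each adverse factor (EC > 32% of stromal cells, MRD positivity) simply contributes one point, would be:

```python
def risk_group(ec_pct_of_stroma, mrd_positive, ec_cutoff=32.0):
    """Combine the two day +78 variables into three risk groups
    (an assumed additive scheme, not the paper's stated algorithm):
    0 = both favourable, 1 = one adverse factor, 2 = both adverse."""
    return int(ec_pct_of_stroma > ec_cutoff) + int(mrd_positive)

print(risk_group(20.0, False))  # 0: low EC fraction, MRD-negative
print(risk_group(45.0, False))  # 1: high EC fraction only
print(risk_group(45.0, True))   # 2: high EC fraction and MRD-positive
```

Under this scheme, the three groups would correspond to the reported median DFS of 3.9, 3.1 and 1.1 years, respectively.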

    CRLF2 over-expression is a poor prognostic marker in children with high risk T-cell acute lymphoblastic leukemia

    Pediatric T-ALL patients have a worse outcome compared to BCP-ALL patients and could benefit from the identification of new prognostic markers. Alteration of the CRLF2 gene, a hallmark correlated with poor outcome in BCP-ALL, has not been reported in T-ALL. We analyzed CRLF2 expression in 212 T-ALL pediatric patients enrolled in the AIEOP-BFM ALL2000 study in Italian and German centers. Seventeen out of 120 (14.2%) Italian patients presented CRLF2 mRNA expression 5 times higher than the median (CRLF2-high); they had a significantly inferior event-free survival (41.2%±11.9 vs. 68.9%±4.6, p=0.006) and overall survival (47.1%±12.1 vs. 73.8%±4.3, p=0.009) and an increased cumulative incidence of relapse/resistance (52.9%±12.1 vs. 26.2%±4.3, p=0.007) compared to CRLF2-low patients. The prognostic value of CRLF2 over-expression was validated in the German cohort. Of note, CRLF2 over-expression was associated with poor prognosis in the high risk (HR) subgroup, where CRLF2-high patients were more frequently allocated. Interestingly, although in T-ALL the CRLF2 protein was localized mainly in the cytoplasm, in CRLF2-high blasts we found a trend towards a stronger TSLP-induced pSTAT5 response, sensitive to the JAK inhibitor ruxolitinib. In conclusion, CRLF2 over-expression is a poor prognostic marker identifying a subset of HR T-ALL patients that could benefit from alternative therapy, potentially targeting the CRLF2 pathway.
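The CRLF2-high definition used above (mRNA expression more than 5 times the cohort median) translates directly into a cutoff rule; the expression values below are illustrative, not the study's data:

```python
from statistics import median

def crlf2_high_flags(expression, factor=5.0):
    """Flag samples whose CRLF2 mRNA level exceeds `factor` times
    the cohort median (the CRLF2-high definition in the text)."""
    cutoff = factor * median(expression)
    return [x > cutoff for x in expression]

# Illustrative relative expression values (not study data);
# the median is 3.0, so the cutoff is 15.0 and only the last
# sample is flagged CRLF2-high:
print(crlf2_high_flags([1.0, 2.0, 3.0, 4.0, 100.0]))
```

Note that because the cutoff is a multiple of the cohort median, the flagged fraction depends on the cohort studied, which is why the threshold was re-evaluated in the German validation cohort.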