Effect of selective gastric residual monitoring on enteral intake in preterm infants
Objective: Monitoring of prefeed gastric residuals (GRs) has been associated with an increased time to reach full feeds and longer parenteral nutrition, without any beneficial effect on the occurrence of necrotizing enterocolitis (NEC). We aimed to assess the effects of a new local protocol providing for selective evaluation of GRs and excluding their routine monitoring. Methods: We carried out a retrospective study based on a “before and after” design in a cohort of infants born at 23+0–31+6 weeks of gestation. The primary outcome was the age at full enteral feeding (150 mL/kg/d). Secondary outcomes included age at regaining of birth weight and Z-scores of weight, length, and head circumference at discharge. Results: We studied 49 infants in the selective GR group and 59 in the routine GR group. Age at full (150 mL/kg) enteral feeding (17.8 ± 10.1 vs. 22.9 ± 10.5 days, P = 0.017) and at regaining of birth weight (11.1 ± 3.0 vs. 12.5 ± 3.5 days, P = 0.039) were lower, while the Z-score of weight at discharge (-1.10 ± 0.83 vs. -1.60 ± 1.45, P = 0.040) was higher, in infants in the selective GR group compared with infants in the routine GR group. Conclusions: Selective monitoring of GRs decreased the age at full enteral feeding and at regaining of birth weight and yielded better Z-scores of weight at discharge compared with routine GR monitoring in a cohort of extremely preterm infants, without increasing the incidence of NEC. Omitting prefeed GR monitoring in clinical practice seems reasonable
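A minimal sketch (not the authors' analysis code) of how the group comparison for the primary outcome could be reproduced from the summary statistics reported above, using Welch's t-test in SciPy; the abstract does not state which test was actually used, so the resulting p-value may differ slightly from the reported one.

```python
# Welch's t-test from reported summary statistics (illustrative only).
from scipy.stats import ttest_ind_from_stats

# Age at full enteral feeding (days): selective GR group vs. routine GR group
t_stat, p_value = ttest_ind_from_stats(
    mean1=17.8, std1=10.1, nobs1=49,   # selective GR monitoring (n = 49)
    mean2=22.9, std2=10.5, nobs2=59,   # routine GR monitoring (n = 59)
    equal_var=False,                   # Welch correction for unequal variances
)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```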
Immune effects of four Fusarium toxins (FB1, ZEA, NIV, DON) on the proliferation of Jurkat cells and porcine lymphocytes: an in vitro study.
Fusarium toxins are secondary metabolites produced by fungi belonging to Fusarium spp., commonly found as contaminants in products of vegetable origin, particularly in cereal grains, in regions with a temperate climate in Europe, America and Asia. Numerous toxic effects are attributed to mycotoxins in both humans and animals, such as mutagenic, carcinogenic and teratogenic properties. Moreover, some of them can alter normal immune responses when they are present in food at levels lower than those necessary to cause the symptoms of mycotoxicosis. In the present work, we evaluated the immunomodulatory effects of four Fusarium toxins (FB1, ZEA, NIV, DON) using two different experimental models: Jurkat cells and porcine lymphocytes. In addition to the activity of single mycotoxins, we evaluated possible interactions between Fusarium toxins to reproduce in vitro experimental conditions as close as possible to field conditions. Our results revealed the immunomodulatory properties of the mycotoxins investigated in the current study. They also underline the interest in studying possible interactions among different mycotoxins, particularly among those most commonly present in food, such as Fusarium mycotoxins, not only with regard to their toxicodynamic aspects but also to define tolerable maximum levels of Fusarium toxins in food
Platelet Count and Volume and Pharmacological Closure with Paracetamol of Ductus Arteriosus in Preterm Infants
Background: A low platelet count might promote resistance to pharmacological closure of a hemodynamically significant patent ductus arteriosus (hsPDA) with indomethacin and ibuprofen. However, no studies have investigated whether this occurs with paracetamol. Methods: We retrospectively assessed the correlation between platelet count, mean platelet volume (MPV), and plateletcrit (PCT), and the effectiveness of paracetamol in closing hsPDA in infants born at 23+0–31+6 weeks of gestation who were treated with 15 mg/kg/6 h of i.v. paracetamol for 3 days. Results: We studied 79 infants: 37 (47%) had closure after a course of paracetamol and 42 (53%) did not. Platelet count and PCT did not correlate with paracetamol success or failure in closing hsPDA, while MPV was lower at birth (10.7 ± 1.4 vs. 9.5 ± 1.1; p < 0.001) and prior to starting therapy (11.7 ± 1.9 vs. 11.0 ± 1.6; p = 0.079) in refractory infants. Regression analysis confirmed that a low MPV measured prior to starting treatment increased the risk of failure of hsPDA closure with paracetamol (OR 1.664, 95% CI 1.153–2.401). Conclusions: A greater MPV correlated positively with the effectiveness of paracetamol in closing hsPDA, while platelet count and PCT did not influence closure rates. Additional studies are needed to confirm our results
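An illustrative sketch (not the study's code) of how an odds ratio of the kind reported for pre-treatment MPV could be obtained from a logistic regression model with statsmodels; the dataframe, column names ("mpv_pre", "closure_failure") and toy values are hypothetical, and the abstract does not specify how MPV was coded (per unit change or as a low-MPV category).

```python
# Logistic regression odds ratio for a continuous predictor (illustrative only).
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical toy data: pre-treatment MPV (fL) and closure failure (1 = failure)
df = pd.DataFrame({
    "mpv_pre":         [11.7, 11.0, 10.5, 12.1, 10.8, 11.9, 10.2, 12.4],
    "closure_failure": [0,    1,    1,    0,    0,    1,    1,    0],
})

X = sm.add_constant(df[["mpv_pre"]])                 # intercept + predictor
model = sm.Logit(df["closure_failure"], X).fit(disp=False)

odds_ratios = np.exp(model.params)                   # exp(beta) -> OR per unit MPV
or_ci = np.exp(model.conf_int())                     # 95% CI on the OR scale
print(odds_ratios, or_ci, sep="\n")
```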
Enteral and Parenteral Treatment with Caffeine for Preterm Infants in the Delivery Room: A Randomised Trial
Background: Early treatment with caffeine in the delivery room (DR) has been proposed to decrease the need for mechanical ventilation (MV) by limiting episodes of apnoea and improving respiratory mechanics in preterm infants. Our aim was to verify the hypothesis that intravenous or enteral administration of caffeine can be performed in preterm infants in the DR. Methods: Infants of 25+0–29+6 weeks of gestational age were enrolled and randomised to receive 20 mg/kg of caffeine citrate intravenously, via the umbilical vein, or enterally, through an orogastric tube, within 10 min of birth. Caffeine blood level was measured at 60 ± 15 min after administration and 60 ± 15 min before the next dose (5 mg/kg). The primary endpoint was the success rate of intravenous and enteral administration of caffeine in the DR. Results: Nineteen patients were treated with intravenous caffeine and 19 with enteral caffeine. In all patients the procedure was successfully performed. The peak blood level of caffeine 60 ± 15 min after administration in the DR was below the therapeutic range (5 µg/mL) in 25% of samples and above the therapeutic range in 3%. The blood level of caffeine 60 ± 15 min before administration of the second dose was below the therapeutic range in 18% of samples. Conclusions: Intravenous and enteral administration of caffeine can be performed in the DR without interfering with infants' postnatal assistance. Some patients did not reach the therapeutic range, raising the question of which dose is the most effective to prevent MV. Clinical Trial Registration: ClinicalTrials.gov identifier NCT04044976; EudraCT number 2018-003626-91
Seroprevalence of Ehrlichia spp., Anaplasma spp., Borrelia burgdorferi sensu lato, and Dirofilaria immitis in stray dogs, from 2016 to 2019, in southern Italy
Canine vector-borne diseases (CVBD) are an important and emerging health concern for humans and animals worldwide. The purpose of the present study was to assess, from 2016 to 2019, the seroprevalence of CVBD agents and to clarify the epidemiology of tick-borne disease in stray dogs living in the Campania Region, Southern Italy. For this purpose, blood samples were collected from January 2016 to December 2019 from 1023 dogs in authorized kennels located in the five provinces of the Campania Region. The SNAP® 4DX® test from IDEXX® Laboratories was used for detection of antibodies against Ehrlichia spp., Anaplasma spp., Borrelia burgdorferi sensu lato (s.l.), and Dirofilaria immitis. The overall seroprevalence of CVBD in stray dogs was 19.6% (95% Confidence Interval (CI): 17.2–22.8%; 201/1023). The most common pathogen was Ehrlichia spp., with a positivity rate of 16.03%, followed by Anaplasma spp. with 7.8%. B. burgdorferi s.l. and D. immitis were detected in only 0.2% of dogs; co-infection was detected in 4.5% of the stray dogs tested. No association was detected between gender, age, or location and CVBD seropositivity, except for Ehrlichia spp., for which location (Avellino Province; p = 0.007) and gender (male; p = 0.002) were risk factors for seropositivity. Our results demonstrate that these animals are exposed to at least one of the four etiological agents (Ehrlichia spp., Anaplasma spp., Borrelia burgdorferi s.l., and Dirofilaria immitis) transmitted by vectors. Finally, this study highlights the utility of serological monitoring in stray dogs housed in kennels, given the threat posed by CVBD to animals and the zoonotic implications of these etiological agents and their vectors for human health
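A minimal sketch, not the authors' code, of how the overall seroprevalence and its 95% confidence interval can be recomputed from the reported counts (201 positives out of 1023 dogs); the abstract does not state which CI method was used, so the bounds may differ slightly from the reported 17.2–22.8%.

```python
# Seroprevalence with a 95% confidence interval (illustrative only).
from statsmodels.stats.proportion import proportion_confint

positives, tested = 201, 1023
prevalence = positives / tested
low, high = proportion_confint(positives, tested, alpha=0.05, method="wilson")
print(f"seroprevalence = {prevalence:.1%} (95% CI {low:.1%}-{high:.1%})")
```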
Vascular calcification progression modulates the risk associated with vascular calcification burden in incident to dialysis patients
Background: It is estimated that chronic kidney disease (CKD) accounts globally for 5 to 10 million deaths annually, mainly due to cardiovascular (CV) disease. Traditional as well as non-traditional CV risk factors, such as vascular calcification, are believed to drive this disproportionate risk burden. We aimed to investigate the association of coronary artery calcification (CAC) progression with all-cause mortality in patients new to hemodialysis (HD). Methods: Post hoc analysis of the Independent study (NCT00710788). At study inception and after 12 months of follow-up, 414 patients underwent computed tomography imaging for quantification of CAC via the Agatston method. The square root method was used to assess CAC progression (CACP), and survival analyses were used to test its association with mortality. Results: Over a median follow-up of 36 months, 106 patients died from all causes. Patients who died were older, more likely to be diabetic or to have experienced an atherosclerotic CV event, and exhibited a significantly greater CAC burden (p = 0.002). Survival analyses confirmed an independent association of CAC burden (hazard ratio: 1.29; 95% confidence interval: 1.17–1.44) and CACP (HR: 5.16; 2.61–10.21) with all-cause mortality. CACP mitigated the risk associated with CAC burden (p = 0.002), and adjustment for calcium-free phosphate binders attenuated the strength of the link between CACP and mortality. Conclusions: CAC burden and CACP predict mortality in patients incident to dialysis. However, CACP reduced the risk associated with baseline CAC, and calcium-free phosphate binders attenuated the association of CACP with outcomes, suggesting that modulation of CACP may improve survival in this population. Future studies are needed to confirm whether drugs or kidney transplantation can attenuate CACP and improve survival in HD patients
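An illustrative sketch of the "square root method" for CAC progression as it is commonly described in the literature (the change in the square-root-transformed Agatston score between follow-up and baseline); the exact operationalization and any progression cut-off used in this particular study are assumptions here, and the input values are toy numbers.

```python
# Square-root-transformed change in Agatston CAC score (illustrative only).
import math

def cac_progression_sqrt(cac_baseline: float, cac_followup: float) -> float:
    """Change in the square root of the Agatston score over follow-up."""
    return math.sqrt(cac_followup) - math.sqrt(cac_baseline)

# Toy example: baseline Agatston score 120, follow-up score 220 after 12 months
delta_sqrt = cac_progression_sqrt(120, 220)
progressor = delta_sqrt >= 2.5   # 2.5 is a commonly cited cut-off (assumption here)
print(f"sqrt-change = {delta_sqrt:.2f}, progressor = {progressor}")
```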
Potential Approaches Versus Approved or Developing Chronic Myeloid Leukemia Therapy.
Tyrosine kinase inhibitors (TKIs) have revolutionized the treatment of patients with chronic myeloid leukemia (CML). However, continued use of these inhibitors has contributed to increasing clinical resistance and to the persistence of resistant leukemic stem cells (LSCs). There is therefore an urgent need to introduce additional targeted and selective therapies to eradicate quiescent LSCs and to avoid relapse and disease progression. Here, we focus on emerging BCR-ABL-targeted and non-BCR-ABL-targeted drugs employed in clinical trials and on alternative CML treatments, including antioxidants, oncolytic viruses, engineered exosomes, and natural products obtained from marine organisms, which could pave the way for new therapeutic approaches for CML patients
Aujeszky's disease in south-Italian wild boars (Sus scrofa): A serological survey
Aujeszky's disease (AD, pseudorabies) is a viral disease of suids caused by Suid herpesvirus 1 (SHV-1), also referred to as Aujeszky's disease virus (ADV) or pseudorabies virus (PRV). The domestic pig and the wild boar (Sus scrofa) are the natural hosts, but many species can be infected with ADV. The aim of our study was to evaluate the seroprevalence of AD in wild boar hunted in the Campania Region during the 2016–2017 hunting season. A total of 503 serum samples from wild boars hunted in the provinces of the Campania Region (Southern Italy) were collected and tested for antibodies against ADV using an AD blocking ELISA assay. A seroprevalence of 23.85% (120/503, 95% Confidence Interval (CI): 20.15–27.55) was found. Gender was not significantly associated with ADV seropositivity (p > 0.05), while the presence of ADV antibodies was statistically associated with age (>36 months, p < 0.0001) and location (Avellino, p = 0.0161). Our prevalence values are similar to those obtained in 2010 in our laboratory (30.7%), demonstrating a constant circulation of ADV in the area
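An illustrative sketch only, showing how an association between a categorical risk factor (for example, age class) and ADV seropositivity can be tested with a chi-square test on a 2x2 contingency table; the counts below are hypothetical toy numbers, not the study's data, and the abstract does not state which association test was used.

```python
# Chi-square test of independence on a 2x2 table (illustrative only).
from scipy.stats import chi2_contingency

#            seropositive  seronegative   (hypothetical counts)
table = [[60, 140],    # age > 36 months
         [40, 260]]    # age <= 36 months

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```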
- …