
    Effect of Priming on Seed Vigor of Wheat (Triticum aestivum L.)

    Priming is a process that controls the hydration of seeds, allowing pre-germination metabolic processes to proceed. Research on priming was conducted at the ICERI seed laboratory from May to September 2009 to evaluate the effect of different priming methods on wheat seed vigor. The physical properties and chemical composition of the seed were evaluated before treatment. Priming treatments were conducted by soaking 250 g of seed in 500 mL of solution for hydropriming and halopriming. Two seed lots, of the Nias and Dewata varieties, were subjected to heated or unheated distilled water for 12 hours, to KCl or CaCl2 at 10, 20, and 30 ppm, or left unprimed. The experiment was arranged in a completely randomized design with three replicates. Vigor was evaluated by observing seed germination, simultaneity of growth, germination rate, seedling dry weight, electrical conductivity of seed leakage, and length of the primary root. The results showed that the highest germination, simultaneity of growth, seedling dry weight, and primary root length were obtained with priming in KCl at 30 ppm and CaCl2 at 20 and 30 ppm. Priming with distilled water for 12 hours gave a higher germination percentage and simultaneity of growth
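The vigor metrics named above (germination percentage, germination rate) are simple to compute from daily germination counts. A minimal Python sketch, under the assumption that counts of newly germinated seeds are recorded per day; the function names and example counts are illustrative, not taken from the ICERI protocol:

```python
# Hedged sketch: common seed-vigor metrics used in priming studies.
# The daily-count input format and example numbers are illustrative
# assumptions, not data from the study described above.

def germination_percentage(germinated, total):
    """Final germination as a percentage of seeds sown."""
    return 100.0 * germinated / total

def germination_rate_index(daily_counts):
    """Speed of germination: sum of (new germinations / day number).

    daily_counts[i] is the number of NEW seeds germinated on day i+1.
    Higher values indicate faster, more vigorous germination.
    """
    return sum(n / day for day, n in enumerate(daily_counts, start=1))

# Example: 250 seeds sown, germination counted daily over 5 days
counts = [0, 40, 90, 60, 10]
gp = germination_percentage(sum(counts), 250)   # 80.0 (%)
gri = germination_rate_index(counts)            # 0 + 20 + 30 + 15 + 2 = 67.0
```

Higher rate-index values for primed lots would reflect the faster, more uniform germination the study attributes to KCl and CaCl2 priming.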

    HIV-1 Tropism Test Evaluation: Assessment and Clinical Implications

    The CCR5 and CXCR4 chemokine receptors are critical coreceptors for the binding of HIV to specific host cells. Guidelines recommend assessing viral tropism in case of virological failure or before prescribing CCR5 inhibitors. Strategies to assess viral tropism may be divided into phenotypic and genotypic assays; registration trials of CCR5 inhibitors used phenotypic assays, but genotypic assays have recently entered clinical practice. CXCR4 usage is increasingly found in naïve patients with both acute and chronic HIV-1 infection, and is associated with CD4 depletion. The assessment of viral tropism should be considered at every stage of HIV-1 infection
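Genotypic assays infer coreceptor usage from the sequence of the gp120 V3 loop. A minimal sketch of the classic "11/25 rule" heuristic (a basic residue at V3 position 11 or 25 suggests CXCR4 use); clinical tools use more sophisticated models, and the example sequence is illustrative, not a real isolate:

```python
# Hedged sketch of the classic "11/25 rule" for genotypic tropism
# prediction from the gp120 V3 loop: a basic residue (R or K) at
# position 11 or 25 suggests CXCR4 (X4) usage, otherwise CCR5 (R5).
# This is a simplified heuristic, not a clinical-grade predictor.

def predict_tropism(v3_sequence):
    """Return 'X4' or 'R5' from a ~35-residue V3 amino-acid sequence."""
    if len(v3_sequence) < 25:
        raise ValueError("V3 sequence too short for the 11/25 rule")
    pos11, pos25 = v3_sequence[10], v3_sequence[24]  # 1-based positions 11, 25
    return "X4" if pos11 in "RK" or pos25 in "RK" else "R5"

# Consensus-like example (illustrative sequence)
print(predict_tropism("CTRPNNNTRKSIHIGPGRAFYTTGEIIGDIRQAHC"))  # R5
```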

    Assessment of Effectiveness and Safety of Aspiration-Assisted Nephrostomic Access Sheaths in PCNL and Intrarenal Pressures Evaluation: A Systematic Review of the Literature

    Background: Different suction-assisted nephrostomic sheaths have been developed for percutaneous nephrolithotomy (PCNL). Objectives: (1) To examine PCNL techniques performed with different aspiration-assisted sheaths (Clear Petra® sheath, Superperc, SuperminiPCNL, and a patented miniPCNL sheath), with specific regard to effectiveness and safety outcomes in adult and paediatric patients; (2) to extrapolate intrarenal pressure (IRP) data during these procedures. Methods: A systematic literature search was performed in accordance with PRISMA guidelines. Relevant articles up to 8 February 2024 were included. Results: Twenty-five studies were selected, thirteen retrospective and twelve prospective. Four different aspirating sheaths for miniPCNL were reported: the Clear Petra® sheath, Superperc, SuperminiPCNL, and a patented miniPCNL sheath. Stone-free rates (SFRs) across techniques ranged from 71.3% to 100%, and complication rates from 1.5% to 38.9%. Infectious complication rates varied from 0% to 27.8% and bleeding complication rates from 0% to 8.9%. Most complications were low grade. Studies comparing aspiration- and non-aspiration-assisted miniPCNL trended towards equivalent or better SFRs and lower overall infectious and bleeding complication rates with suction techniques. Operation time was consistently lower in suction procedures, with a mean shortening of 19 min. Seven studies reported IRP values during suction miniPCNL. Two studies reported satisfactory SFRs and adequate safety profiles in paediatric cohorts. Conclusions: MiniPCNL with aspirating sheaths appears to be safe and effective in both adult and paediatric patients. A trend towards fewer overall infectious and bleeding complications compared with non-suction procedures is evident, with comparable or better SFRs and consistently shorter operative times. The IRP profile appears safe with the aid of aspirating sheaths. However, high-quality evidence on this topic is still lacking

    Changes in pharyngeal aerobic microflora in oral breathers after palatal rapid expansion

    BACKGROUND: The purpose of this study was to investigate, in oral-breathing children, the qualitative and quantitative effects on the aerobic and facultatively anaerobic oropharyngeal microflora of respiratory function improved by rapid palatal expansion (RPE). METHODS: In an open clinical trial, we studied 50 oral breathers, aged 8 to 14 years and suffering from both maxillary constriction and posterior cross-bite. At baseline, patients were examined by a single otorhinolaryngologist (ENT), who confirmed nasal obstruction in all subjects by posterior rhinomanometric test. Patients were evaluated three times by oropharyngeal swabs: 1) at baseline (T0); 2) after palatal spreading out (T1); and 3) at the end of RPE treatment (T2). With regard to the microbiological aspect, the most common and potentially pathogenic oral microorganisms (i.e. Streptococcus pyogenes, Diplococcus pneumoniae, Staphylococcus aureus, Haemophilus spp., Branhamella catarrhalis, Klebsiella pneumoniae, Candida albicans) were specifically detected in proper culture plates; isolated colonies were identified by means of biochemical tests and counted by calibrated loop. The data were analyzed by means of the following tests: chi-square test, Fisher's exact test and Wilcoxon's test. RESULTS: After the use of RPE there was a statistically significant decrease of the Staphylococcus aureus stock in CFU/mL at T1 (P = 0.0005; Z = -3.455 by Wilcoxon rank test) and T2 (P < 0.0001; Z = -4.512 by Wilcoxon rank test) vs T0. No significant changes were found for the other examined microorganisms. CONCLUSION: Our data suggest that RPE therapy in oral breathers may strongly reduce the pathogenic aerobic and facultatively anaerobic microflora in the oral pharynx after normalization of upper airway function, and may reduce the risk of respiratory infections
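The calibrated-loop counting mentioned in the methods converts colony counts on a plate into a concentration. A minimal sketch of that arithmetic, assuming standard loop volumes; the loop volume and example counts are illustrative, not values from this study:

```python
# Hedged sketch: converting colony counts obtained with a calibrated
# loop into CFU/mL, as used when quantifying oropharyngeal isolates.
# Loop volume and example counts are illustrative assumptions.

def cfu_per_ml(colonies, loop_volume_ml=0.001, dilution_factor=1):
    """CFU/mL = colonies / volume plated, corrected for any dilution.

    Standard calibrated loops deliver 1 uL (0.001 mL) or 10 uL (0.01 mL).
    """
    return colonies * dilution_factor / loop_volume_ml

# 35 colonies from an undiluted swab plated with a 1 uL loop
print(cfu_per_ml(35))  # 35000.0 CFU/mL
```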

    Risk factors for gram-negative bacterial infection of cardiovascular implantable electronic devices: Multicentre observational study (CarDINe Study)

    Background: Infections of cardiovascular implantable electronic devices (CIED) are mainly due to Gram-positive bacteria (GPB). Data about Gram-negative bacteria CIED (GNB-CIED) infections are limited. This study aimed to investigate risk factors, clinical and diagnostic characteristics, and outcomes of patients with GNB-CIED infection. Methods: A multicentre, international, retrospective, case-control-control study was performed on patients undergoing CIED implantation from 2015 to 2019 in 17 centres across Europe. For each patient diagnosed with GNB-CIED infection, one matching control with GPB-CIED infection and two matching controls without infection were selected. Results: A total of 236 patients were enrolled: 59 with GNB-CIED infection, 59 with GPB-CIED infection and 118 without infection. No between-group differences were found regarding clinical presentation or diagnostic and therapeutic management. A trend toward a higher rate of fluorodeoxyglucose positron emission tomography/computed tomography (FDG PET/CT) positivity was observed among patients with GNB-CIED than in those with GPB-CIED infection (85.7% vs. 66.7%; P = 0.208). Risk factors for GNB-CIED infection were Charlson Comorbidity Index score (relative risk ratio, RRR = 1.211; P = 0.011), obesity (RRR = 5.122; P = 0.008), ventricular-pacing ventricular-sensing inhibited-response pacemaker implantation (RRR = 3.027; P = 0.006) and right subclavian vein site of implantation (RRR = 5.014; P = 0.004). At 180-day survival analysis, GNB-CIED infection was associated with an increased mortality risk (HR = 1.842; P = 0.067). Conclusions: Obesity, a high number of comorbidities and a right subclavian vein implantation site were associated with an increased risk of GNB-CIED infection. Prompt therapeutic intervention, which may be guided by FDG PET/CT, is suggested in patients with GNB-CIED infection, considering the poorer outcome observed in this group
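Effect estimates of the kind reported above are derived from exposure tables in the matched case-control data. A minimal sketch of an odds ratio with a Wald 95% confidence interval from a 2x2 table; the counts are illustrative, not the CarDINe data:

```python
# Hedged sketch: odds ratio with a Wald 95% CI from a 2x2 exposure
# table. Counts below are illustrative, not taken from the study.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b = exposed/unexposed cases; c/d = exposed/unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)       # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Illustrative: 20/39 obese/non-obese cases vs 15/103 among controls
or_, (lo, hi) = odds_ratio_ci(20, 39, 15, 103)  # OR ~ 3.52
```

The multinomial (case-control-control) design in the study reports relative risk ratios from the analogous regression model rather than a single 2x2 odds ratio.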

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). 
The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication
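The crude incidences quoted above follow directly from the reported counts. A minimal sketch reproducing them (the study's own adjusted comparisons used Bayesian multilevel logistic regression, which is not attempted here):

```python
# Hedged sketch: reproducing the crude 30-day SSI incidences quoted
# in the abstract from the reported counts. No risk adjustment is
# performed here, unlike the study's multilevel models.

ssi_counts = {            # (SSI cases, patients) per HDI group
    "high-HDI":   (691, 7339),
    "middle-HDI": (549, 3918),
    "low-HDI":    (298, 1282),
}

for group, (cases, n) in ssi_counts.items():
    print(f"{group}: {100 * cases / n:.1f}%")
# high-HDI: 9.4%, middle-HDI: 14.0%, low-HDI: 23.2%
```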

    Usefulness of a hub and spoke TDM-guided expert clinical pharmacological advice program of dalbavancin for optimizing very long-term curative or suppressive treatment of chronic staphylococcal infections

    A hub and spoke model for optimizing long-term treatment of chronic staphylococcal infections with dalbavancin based on therapeutic drug monitoring (TDM)-guided expert clinical pharmacological advice (ECPA) was implemented. This multicentric retrospective cohort study included patients receiving dalbavancin monotherapy lasting >6 weeks at different spoke hospitals who had treatment optimized by means of a TDM-guided ECPA program at a hub hospital. The optimal pharmacokinetic/pharmacodynamic target against staphylococci with an MIC up to 0.125 mg/L was defined as dalbavancin concentrations >8.04 mg/L. Patients received dalbavancin therapy for curative (curative group) or suppressive (suppressive group) purposes. Clinical outcome was assessed by means of repeated ambulatory visits. A total of 12 spoke hospitals requested 414 TDM-based ECPAs for 101 patients, of whom 64.4% (65/101) were treated for curative and 35.6% (36/101) for suppressive purposes. In the curative and suppressive groups, TDM-based ECPA optimized treatment for up to 14 and 28 months, respectively, and ensured median optimal exposure of 95.7% and 100%, respectively. In the curative group, having <70% of treatment time with concentrations above the optimal target increased failure risk [odds ratio (OR), 6.71; confidence interval (CI), 0.97-43.3; P = 0.05]. In the suppressive group, infective endocarditis was associated with an increased risk of ineffective treatment (OR, 8.65; CI, 1.29-57.62; P = 0.046). Mild adverse events were reported in 4.5% (5/101) of cases. A hub and spoke TDM-guided ECPA program of dalbavancin may be cost-effective for optimizing long-term treatment of chronic staphylococcal infections and for patients admitted to hospitals lacking in-house MD clinical pharmacologists
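The exposure metric driving the failure-risk finding above is the fraction of treatment time spent above the 8.04 mg/L target. A minimal sketch estimating it by linear interpolation between TDM samples; the sampling scheme and concentrations are illustrative assumptions, not patient data or the study's PK method:

```python
# Hedged sketch: percentage of treatment time with dalbavancin
# concentration above the 8.04 mg/L PK/PD target, estimated by
# linear interpolation between TDM samples. All numbers below are
# illustrative, not data from the study.

TARGET = 8.04  # mg/L, target for staphylococci with MIC up to 0.125 mg/L

def percent_time_above_target(samples, target=TARGET):
    """samples: list of (day, concentration mg/L) tuples, sorted by day."""
    above = total = 0.0
    for (t0, c0), (t1, c1) in zip(samples, samples[1:]):
        dt = t1 - t0
        total += dt
        if c0 >= target and c1 >= target:       # whole interval above
            above += dt
        elif c0 > target or c1 > target:        # crosses the threshold once
            frac = (max(c0, c1) - target) / abs(c1 - c0)
            above += dt * frac
    return 100.0 * above / total

tdm = [(0, 25.0), (14, 14.0), (28, 9.0), (42, 6.0)]
print(f"{percent_time_above_target(tdm):.1f}% of time above target")  # 77.3%
```

A value below the 70% threshold would flag the elevated failure risk reported for the curative group.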

    Enhanced immunological recovery with early start of antiretroviral therapy during acute or early HIV infection–results of Italian Network of ACuTe HIV InfectiON (INACTION) retrospective study

    Background: Viral load peak and immune activation occur shortly after exposure during acute or early HIV infection (AEHI). We aimed to define the benefit of early start of antiretroviral treatment (ART) during AEHI in terms of immunological recovery, virological suppression, and treatment discontinuation. Setting: Patients diagnosed with AEHI (Fiebig stages I-V) during 2008-2014 from an analysis of 20 Italian centers. Methods: This was an observational, retrospective, and multicenter study. We investigated the effect of early ART (defined as initiation within 3 months from AEHI diagnosis) on time to virological suppression, optimal immunological recovery (defined as CD4 count ≥ 500/μL, CD4 ≥ 30%, and CD4/CD8 ≥ 1), and first-line ART regimen discontinuation by Cox regression analysis. Results: There were 321 patients with AEHI included in the study (82.9% in Fiebig stage III-V). At diagnosis, the median viral load was 5.67 log10 copies/mL and the median CD4 count was 456 cells/μL. Overall, 70.6% of patients started early ART (median time from HIV diagnosis to ART initiation 12 days, IQR 6-27). Higher baseline viral load and AEHI diagnosis during 2012-2014 were independently associated with early ART. HBV co-infection, baseline CD4/CD8 ≥ 1, lower baseline HIV-RNA, and AEHI diagnosis in recent years (2012-2014) were independently associated with a shorter time to virological suppression. Early ART emerged as an independent predictor of optimal immunological recovery after adjustment for baseline CD4 (absolute and percentage count) and CD4/CD8 ratio. The only independent predictor of first-line ART discontinuation was an initial ART regimen including > 3 drugs. Conclusions: In a large cohort of well-characterized patients with AEHI, we confirmed the beneficial role of early ART on CD4+ T-cell recovery and on rates of CD4/CD8 ratio normalization. Moreover, we recognized baseline CD4/CD8 ratio as an independent factor influencing time to virological response in the setting of AEHI, thus giving new insights into research on immunological markers associated with virological control
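The composite "optimal immunological recovery" endpoint defined in the methods (CD4 count ≥ 500/μL, CD4 ≥ 30%, CD4/CD8 ≥ 1) is a straightforward three-part check. A minimal sketch; the function name and example values are illustrative:

```python
# Hedged sketch: the composite "optimal immunological recovery"
# endpoint described above. Example values are illustrative, not
# patient data from the INACTION cohort.

def optimal_immunological_recovery(cd4_abs, cd4_pct, cd8_abs):
    """True when all three recovery criteria are met:
    CD4 >= 500 cells/uL, CD4 >= 30%, and CD4/CD8 ratio >= 1."""
    return cd4_abs >= 500 and cd4_pct >= 30 and cd4_abs / cd8_abs >= 1

print(optimal_immunological_recovery(620, 34, 590))  # True
print(optimal_immunological_recovery(456, 28, 900))  # False
```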