59 research outputs found

    Disposition kinetics, in vitro plasma protein binding and tissue residues of tilmicosin in healthy and experimentally (CRD) infected broiler chickens

    Background: Several studies have assayed the pharmacokinetics of tilmicosin in broilers at a dosage of 25 mg/kg b.wt. The aim of this study was to investigate the pharmacokinetics and tissue residues of tilmicosin following single and repeated oral administration (25 mg/kg b.wt. once daily for 5 consecutive days) in healthy broilers and in broilers experimentally infected with Mycoplasma gallisepticum and E. coli. Methods: After oral administration of tilmicosin (25 mg/kg b.wt.), one ml of blood was collected from the right wing vein, and tissue samples were taken, for determination of tilmicosin concentrations and disposition kinetics by a microbiological assay using Bacillus subtilis (ATCC 6633) as the test organism. Results: The plasma concentration-time curve was characteristic of a two-compartment open model. Following a single oral administration, tilmicosin was rapidly absorbed in both healthy and experimentally infected broilers, with absorption half-lives (t0.5(ab)) of 0.45 and 0.52 h, maximum serum concentrations (Cmax) of 1.06 and 0.69 μg/ml at (tmax) about 2.56 and 2.81 h, elimination half-lives (t0.5(el)) of 21.86 and 22.91 h, and mean residence times (MRT) of 32.15 and 33.71 h, respectively, indicating slow elimination of tilmicosin in chickens. In vitro protein binding was 9.72 ± 0.83%. Following repeated oral administration once daily for five consecutive days, serum concentrations of tilmicosin peaked about 2 h after each dose, with significantly lower values recorded in experimentally infected broilers than in healthy ones. Conclusions: Tilmicosin was cleared rapidly from tissues. The highest residue values were recorded in the lung, followed by the liver and kidneys, while the lowest values were recorded in the spleen, fat, and thigh muscles. A withdrawal period of five days is suggested for tilmicosin in broilers.
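The reported half-lives imply first-order elimination, where the rate constant is k = ln 2 / t½; a minimal illustrative sketch in Python (values taken from the abstract; function names are our own, not from the study):

```python
import math

def elimination_rate_constant(t_half_h):
    """First-order elimination rate constant (h^-1) from half-life, k = ln2 / t_half."""
    return math.log(2) / t_half_h

def fraction_remaining(t_h, t_half_h):
    """Fraction of drug remaining after t_h hours of first-order elimination."""
    return math.exp(-elimination_rate_constant(t_half_h) * t_h)

# Elimination half-life reported for healthy broilers: 21.86 h
k_el = elimination_rate_constant(21.86)
print(f"k_el = {k_el:.4f} h^-1")                              # ~0.0317 h^-1
print(f"fraction left after 24 h = {fraction_remaining(24, 21.86):.2f}")
```

With a half-life near 22 h, roughly half the dose is still present a full day after administration, which is consistent with the abstract's observation of slow elimination.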

    Prevalence of Coronary Artery Lesion(s) in Patients Aged 40-50 Years Undergoing Rheumatic Valvular Surgery

    Objective: To determine the prevalence of coronary artery disease (CAD) in patients aged 40-50 years undergoing valve surgery for rheumatic heart disease, and to assess the usefulness and indications of pre-operative coronary angiography. Methods: This observational prospective study took place in two hospitals (National Heart Institute and Nasser Institute) between January 2013 and January 2015. We included 454 rheumatic patients who were admitted for elective primary mitral, aortic, or double valve surgery and who had a coronary angiogram as part of their routine preoperative workup. All coronary angiographies were performed by injecting the right and left coronaries with 80-100 ml of iodinated contrast to obtain the standard views, using Philips or Siemens machines in both hospitals. Coronary artery stenosis of 50% was considered a positive finding. Results: There was no correlation between rheumatic heart disease in this age group and CAD, as only 1.76% of patients had significant stenosis. Male gender, family history of CAD, age above 45 years, hypertension, and smoking showed significant correlation with CAD in this study. Conclusion: Our results suggest that the overall prevalence of coronary artery disease in patients undergoing rheumatic valve surgery in our population is not comparable with the prevalence reported in international data. Multicenter studies are therefore needed in developing countries to set their own guidelines, and our study can serve as a nucleus for such guidelines in our country.

    Comparative pharmacokinetics of cefoperazone following intravenous and intramuscular administration in goats

    Abstract: The pharmacokinetic profile of cefoperazone was studied in goats following intravenous and intramuscular administration of 20 mg/kg body weight. Cefoperazone concentrations in serum were determined by a microbiological assay technique using Escherichia coli (ATCC 10536) as the test organism. Following i.v. administration, the cefoperazone serum concentration-time curve was best fitted by a two-compartment open model. Cefoperazone showed moderate distribution in the body of goats, with a Vdss of 0.44 ± 0.03 L/kg. The elimination half-life (T0.5(β)), area under the curve (AUC), and total body clearance (Cltot) were 1.97 ± 0.14 h, 149.63 ± 8.61 μg·ml−1·h, and 2.17 ml/min/kg, respectively. Following i.m. administration, the drug was very rapidly absorbed, with an absorption half-life (T0.5(ab)) of 0.12 ± 0.01 h. The maximum serum concentration (Cmax) of 30.42 ± 3.53 μg·ml−1 was attained at (Tmax) 0.58 ± 0.02 h, with an elimination half-life (T0.5(el)) of 2.53 ± 0.11 h. The systemic bioavailability of cefoperazone after i.m. administration in goats was 83.62%, and in vitro protein binding was 20.34%. Serum concentrations of cefoperazone over the 12 h after i.m. injection exceeded the MIC of various susceptible micro-organisms responsible for serious disease problems. Consequently, a suitable intramuscular dosage regimen for cefoperazone in goats would be 20 mg/kg repeated at 12-h intervals. The drug was detected in urine up to 12 and 18 h following i.v. and i.m. administration, respectively.
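The reported bioavailability and clearance follow from standard non-compartmental relations (F = AUC_i.m./AUC_i.v. × 100; Cl = Dose/AUC_i.v.); a small sketch using the abstract's mean values. Note the i.m. AUC below is back-calculated from the reported F, so it is an assumption rather than a value given in the study:

```python
dose_ug_per_kg = 20_000.0          # 20 mg/kg expressed in ug/kg
auc_iv = 149.63                    # ug.ml^-1.h after i.v. dosing, from the abstract
auc_im = auc_iv * 0.8362           # ASSUMED: back-calculated from the reported F = 83.62%

# Absolute bioavailability after i.m. dosing
F = auc_im / auc_iv * 100
print(f"F = {F:.2f}%")             # ~83.62%

# Total body clearance from the i.v. data; ug/ml equals mg/L, so units cancel cleanly
cl_ml_per_h_kg = dose_ug_per_kg / auc_iv        # ml/h/kg
cl_ml_per_min_kg = cl_ml_per_h_kg / 60
print(f"Cl = {cl_ml_per_min_kg:.2f} ml/min/kg") # close to the reported 2.17
```

The clearance computed from the mean AUC comes out near 2.2 ml/min/kg rather than exactly 2.17, which is expected when the published figure is the mean of per-animal clearances rather than a value derived from the mean AUC.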

    Clinical outcomes of transcatheter aortic valve replacement stratified by left ventricular ejection fraction: a single centre pilot study

    Introduction: To define the baseline echocardiographic, electrocardiographic (ECG), and computed tomographic (CT) findings of patients with heart failure undergoing transcatheter aortic valve replacement (TAVR) and to analyze their overall procedural outcomes. Methods: Patients with severe aortic stenosis (AS) who underwent TAVR in Sabah Al Ahmad Cardiac Centre, Al Amiri Hospital between 2018 and 2021 were identified. A retrospective review of patient parameters, including pre-, intra-, and post-procedural data, was conducted. Patients were grouped into two subgroups according to their ejection fraction (EF): EF <40% (HFrEF) and EF ≥40%. The data included patients' baseline characteristics, electrocardiographic and echocardiographic details, and pre-procedural CT assessment of aortic valve dimensions. Primary outcomes, including post-operative conduction disturbances, pacemaker implantation, and in-hospital mortality following TAVR, were additionally analyzed. Results: A total of 61 patients with severe AS underwent TAVR. The mean age was 73.5 ± 9 years, and 21 (34%) of the patients were male. The mean EF was 55.5 ± 9.7%. Of the 61 patients, 12 (20%) were identified as having heart failure with reduced EF (<40%). These patients were younger, more often male, and more likely to have coronary artery disease (75% versus 53.1%). Left ventricular hypertrophy and diastolic dysfunction were documented in 75% and 58.3% of patients with HFrEF, respectively. Post-TAVR conduction disturbances were observed in 41.7%, the commonest being left bundle branch block (LBBB). A permanent pacemaker was implanted in 3 patients with HFrEF (25%). There was no significant difference between the two groups with regard to in-hospital mortality (p = 0.618). Conclusion: Patients with severe AS and EF <40% constitute a notable proportion of those undergoing TAVR.
Preliminary results suggest that post-operative conduction disturbances and in-hospital mortality in HFrEF patients do not differ from those in patients with LVEF ≥40%.

    Meningococcal disease surveillance in the Asia-Pacific region (2020): The global meningococcal initiative.

    The extent of surveillance data and control strategies for invasive meningococcal disease (IMD) varies across the Asia-Pacific region. IMD cases are reported throughout the region, but the disease is not notifiable in some countries, including Myanmar, Bangladesh, and Malaysia. Although there remains a paucity of data from many countries, some nations have introduced additional surveillance measures. The incidence of IMD is low and similar across the represented countries (<0.2 cases per 100,000 persons per year), with the predominant serogroups of Neisseria meningitidis being B, W, and Y, although serogroups A and X are present in some areas. Resistance to ciprofloxacin is also of concern, and close monitoring of antibiotic-resistant clonal complexes (e.g., cc4821) is a priority. Meningococcal vaccination is included in only a few National Immunization Programs but is recommended for high-risk groups, including travellers (such as pilgrims) and people with complement deficiencies or human immunodeficiency virus (HIV) infection. Both polysaccharide and conjugate vaccines form part of these recommendations. However, cost and misconceptions remain limiting factors in vaccine uptake, despite conjugate vaccines preventing the acquisition of carriage.

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. 
The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I2 = 98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I2 = 92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I2 = 94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I2 = 98%) than in other migrant groups (6·6%, 1·8-11·3; I2 = 92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I2 = 96%) than in migrants in hospitals (24·3%, 16·1-32·6; I2 = 98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
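Pooled prevalences of this kind are commonly computed with a DerSimonian-Laird random-effects model (the paper states only that random-effects models were used, so the estimator choice here is an assumption); a minimal sketch with hypothetical study counts, not the paper's data:

```python
import math

def dersimonian_laird(events, totals):
    """Random-effects pooled proportion with I^2 heterogeneity (DerSimonian-Laird)."""
    p = [e / n for e, n in zip(events, totals)]
    var = [pi * (1 - pi) / n for pi, n in zip(p, totals)]
    w = [1 / v for v in var]                        # inverse-variance weights
    p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))  # Cochran's Q
    df = len(p) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                   # between-study variance
    w_star = [1 / (v + tau2) for v in var]          # random-effects weights
    pooled = sum(wi * pi for wi, pi in zip(w_star, p)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0   # fraction of variance from heterogeneity
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Hypothetical studies: AMR carriage events out of migrants screened
pooled, ci, i2 = dersimonian_laird([30, 55, 12, 80], [120, 200, 90, 250])
print(f"pooled prevalence = {pooled:.3f}, 95% CI {ci[0]:.3f}-{ci[1]:.3f}, I^2 = {i2:.0%}")
```

The I² statistic reported throughout the abstract (92-98%) is the same quantity computed here: the proportion of total variability attributable to between-study heterogeneity rather than sampling error.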

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain, measured on a numerical analogue scale from 0 to 100%, and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10% (i.q.r. 1-30%) of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468), compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to medication side-effects and with increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Variables found in the GlobalSurg 1 study and in other studies to affect the likelihood of SSI were entered into risk-adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001).
The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk-factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05-2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention, which highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication.
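The crude incidence figures in this abstract can be reproduced directly from the reported counts; a quick arithmetic check:

```python
def pct(events, total):
    """Crude percentage, rounded to one decimal place as in the abstract."""
    return round(100 * events / total, 1)

# 30-day SSI incidence by Human Development Index group
ssi = {"high": pct(691, 7339), "middle": pct(549, 3918), "low": pct(298, 1282)}
print(ssi)  # {'high': 9.4, 'middle': 14.0, 'low': 23.2}

# Infections resistant to the prophylactic antibiotic used
resistant = {"high": pct(49, 295), "middle": pct(37, 187), "low": pct(46, 128)}
print(resistant)  # {'high': 16.6, 'middle': 19.8, 'low': 35.9}
```

Both gradients run the same way: crude SSI incidence and the proportion of resistant infections each roughly double from high-HDI to low-HDI settings, which is the pattern the adjusted analysis then confirms.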

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, both between Africa and the rest of the world and within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, such as the limitations imposed by low testing proportions. We also highlight the early-warning capacity that genomic surveillance in Africa has provided for the rest of the world through the detection of new lineages and variants, most recently the characterization of various Omicron subvariants. CONCLUSION Sustained investment in diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can serve as a platform to address the many emerging and re-emerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized, because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background: Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods: This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low-middle-income countries. Results: In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable by more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of 'single-use' consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low-middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion: This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high- and low-middle-income countries.