
    Simulated and Applied Precision Feeding System of High and Low Forage Diets with Different Fat Sources and Sequences of Dietary Fat Concentration in In-Vitro and In-Vivo Studies

    Controlling dry matter intake (DMI) is one strategy to meet the animal's requirements while reducing feed costs and increasing feed efficiency. Controlling intake through precision feeding provides a more nutrient-dense diet, allowing an increase in energy and nutrient utilization efficiency while decreasing nutrient loss. The literature on precision feeding has provided information regarding optimal N intake and different dietary fiber proportions, but more questions remain to be addressed. This is one of the first attempts to further that knowledge through the use of fat inclusion. In the present dissertation, a total of 4 in-vitro and in-vivo experiments were conducted. Simulated and applied precision feeding with different forage-to-concentrate (F:C) ratios and fat source inclusions were used to determine the effects on Holstein and Jersey dairy heifers' digestibility and fermentation. An introduction to the importance of investigating strategies for fat supplementation in precision feeding for dairy heifers is presented in Chapter 1. Background information and justification for the current dissertation are presented in the systematic literature review in Chapter 2. The objective of the first experiment, presented in Chapter 3, was to screen the effects of including different types of fat at different F:C ratios on digestibility and in-vitro gas production (GP). Treatments included either low forage (LF; 35%) or high forage (HF; 70%) with 2 dietary fat concentrations (6 or 9% DM), screening 6 different fat sources plus a control (CON). The CON diet had a basal fat concentration of 3% (0% fat inclusion); fat sources were added to attain 6% or 9% fat and consisted of coconut oil (CO), poultry fat (PF), palm oil (PO), palm kernel oil (PKO), Ca salts (MEG), and soybean oil (SOY). Modules were randomly assigned to treatments in a 2×2×7 factorial design and incubated for four 24-h runs.
The CO-fed module had the highest DM apparent digestibility (AD), followed by SOY and PF. True DM digestibility (IVTDMD) and OM AD were higher in CO than in the other fat types. The AD for DM, OM, NDF, ADF, and IVTDMD was higher in LF. Total VFA and acetate were lower in modules fed the different fat types than in CON, while propionate was lowest for CON, which increased the A:P ratio. These results suggested that LF diets with high fat concentration can be used under a precision feeding system, and different fat sources may improve DM and fiber digestibility. The objective of the second experiment, presented in Chapter 4, was to evaluate the effects on fermentation and digestion of including different fat sources when high-concentrate diets with high fat inclusion are used to simulate precision feeding in continuous culture. Four treatments were randomly assigned to 8 continuous cultures in a randomized complete block design and run for 2 periods of 10 d. Diets included high concentrate (HC; 65%) with high fat inclusion starting from a basal level of fat as CON [3% fat (0% fat inclusion); 9% fat (6% PF, CO, or SO inclusion)]. The DM, OM, NDF, ADF, and hemicellulose digestibility coefficients (dC) were highest for the PF-fed fermenters and CO, followed by SO and then CON. Total VFA was higher for CON, and there was a reduction in acetate and propionate with the different fat treatments. These results suggest that simulated precision feeding with HC and high fat supplementation can improve digestibility. Chapter 5 presents the third experiment, which determined the effects of simulated precision feeding of different PF levels at different F:C ratios on digestibility and fermentation in continuous culture. Treatments included 2 forage combinations, low (LF; 35% forage) and high (HF; 70% forage), and 4 levels of PF starting from a basal level of fat in the diet [3% fat (0% PF); 5% fat (2% PF); 7% fat (4% PF); and 9% fat (6% PF)].
Treatments were randomly assigned to 8 fermenters in a 2×4 factorial design and run for four 10-d periods. The LF-fed fermenters had higher DM, OM, N, starch, and NFC dC than HF. Nutrient digestibility increased linearly with PF inclusion, while bacterial efficiency decreased with PF inclusion. Total VFA was higher for LF, and there was a reduction in acetate with LF. PF inclusion produced a linear increase in total VFA, a linear reduction in acetate, and a linear increase in propionate. The A:P ratio decreased linearly in both LF and HF as PF increased. These results suggest that increasing PF in precision-fed LF or HF diets can alter rumen fermentation and improve digestibility. Finally, the objective of the last experiment, presented in Chapter 6, was to evaluate the effects on nutrient digestion and rumen fermentation of including different levels of PF in precision-fed Holstein and Jersey dairy heifers. Four Holstein and 4 Jersey cannulated heifers were randomly assigned to 4 treatments consisting of a 55% forage diet with 4 increasing levels of PF inclusion starting from a basal concentration of fat in the diet [3% fat (0% PF); 5% fat (2% PF); 7% fat (4% PF); and 9% fat (6% PF)]. Treatments were administered according to a split-plot, 4×4 Latin square design for 4 periods of 21 d. The Holstein group had lower DM, OM, NDF, ADF, and NFC AD than the Jersey group. The inclusion of PF did not affect AD overall; however, starch AD increased linearly as PF increased, whereas NFC AD decreased linearly. Manure output was higher for Holsteins, and PF inclusion produced a linear decrease in manure output. Total VFA and acetate decreased linearly as PF increased; concurrently, there was a linear increase in propionate, resulting in a linear reduction in the A:P ratio. These results suggest that Jerseys utilized nutrients more efficiently than Holsteins. Dietary PF inclusion up to 6% in the rations can further reduce DMI in precision feeding programs without compromising total-tract digestibility.
Overall, these studies' results indicate that PF can be used as a replacement for corn in precision-fed Holstein and Jersey dairy heifer diets at up to 6% of DM. Other fat sources with different characteristics can be utilized with relative success, but further research is needed. Incorporating supplemental fat into controlled-intake strategies such as precision feeding can reduce feed intake while supporting optimal growth, with promising impacts on costs. Furthermore, nutrient digestibility, rumen fermentation, and animal performance can be enhanced, with positive effects on environmental impact.

    Haematologic Parameters in Acute Promyelocytic Leukemia Patients Treated with All-Trans-Retinoic Acid

    Background: Acute promyelocytic leukemia (APL) is commonly associated with disseminated intravascular coagulation (DIC), and early correction of the coagulopathy is of vital importance. All-trans-retinoic acid (ATRA) is considered the drug of choice in the treatment of APL. Objective: The work was conducted to 1) identify patients with APL who show laboratory evidence of DIC, and 2) study the serial changes in haemostatic parameters in APL patients treated with ATRA and compare their results with those of patients treated with conventional chemotherapy without ATRA. Subjects and methods: In this prospective study (October 2003 to October 2005), 44 newly diagnosed, untreated APL patients were included. Twenty-four patients were treated with ATRA plus chemotherapy, while 17 patients were treated with chemotherapy other than ATRA. For each patient, a full clinical evaluation was done and haematological investigations were carried out at the time of diagnosis and repeated on days 3 and 7 of therapy. Diagnosis of DIC was based on a positive D-dimer test with hypofibrinogenaemia, with or without pathologically prolonged PT and/or APTT. Results: Among the 44 newly diagnosed, untreated APL patients studied, ages ranged from 6 to 81 years with a median of 27 years, and the male-to-female ratio was 1.3:1. Before treatment, all patients had anemia, thrombocytopenia, and an elevated D-dimer level; DIC was present in all patients at the time of diagnosis. All parameters that were abnormal at diagnosis returned to normal within one week in the ATRA-treated group, indicating that DIC had essentially resolved. By contrast, those parameters remained abnormal even on day 7 in the chemotherapy-treated group, indicating that DIC was ongoing. Conclusion: ATRA therapy in APL patients is associated with rapid improvement of the coagulopathy; its use from day one of treatment is therefore justified.

    Investigation of enhanced double weight code in point to point access networks

    © 2020 Published under licence by IOP Publishing Ltd. In this paper, the enhanced double weight (EDW) code is investigated and evaluated. A new technique for structuring and building the code using a modified arithmetical model is presented, in place of the previous technique based on trial inspection. An innovative design is employed to apply EDW codes of diverse weights in point-to-point (P2P) networks, making them suitable for optical CDMA applications. A newly developed relation for the EDW code is also presented; it is based on studying and experimenting with the effect of input transmission power at different code weights, and it was derived using numerical analysis. This relation makes estimation of the required system input power more efficient. The performance of the code is illustrated with eye diagrams and parametric plots from the simulation results, which show strong performance at high numbers of users and high code weights. In addition, the developed power relation helps to prevent power loss and excess consumption.

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. 
The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I2 = 98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I2 = 92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I2 = 94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I2 = 98%) than in other migrant groups (6·6%, 1·8-11·3; I2 = 92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I2 = 96%) than in migrants in hospitals (24·3%, 16·1-32·6; I2 = 98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
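The pooled prevalences above come from random-effects models. As a minimal sketch of how such pooling works (not the authors' actual analysis code), a DerSimonian-Laird random-effects estimate combines per-study prevalences weighted by both within-study and between-study variance; the study counts below are invented for illustration:

```python
import numpy as np

def pooled_prevalence_dl(events, totals):
    """DerSimonian-Laird random-effects pooled proportion.
    Assumes 0 < events < totals for every study.
    Returns (pooled prevalence, 95% CI low, 95% CI high, I^2)."""
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    p = events / totals                       # per-study prevalence
    var = p * (1.0 - p) / totals              # within-study (binomial) variance
    w = 1.0 / var                             # fixed-effect weights
    p_fixed = np.sum(w * p) / np.sum(w)       # fixed-effect pooled estimate
    q = np.sum(w * (p - p_fixed) ** 2)        # Cochran's Q heterogeneity statistic
    k = len(p)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)        # between-study variance (truncated at 0)
    w_re = 1.0 / (var + tau2)                 # random-effects weights
    p_re = np.sum(w_re * p) / np.sum(w_re)    # random-effects pooled estimate
    se = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (q - (k - 1)) / q) if q > 0 else 0.0  # I^2 heterogeneity
    return p_re, p_re - 1.96 * se, p_re + 1.96 * se, i2

# Invented counts: AMR carriers / migrants screened in three hypothetical studies
prev, lo, hi, i2 = pooled_prevalence_dl([30, 50, 10], [100, 120, 80])
```

Real meta-analyses often pool on the logit scale to keep confidence limits inside [0, 1]; this raw-proportion version is kept deliberately simple.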

    Global economic burden of unmet surgical need for appendicitis

    Background: There is a substantial gap in the provision of adequate surgical care in many low- and middle-income countries. This study aimed to identify the economic burden of unmet surgical need for the common condition of appendicitis. Methods: Data on the incidence of appendicitis from 170 countries and two different approaches were used to estimate the number of patients who do not receive surgery: as a fixed proportion of the total unmet surgical need per country (approach 1); and based on country income status (approach 2). Indirect costs with current levels of access and local quality, and those if quality were at the standards of high-income countries, were estimated. A human capital approach was applied, focusing on the economic burden resulting from premature death and absenteeism. Results: Excess mortality was 4185 per 100 000 cases of appendicitis using approach 1 and 3448 per 100 000 using approach 2. The economic burden of continuing current levels of access and local quality was US $92 492 million using approach 1 and $73 141 million using approach 2. The economic burden of not providing surgical care to the standards of high-income countries was $95 004 million using approach 1 and $75 666 million using approach 2. The largest share of these costs resulted from premature death (97.7 per cent) and lack of access (97.0 per cent), in contrast to lack of quality. Conclusion: For a comparatively non-complex emergency condition such as appendicitis, increasing access to care should be prioritized. Although improving quality of care should not be neglected, increasing provision of care at current standards could reduce societal costs substantially.

    Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.

    BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices, and outcomes are poorly described. The aim was to identify variation in surgical management and outcomes of appendicitis within low-, middle-, and high-Human Development Index (HDI) countries worldwide. METHODS: This is a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted. Follow-up lasted 30 days. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 high-, 1540 middle-, and 507 low-HDI groups). Surgical site infection (SSI) rates were higher in low-HDI (OR 2.57, 95% CI 1.33-4.99, p = 0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p = 0.291), compared with high-HDI countries after adjustment. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%), but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) groups. After accounting for case-mix, laparoscopy was still associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p < 0.001) and SSIs (OR 0.22, 95% CI 0.14-0.33, p < 0.001). In propensity-score matched groups within low-/middle-HDI countries, laparoscopy was still associated with fewer overall complications (OR 0.23 95% CI 0.11-0.44) and SSI (OR 0.21 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes and availability appears to differ by country HDI. Despite the profound clinical, operational, and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112

    Effects of hospital facilities on patient outcomes after cancer surgery: an international, prospective, observational study

    Background: Early death after cancer surgery is higher in low-income and middle-income countries (LMICs) than in high-income countries, yet the impact of facility characteristics on early postoperative outcomes is unknown. The aim of this study was to examine the association between hospital infrastructure, resource availability, and processes and early outcomes after cancer surgery worldwide. Methods: A multimethods analysis was performed as part of the GlobalSurg 3 study, a multicentre, international, prospective cohort study of patients who had surgery for breast, colorectal, or gastric cancer. The primary outcomes were 30-day mortality and 30-day major complication rates. Potentially beneficial hospital facilities were identified by variable selection to select those associated with 30-day mortality. Adjusted outcomes were determined using generalised estimating equations to account for patient characteristics and country income group, with population stratification by hospital. Findings: Between April 1, 2018, and April 23, 2019, facility-level data were collected for 9685 patients across 238 hospitals in 66 countries (91 hospitals in 20 high-income countries; 57 hospitals in 19 upper-middle-income countries; and 90 hospitals in 27 low-income to lower-middle-income countries). The availability of five hospital facilities was inversely associated with mortality: ultrasound, CT scanner, critical care unit, opioid analgesia, and oncologist. After adjustment for case-mix and country income group, hospitals with three or fewer of these facilities (62 hospitals, 1294 patients) had higher mortality than those with four or five (adjusted odds ratio [OR] 3.85 [95% CI 2.58-5.75]; p<0.0001), with the excess mortality predominantly explained by a limited capacity to rescue following the development of major complications (63.0% vs 82.7%; OR 0.35 [0.23-0.53]; p<0.0001).
Across LMICs, improvements in hospital facilities would prevent one to three deaths for every 100 patients undergoing surgery for cancer. Interpretation: Hospitals with higher levels of infrastructure and resources have better outcomes after cancer surgery, independent of country income. Without urgent strengthening of hospital infrastructure and resources, the reductions in cancer-associated mortality associated with improved access will not be realised.

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background: The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods: In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results: Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) than in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion: Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
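As an illustrative aside (the study itself used multivariable logistic regression, not this calculation), an unadjusted odds ratio with a Wald 95% confidence interval can be computed directly from a 2×2 table of exposure and outcome counts; the counts below are invented for demonstration:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
                     died    survived
    checklist         a         b
    no checklist      c         d
    Assumes all four cells are non-zero."""
    or_ = (a * d) / (b * c)                     # cross-product odds ratio
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts, chosen only to show the arithmetic
or_, lo, hi = odds_ratio_ci(a=60, b=940, c=95, d=905)
```

An OR below 1 with a confidence interval excluding 1 would indicate lower mortality odds in the exposed (checklist) group; adjusted models additionally control for patient and disease factors.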

    Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.

    Background: Acute illness, existing co-morbidities, and the surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was to prospectively develop a pragmatic prognostic model to stratify patients according to their risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection, or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors for the model, and internal validation was carried out by bootstrapping. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with increasing odds of death at each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery, and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability.
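The model's discrimination is summarised by a c-statistic. As a minimal sketch of what that metric measures (not the study's validation code), the c-statistic is the probability that a randomly chosen patient who developed AKI received a higher predicted risk than a randomly chosen patient who did not; the risks and outcomes below are invented:

```python
def c_statistic(scores, labels):
    """Concordance statistic (equivalent to the area under the ROC curve):
    the fraction of case/non-case pairs in which the case has the higher
    predicted risk, counting tied scores as half-concordant."""
    pos = [s for s, y in zip(scores, labels) if y == 1]  # patients with AKI
    neg = [s for s, y in zip(scores, labels) if y == 0]  # patients without AKI
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# Hypothetical predicted risks and observed outcomes (1 = developed AKI)
risks = [0.9, 0.8, 0.3, 0.2]
outcomes = [1, 0, 1, 0]
c = c_statistic(risks, outcomes)
```

A value of 0.5 means no discrimination and 1.0 means perfect ranking; the reported 0.65 sits in the modest-but-useful range typical of preoperative models.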