64 research outputs found

    Factors related to voluntary dry matter intake by beef cattle fed primarily corn silage

    Get PDF
The objectives of this study were to determine the influence of ration characteristics and body characteristics on voluntary feed intake of beef heifers and to determine the influence of voluntary ration intake on animal gains and feed efficiency when corn silage was fed as the primary ration constituent. Corn silage ad libitum and 2.72 kg of various concentrates per animal per day were fed to 432 beef heifers, initially weighing 204-227 kg, during three one-year periods. The experiments were from 110 to 140 days in length and each experiment was divided into four periods of approximately one month each. Body weights and sonaray measurements of fat thickness were determined initially and at monthly intervals throughout the experiments. These measurements provided the basis of the body characteristic variables (percent fat and percent lean) and for the calculation of average daily gain (ADG) for each experimental period. Daily amounts of feed offered, refused and consumed by each pen of animals provided the basis of the voluntary intake variables and for the feed efficiency variables for each experimental period. Factors known or suspected to affect voluntary intake (VI) were used, and their effects on voluntary intake were determined using simple correlation coefficients with period measurements used as repeat observations. The independent variables, percent dry matter (DM) from silage, percent digestible energy (DE) from silage, mid-period weight and elapsed days, were utilized to establish useful multiple regression equations to predict voluntary feed intake. Simple correlation coefficients between various voluntary intake measurements and other factors known or suspected to influence either ADG or feed efficiency (DM intake, kg, per body weight gain, kg) were calculated. 
Multiple regression equations were also developed to determine the influence of voluntary intake and other variables (percent DM from silage, percent DE from silage, voluntary DM intake per day, voluntary DE intake per day, elapsed days and mid-period weight) on animal performance (ADG and feed efficiency) with period measurements used as repeat observations. Voluntary intake of dry matter (VI-DM) and of digestible energy (VI-DE) was highly correlated with either variable that characterizes the ration, namely percent of either DM or DE from silage (r = approximately 0.8). Therefore, VI increased as the proportion of silage in the ration increased. The other expressions of VI (VI per body weight, VI per metabolic size and VI above maintenance) were also positively correlated with percentage of either DM or DE contributed by silage. There were highly positive correlation coefficients between VI and body weight. When VI was expressed per body weight, it was negatively correlated with body weight. Expressing VI per metabolic body size reduced the influence of body weight to approximately 7% of the total influence of body weight. ADG was positively correlated with VI in some experimental periods only. This was due to lower than expected gains in one period while VI was as high as expected. Feed efficiency (higher numbers represent lower efficiency) was positively correlated with VI, VI per body weight, VI per metabolic size and VI-DE above maintenance. There was a high positive correlation between estimated fat percentage of the carcass and VI-DM and VI-DE. Almost all coefficients within periods were also positive (VI-DM, periods 1-4: .60, .62, .31, and .46; VI-DE, periods 1-4: .59, .64, .20, and -.05). The fatter animals consumed more DM and more DE even within a period, probably because they were also the heavier animals. 
The more meaningful prediction equations for VI were as follows: VI-DM, kg/day = -1.73 + 0.0806 (%DM from silage) + 0.00902 (mid-period weight, kg) - 0.00274 (elapsed days); VI-DE, kcal/day = -1321 + 327 (%DE from silage) + 3.04 (mid-period weight, kg) + 24.1 (elapsed days). The more meaningful prediction equations for animal performance were as follows: ADG, kg = 0.756 + 0.0167 (%DE from silage) - 0.0000530 (VI-DE, kcal/day) - 0.00278 (elapsed days) + 0.00136 (mid-period weight, kg); DM efficiency = -1.06 + 0.0608 (%DM from silage) + 1.01 (VI-DM, kg/day) + 0.0338 (elapsed days) - 0.0130 (weight, kg)
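The prediction equations above can be transcribed directly into code. A minimal sketch, with coefficients copied from the abstract; the input values in the example are purely hypothetical, not data from the study:

```python
# Coefficients below are taken verbatim from the reported regression
# equations; function and variable names are our own.

def vi_dm_kg_per_day(pct_dm_from_silage, mid_period_weight_kg, elapsed_days):
    """VI-DM, kg/day = -1.73 + 0.0806(%DM from silage)
    + 0.00902(mid-period weight, kg) - 0.00274(elapsed days)."""
    return (-1.73 + 0.0806 * pct_dm_from_silage
            + 0.00902 * mid_period_weight_kg
            - 0.00274 * elapsed_days)

def adg_kg(pct_de_from_silage, vi_de_kcal_per_day, elapsed_days,
           mid_period_weight_kg):
    """ADG, kg = 0.756 + 0.0167(%DE from silage) - 0.0000530(VI-DE, kcal/day)
    - 0.00278(elapsed days) + 0.00136(mid-period weight, kg)."""
    return (0.756 + 0.0167 * pct_de_from_silage
            - 0.0000530 * vi_de_kcal_per_day
            - 0.00278 * elapsed_days
            + 0.00136 * mid_period_weight_kg)

# Hypothetical heifer: 80% of DM from silage, 215 kg mid-period weight, day 60.
print(vi_dm_kg_per_day(80, 215, 60))
```

Note the sign pattern the abstract describes: predicted intake rises with the silage share of the ration and with body weight, and falls slightly with elapsed days.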

    TEF, Vol. 5 No. 1

    Get PDF
The fifth issue of the annually published literary magazine TEF.

    Route of drug administration in out-of-hospital cardiac arrest: A protocol for a randomised controlled trial (PARAMEDIC-3)

    Get PDF
© 2023 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). AIMS: The PARAMEDIC-3 trial evaluates the clinical and cost-effectiveness of an intraosseous-first strategy, compared with an intravenous-first strategy, for drug administration in adults who have sustained an out-of-hospital cardiac arrest. METHODS: PARAMEDIC-3 is a pragmatic, allocation-concealed, open-label, multi-centre, superiority randomised controlled trial. It will recruit 15,000 patients across English and Welsh ambulance services. Adults who have sustained an out-of-hospital cardiac arrest are individually randomised to an intraosseous access first strategy or intravenous access first strategy in a 1:1 ratio through an opaque, sealed envelope system. The randomised allocation determines the route used for the first two attempts at vascular access. Participants are initially enrolled under a deferred consent model. The primary clinical-effectiveness outcome is survival at 30 days. Secondary outcomes include return of spontaneous circulation, neurological functional outcome, and health-related quality of life. Participants are followed up to six months following cardiac arrest. The primary health economic outcome is incremental cost per quality-adjusted life year gained. CONCLUSION: The PARAMEDIC-3 trial will provide key information on the clinical and cost-effectiveness of drug route in out-of-hospital cardiac arrest. Trial registration: ISRCTN14223494, registered 16/08/2021, prospectively registered. Peer reviewed

    Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis

    Get PDF
    BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. 
The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1-31·8; I2=98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8-10·7; I2=92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6-36·8; I2=94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3-47·6; I2=98%) than in other migrant groups (6·6%, 1·8-11·3; I2=92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1-55·1; I2=96%) than in migrants in hospitals (24·3%, 16·1-32·6; I2=98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London
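The abstract says pooled prevalences were calculated with random-effects models. As an illustration of how such pooling works, here is a minimal sketch of one widely used estimator (DerSimonian-Laird); the choice of estimator is our assumption, and the study counts are made up, not the review's data:

```python
import math

def dl_pooled_prevalence(events, totals):
    """DerSimonian-Laird random-effects pooling of prevalence proportions.
    Returns (pooled prevalence, 95% CI). Normal approximation throughout;
    assumes no study has prevalence exactly 0 or 1."""
    # Per-study prevalence and within-study variance.
    p = [e / n for e, n in zip(events, totals)]
    v = [pi * (1 - pi) / n for pi, n in zip(p, totals)]
    w = [1 / vi for vi in v]
    # Fixed-effect pooled estimate and Cochran's Q heterogeneity statistic.
    p_fe = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - p_fe) ** 2 for wi, pi in zip(w, p))
    df = len(p) - 1
    # Between-study variance tau^2, truncated at zero.
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Random-effects weights, pooled estimate, and 95% CI.
    w_re = [1 / (vi + tau2) for vi in v]
    p_re = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return p_re, (p_re - 1.96 * se, p_re + 1.96 * se)

# Three hypothetical studies: (resistant carriers, migrants screened).
pooled, ci = dl_pooled_prevalence([30, 12, 55], [120, 90, 200])
print(pooled, ci)
```

The random-effects weights down-weight each study less aggressively than fixed-effect weights when between-study heterogeneity (tau^2) is large, which matches the very high I2 values reported above.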

    Herbicide-Resistant Crops: Utilities and Limitations for Herbicide-Resistant Weed Management

    Get PDF
    Since 1996, genetically modified herbicide-resistant (HR) crops, particularly glyphosate-resistant (GR) crops, have transformed the tactics that corn, soybean, and cotton growers use to manage weeds. The use of GR crops continues to grow, but weeds are adapting to the common practice of using only glyphosate to control weeds. Growers using only a single mode of action to manage weeds need to change to a more diverse array of herbicidal, mechanical, and cultural practices to maintain the effectiveness of glyphosate. Unfortunately, the introduction of GR crops and the high initial efficacy of glyphosate often lead to a decline in the use of other herbicide options and less investment by industry to discover new herbicide active ingredients. With some exceptions, most growers can still manage their weed problems with currently available selective and HR crop-enabled herbicides. However, current crop management systems are in jeopardy given the pace at which weed populations are evolving glyphosate resistance. New HR crop technologies will expand the utility of currently available herbicides and enable new interim solutions for growers to manage HR weeds, but will not replace the long-term need to diversify weed management tactics and discover herbicides with new modes of action. This paper reviews the strengths and weaknesses of anticipated weed management options and the best management practices that growers need to implement in HR crops to maximize the long-term benefits of current technologies and reduce weed shifts to difficult-to-control and HR weeds

    Global surveillance of cancer survival 1995-2009: analysis of individual data for 25,676,887 patients from 279 population-based registries in 67 countries (CONCORD-2)

    Get PDF
    BACKGROUND: Worldwide data for cancer survival are scarce. We aimed to initiate worldwide surveillance of cancer survival by central analysis of population-based registry data, as a metric of the effectiveness of health systems, and to inform global policy on cancer control. METHODS: Individual tumour records were submitted by 279 population-based cancer registries in 67 countries for 25·7 million adults (age 15-99 years) and 75,000 children (age 0-14 years) diagnosed with cancer during 1995-2009 and followed up to Dec 31, 2009, or later. We looked at cancers of the stomach, colon, rectum, liver, lung, breast (women), cervix, ovary, and prostate in adults, and adult and childhood leukaemia. Standardised quality control procedures were applied; errors were corrected by the registry concerned. We estimated 5-year net survival, adjusted for background mortality in every country or region by age (single year), sex, and calendar year, and by race or ethnic origin in some countries. Estimates were age-standardised with the International Cancer Survival Standard weights. FINDINGS: 5-year survival from colon, rectal, and breast cancers has increased steadily in most developed countries. For patients diagnosed during 2005-09, survival for colon and rectal cancer reached 60% or more in 22 countries around the world; for breast cancer, 5-year survival rose to 85% or higher in 17 countries worldwide. Liver and lung cancer remain lethal in all nations: for both cancers, 5-year survival is below 20% everywhere in Europe, in the range 15-19% in North America, and as low as 7-9% in Mongolia and Thailand. Striking rises in 5-year survival from prostate cancer have occurred in many countries: survival rose by 10-20% between 1995-99 and 2005-09 in 22 countries in South America, Asia, and Europe, but survival still varies widely around the world, from less than 60% in Bulgaria and Thailand to 95% or more in Brazil, Puerto Rico, and the USA. 
For cervical cancer, national estimates of 5-year survival range from less than 50% to more than 70%; regional variations are much wider, and improvements between 1995-99 and 2005-09 have generally been slight. For women diagnosed with ovarian cancer in 2005-09, 5-year survival was 40% or higher only in Ecuador, the USA, and 17 countries in Asia and Europe. 5-year survival for stomach cancer in 2005-09 was high (54-58%) in Japan and South Korea, compared with less than 40% in other countries. By contrast, 5-year survival from adult leukaemia in Japan and South Korea (18-23%) is lower than in most other countries. 5-year survival from childhood acute lymphoblastic leukaemia is less than 60% in several countries, but as high as 90% in Canada and four European countries, which suggests major deficiencies in the management of a largely curable disease. INTERPRETATION: International comparison of survival trends reveals very wide differences that are likely to be attributable to differences in access to early diagnosis and optimum treatment. Continuous worldwide surveillance of cancer survival should become an indispensable source of information for cancer patients and researchers and a stimulus for politicians to improve health policy and health-care systems
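The age-standardisation step mentioned in the methods (weighting age-specific net survival with International Cancer Survival Standard weights) reduces to a fixed weighted average. A minimal sketch; the weights and survival figures below are illustrative placeholders, not the published ICSS weights or CONCORD-2 estimates:

```python
def age_standardised_survival(survival_by_age, weights):
    """Weighted average of age-specific 5-year net survival estimates.
    Weights must sum to 1 across the age bands."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(s * w for s, w in zip(survival_by_age, weights))

# Hypothetical net survival for five age bands (proportions surviving 5 years)
# and placeholder standard weights summing to 1.
survival = [0.90, 0.85, 0.75, 0.60, 0.40]
weights = [0.07, 0.12, 0.23, 0.29, 0.29]
print(age_standardised_survival(survival, weights))
```

Using a common set of weights is what makes survival comparable across registries whose patient populations have different age structures.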

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

    Get PDF
Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). 
The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001). Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication
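The group incidences quoted above follow directly from the reported counts; a quick arithmetic check (counts copied from the abstract):

```python
# Reported SSI counts and group sizes by Human Development Index stratum.
ssi_counts = {
    "high-HDI": (691, 7339),
    "middle-HDI": (549, 3918),
    "low-HDI": (298, 1282),
}
for group, (ssi, n) in ssi_counts.items():
    print(f"{group}: {100 * ssi / n:.1f}% SSI incidence")
```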

    Finishing the euchromatic sequence of the human genome

    Get PDF
The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead

    Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries

    Get PDF
Background: Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. Methods: This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain measured on a numerical analogue scale from 0 to 100% and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. Results: The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (beta coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468) compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. Conclusion: Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely