
    EquiFACS: the Equine Facial Action Coding System

    Although previous studies of horses have investigated their facial expressions in specific contexts, e.g. pain, until now there has been no methodology available that documents all the possible facial movements of the horse and provides a way to record all potential facial configurations. This is essential for an objective description of horse facial expressions across a range of contexts that reflect different emotional states. Facial Action Coding Systems (FACS) provide a systematic methodology for identifying and coding facial expressions on the basis of the underlying facial musculature and muscle movements. FACS are anatomically based and document all possible facial movements rather than a configuration of movements associated with a particular situation; consequently, FACS can be applied as a tool for a wide range of research questions. We developed a FACS for the domestic horse (Equus caballus) through anatomical investigation of the underlying musculature and subsequent analysis of naturally occurring behaviour captured on high-quality video. Discrete facial movements were identified and described in terms of the underlying muscle contractions, in correspondence with previous FACS systems. Reliability was high when others learned this system (EquiFACS) and coded behavioural sequences, and this included people with no previous experience of horses. A wide range of facial movements were identified, including many that are also seen in primates and other domestic animals (dogs and cats). EquiFACS provides a method that can now be used to document the facial movements associated with different social contexts and thus to address questions relevant to understanding social cognition and comparative psychology, as well as informing current veterinary and animal welfare practices.

    Degradation of metalaxyl and folpet by filamentous fungi isolated from Portuguese (Alentejo) vineyard soils

    Degradation of xenobiotics by microbial populations is a potential method to enhance the effectiveness of ex situ or in situ bioremediation. The purpose of this study was to evaluate the impact of repeated metalaxyl and folpet treatments on soil microbial communities and to select soil fungal strains able to degrade these fungicides. Results showed enhanced degradation of metalaxyl and folpet in vineyard soils submitted to repeated treatments with these fungicides. Indeed, the greatest degradation ability was observed in vineyard soil samples submitted to greater numbers of treatments. Respiration activities, as determined in the presence of selective antibiotics in soil suspensions amended with metalaxyl and folpet, showed that the fungal population was the microbiota community most active in the degradation process. Batch cultures performed with a progressive increase of fungicide concentrations allowed the selection of five tolerant fungal strains: Penicillium sp. 1 and Penicillium sp. 2, mycelia sterilia 1 and 3, and Rhizopus stolonifer. Among these strains, mycelium sterilia 3 and R. stolonifer were present only in vineyard soils treated with repeated applications of these fungicides and showed tolerance >1,000 mg l−1 against commercial formulations of metalaxyl (10 %) plus folpet (40 %). Using specific methods for inducing sporulation, mycelium sterilia 3 was identified as Gongronella sp. Because this fungus is rare, it was compared using csM13-polymerase chain reaction (PCR) with the two known species, Gongronella butleri and G. lacrispora. The high tolerance to metalaxyl and folpet shown by Gongronella sp. and R. stolonifer might be correlated with their degradation ability. Our results indicate that the selected strains have potential for the bioremediation of metalaxyl and folpet in polluted soil sites.

    Socioeconomic inequalities in cancer survival in Scotland 1986–2000

    We analysed trends in 5-year survival of the 18 commonest cancers in Scotland diagnosed between 1986 and 2000 and followed up to 2004 in each of five deprivation groups based on patients' postcode of residence at diagnosis. We estimated relative survival up to 5 years after diagnosis, adjusting for the different background mortality in each deprivation group by age, sex and calendar period. We estimated trends in overall survival and in the deprivation gap in survival up to 2004. Five-year survival improved for all malignancies except bladder cancer, and improvement was associated with a widening of the deprivation gap in survival. For 25 of 30 cancer–sex combinations examined, 5-year survival was lower among more deprived patients diagnosed during 1996–2000, and the deprivation gap in survival had widened since 1986–1990 for 15 of these 25 cancers, similar to the trends seen in England and Wales.
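    The relative survival measure used in this abstract is the ratio of observed all-cause survival in the patient group to the expected survival of a comparable general-population group drawn from matched life tables. A minimal sketch of that ratio, with illustrative numbers rather than the study's deprivation-specific life tables:

    ```python
    def relative_survival(observed: float, expected: float) -> float:
        """Relative survival: observed all-cause survival in patients
        divided by expected survival from life tables matched on
        age, sex, calendar period (and here, deprivation group)."""
        if not 0.0 < expected <= 1.0:
            raise ValueError("expected survival must be in (0, 1]")
        return observed / expected

    # Illustrative only: 60% of patients alive at 5 years, against
    # 90% expected survival in the matched general population.
    rs = relative_survival(0.60, 0.90)  # ~0.667, i.e. 66.7% relative survival
    ```

    Comparing this ratio across deprivation groups, rather than raw survival, is what lets the analysis separate cancer-related survival differences from the higher background mortality of more deprived populations.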

    Restructuring of Pancreatic Islets and Insulin Secretion in a Postnatal Critical Window

    Function and structure of adult pancreatic islets are determined by early postnatal development, which in rats corresponds to the first month of life. We analyzed changes in blood glucose and hormones during this stage and their association with morphological and functional changes of alpha and beta cell populations during this period. At day 20 (d20), insulin and glucose plasma levels were two- and six-fold higher, respectively, as compared to d6. Interestingly, this period is characterized by physiological hyperglycemia and hyperinsulinemia, where peripheral insulin resistance and a high plasma concentration of glucagon are also observed. These functional changes were paralleled by reorganization of islet structure, cell mass and aggregate size of alpha and beta cells. Cultured beta cells from d20 secreted the same amount of insulin in 15.6 mM as in 5.6 mM glucose (basal conditions), and were characterized by a high basal insulin secretion. However, beta cells from d28 were already glucose sensitive. Understanding and establishing morphophysiological relationships in the developing endocrine pancreas may explain how events in early life are important in determining adult islet physiology and metabolism.

    Global and regional burden of chronic respiratory disease in 2016 arising from non-infectious airborne occupational exposures: a systematic analysis for the Global Burden of Disease Study 2016

    OBJECTIVES: This paper presents a detailed analysis of the global and regional burden of chronic respiratory disease arising from occupational airborne exposures, as estimated in the Global Burden of Disease 2016 study. METHODS: The burden of chronic obstructive pulmonary disease (COPD) due to occupational exposure to particulate matter, gases and fumes, and secondhand smoke, and the burden of asthma resulting from occupational exposure to asthmagens, was estimated using the population attributable fraction (PAF), calculated using exposure prevalence and relative risks from the literature. PAFs were applied to the number of deaths and disability-adjusted life years (DALYs) for COPD and asthma. Pneumoconioses were estimated directly from cause of death data. Age-standardised rates were based only on persons aged 15 years and above. RESULTS: The estimated PAFs (based on DALYs) were 17% (95% uncertainty interval (UI) 14%-20%) for COPD and 10% (95% UI 9%-11%) for asthma. There were estimated to be 519,000 (95% UI 441,000-609,000) deaths from chronic respiratory disease in 2016 due to occupational airborne risk factors (COPD: 460,100 [95% UI 382,000-551,000]; asthma: 37,600 [95% UI 28,400-47,900]; pneumoconioses: 21,500 [95% UI 17,900-25,400]). The equivalent overall burden estimate was 13.6 million (95% UI 11.9-15.5 million) DALYs (COPD: 10.7 [95% UI 9.0-12.5] million; asthma: 2.3 [95% UI 1.9-2.9] million; pneumoconioses: 0.58 [95% UI 0.46-0.67] million). Rates were highest in males and older persons, mainly in Oceania, Asia and sub-Saharan Africa, and decreased from 1990 to 2016. CONCLUSIONS: Workplace exposures resulting in COPD, asthma and pneumoconiosis continue to be important contributors to the burden of disease in all regions of the world. This should be reducible through improved prevention and control of relevant exposures.
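    The population attributable fraction used in this abstract is conventionally computed with Levin's formula from exposure prevalence and relative risk, then multiplied into total deaths or DALYs. A minimal sketch of that calculation; the prevalence and relative-risk values below are illustrative, not the study's:

    ```python
    def paf(prevalence: float, relative_risk: float) -> float:
        """Levin's population attributable fraction:
        PAF = p(RR - 1) / (p(RR - 1) + 1),
        where p is exposure prevalence and RR the relative risk."""
        excess = prevalence * (relative_risk - 1.0)
        return excess / (excess + 1.0)

    def attributable_burden(total_burden: float, prevalence: float,
                            relative_risk: float) -> float:
        """Apply the PAF to total deaths or DALYs for the outcome."""
        return paf(prevalence, relative_risk) * total_burden

    # Illustrative numbers only: 30% of workers exposed, RR = 1.7.
    fraction = paf(0.30, 1.7)  # ~0.17, i.e. ~17% of the burden attributable
    ```

    In the study's framework, a PAF of 17% for COPD means roughly one sixth of COPD DALYs are attributed to the occupational exposures considered, under the prevalence and relative-risk inputs drawn from the literature.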

    Health sector spending and spending on HIV/AIDS, tuberculosis, and malaria, and development assistance for health: progress towards Sustainable Development Goal 3

    Background: Sustainable Development Goal (SDG) 3 aims to “ensure healthy lives and promote well-being for all at all ages”. While a substantial effort has been made to quantify progress towards SDG3, less research has focused on tracking spending towards this goal. We used spending estimates to measure progress in financing the priority areas of SDG3, examine the association between outcomes and financing, and identify where resource gains are most needed to achieve the SDG3 indicators for which data are available. Methods: We estimated domestic health spending, disaggregated by source (government, out-of-pocket, and prepaid private) from 1995 to 2017 for 195 countries and territories. For disease-specific health spending, we estimated spending for HIV/AIDS and tuberculosis for 135 low-income and middle-income countries, and malaria in 106 malaria-endemic countries, from 2000 to 2017. We also estimated development assistance for health (DAH) from 1990 to 2019, by source, disbursing development agency, recipient, and health focus area, including DAH for pandemic preparedness. Finally, we estimated future health spending for 195 countries and territories from 2018 until 2030. We report all spending estimates in inflation-adjusted 2019 US$, unless otherwise stated. Findings: Since the development and implementation of the SDGs in 2015, global health spending has increased, reaching $7·9 trillion (95% uncertainty interval 7·8–8·0) in 2017, and is expected to increase to $11·0 trillion (10·7–11·2) by 2030. In 2017, in low-income and middle-income countries spending on HIV/AIDS was $20·2 billion (17·0–25·0) and on tuberculosis it was $10·9 billion (10·3–11·8), and in malaria-endemic countries spending on malaria was $5·1 billion (4·9–5·4). Development assistance for health was $40·6 billion in 2019, and HIV/AIDS has been the health focus area to receive the highest contribution since 2004. In 2019, $374 million of DAH was provided for pandemic preparedness, less than 1% of DAH. Although spending has increased across HIV/AIDS, tuberculosis, and malaria since 2015, spending has not increased in all countries, and outcomes in terms of prevalence, incidence, and per-capita spending have been mixed. The proportion of health spending from pooled sources is expected to increase from 81·6% (81·6–81·7) in 2015 to 83·1% (82·8–83·3) in 2030. Interpretation: Health spending on SDG3 priority areas has increased, but not in all countries, and progress towards meeting the SDG3 targets has been mixed and has varied by country and by target. The evidence on the scale-up of spending and improvements in health outcomes suggests a nuanced relationship, such that increases in spending do not always result in improvements in outcomes. Although countries will probably need more resources to achieve SDG3, other constraints in the broader health system, such as inefficient allocation of resources across interventions and populations, weak governance systems, human resource shortages, and drug shortages, will also need to be addressed. Funding: The Bill & Melinda Gates Foundation.

    Estimating global injuries morbidity and mortality: methods and data used in the Global Burden of Disease 2017 study

    BACKGROUND: While there is a long history of measuring death and disability from injuries, modern research methods must account for the wide spectrum of disability that can result from an injury, and must provide estimates with sufficient demographic, geographical and temporal detail to be useful for policy makers. The Global Burden of Disease (GBD) 2017 study used methods to provide highly detailed estimates of global injury burden that meet these criteria. METHODS: In this study, we report and discuss the methods used in GBD 2017 for injury morbidity and mortality burden estimation. In summary, these methods included estimating cause-specific mortality for every cause of injury, and then estimating incidence for every cause of injury. Non-fatal disability for each cause was then calculated based on the probabilities of suffering the different types of bodily injury experienced. RESULTS: GBD 2017 produced morbidity and mortality estimates for 38 causes of injury. Estimates were produced in terms of incidence, prevalence, years lived with disability, cause-specific mortality, years of life lost and disability-adjusted life-years for a 28-year period for 22 age groups, 195 countries and both sexes. CONCLUSIONS: GBD 2017 demonstrated a complex and sophisticated series of analytical steps using the largest known database of morbidity and mortality data on injuries. GBD 2017 results should be used to help inform injury prevention policy making and resource allocation. We also identify important avenues for improving injury burden estimation in the future.

    Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.

    Background: Acute illness, existing co-morbidities and the surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was to prospectively develop a pragmatic prognostic model to stratify patients according to their risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors into the model. Internal model validation was carried out by bootstrap validation. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with increasing odds of death with each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery, and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability.
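    The c-statistic reported for the model above (0·65) is the probability that a randomly chosen patient who developed AKI received a higher predicted risk than a randomly chosen patient who did not. A minimal sketch of that calculation on hypothetical predicted risks (not the study's data or model coefficients):

    ```python
    from itertools import product

    def c_statistic(risks, outcomes):
        """Concordance (c-)statistic: the fraction of all
        event/non-event pairs in which the event case has the
        higher predicted risk; ties count as 0.5."""
        events = [r for r, y in zip(risks, outcomes) if y == 1]
        non_events = [r for r, y in zip(risks, outcomes) if y == 0]
        pairs = list(product(events, non_events))
        if not pairs:
            raise ValueError("need at least one event and one non-event")
        score = sum(1.0 if e > n else 0.5 if e == n else 0.0
                    for e, n in pairs)
        return score / len(pairs)

    # Hypothetical predicted AKI risks and observed outcomes (1 = AKI).
    risks = [0.05, 0.20, 0.10, 0.40]
    outcomes = [0, 1, 0, 1]
    print(c_statistic(risks, outcomes))  # 1.0: both AKI cases ranked above both non-cases
    ```

    A value of 0·5 corresponds to chance-level discrimination and 1·0 to perfect ranking, so the study's 0·65 indicates modest but useful separation of high- and low-risk patients.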