
    Can we save large carnivores without losing large carnivore science?

    Large carnivores are often depicted as shaping entire ecosystems through top-down processes. Studies describing these processes are often used to support interventionist wildlife management practices, including carnivore reintroduction or lethal control programs. Unfortunately, there is an increasing tendency to ignore, disregard or devalue fundamental principles of the scientific method when communicating the reliability of current evidence for the ecological roles that large carnivores may play, eroding public confidence in large carnivore science and scientists. Here, we discuss six interrelated issues that currently undermine the reliability of the available literature on the ecological roles of large carnivores: (1) the overall paucity of available data, (2) the reliability of carnivore population sampling techniques, (3) a general disregard for alternative hypotheses to top-down forcing, (4) a lack of applied science studies, (5) the frequent use of logical fallacies, and (6) the generalisation of results from relatively pristine systems to those substantially altered by humans. We first describe how widespread these issues are and then show, for example, that the evidence for the roles of wolves (Canis lupus) and dingoes (Canis lupus dingo) in initiating trophic cascades is not as strong as is often claimed. Managers and policy makers should exercise caution when relying on this literature to inform wildlife management decisions. We emphasise the value of manipulative experiments and discuss the role of scientific knowledge in the decision-making process. We hope that the issues raised here prompt deeper consideration of the actual evidence, leading to improvements in both the rigour and the communication of large carnivore science.

    The shellfish enigma across the Mesolithic-Neolithic transition in southern Scandinavia

    The well-known and widespread replacement of oysters (abundant during the Mesolithic period) by cockles and mussels in many Danish Stone Age shell middens ca. 5900 cal yrs BP coincides with the transition to agriculture in southern Scandinavia. This shift in human resource use is commonly believed to reflect changing resource availability driven by environmental and/or climatic change at the Mesolithic-Neolithic transition, rather than cultural choice. While several hypotheses have been proposed to explain the “Mesolithic-Neolithic oyster decline”, an explanation based on a sudden freshening of the inner Danish waters has received the most attention. Here, for the first time, we test and refute this long-standing hypothesis that declining salinity explains the marked reduction in oysters in numerous shell middens across coastal Denmark at the Mesolithic-Neolithic transition, using quantitative and qualitative salinity inferences from several independent proxies (diatoms, molluscs and foraminifera) at multiple Danish fjord sites. Instead, we attribute the oyster decline to other environmental causes (particularly changing sedimentation), ultimately driven by external climatic forcing. Critical application of such high-quality environmental archives can reinvigorate archaeological debates and can aid in understanding and managing environmental change in increasingly impacted coastal regions.

    A crossover randomised controlled trial of oral mandibular advancement devices for obstructive sleep apnoea-hypopnoea (TOMADO)

    Rationale: Mandibular advancement devices (MADs) are used to treat obstructive sleep apnoea-hypopnoea syndrome (OSAHS), but evidence is lacking regarding their clinical and cost-effectiveness in less severe disease. Objectives: To compare the clinical and cost-effectiveness of a range of MADs against no treatment in mild to moderate OSAHS. Measurements and methods: This open-label, randomised, controlled, crossover trial was undertaken at a UK sleep centre. Adults with an Apnoea-Hypopnoea Index (AHI) of 5–<30/h and an Epworth Sleepiness Scale (ESS) score ≥9 underwent 6 weeks of treatment with each of three non-adjustable MADs: self-moulded (SleepPro 1; SP1), semi-bespoke (SleepPro 2; SP2) and fully bespoke (bMAD), plus 4 weeks of no treatment. The primary outcome was AHI scored by a polysomnographer blinded to treatment. Secondary outcomes included ESS, quality of life, resource use and cost. Main results: 90 patients were randomised and 83 were analysed. All devices reduced AHI compared with no treatment: by 26% (95% CI 11% to 38%, p=0.001) for SP1, 33% (95% CI 24% to 41%) for SP2 and 36% (95% CI 24% to 45%, p<0.001) for bMAD. ESS was 1.51 (95% CI 0.73 to 2.29, SP1) to 2.37 (95% CI 1.53 to 3.22, bMAD) points lower than with no treatment (p<0.001 for all). Compliance was lower for SP1, which was the least preferred treatment at trial exit. All devices were cost-effective compared with no treatment at a £20 000/quality-adjusted life year (QALY) threshold; SP2 was the most cost-effective up to £39 800/QALY. Conclusions: Non-adjustable MADs achieve clinically important improvements in mild to moderate OSAHS and are cost-effective.
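The cost-effectiveness claims above rest on a simple decision rule: an intervention is considered cost-effective when its incremental cost-effectiveness ratio (ICER) falls below the willingness-to-pay threshold (£20 000/QALY in the abstract). A minimal sketch of that rule, using entirely hypothetical numbers rather than trial results:

```python
def icer(cost_new: float, cost_ref: float, qaly_new: float, qaly_ref: float) -> float:
    """Incremental cost per QALY gained versus the reference strategy."""
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

THRESHOLD = 20_000  # GBP per QALY, the threshold cited in the abstract

# hypothetical device vs no treatment: £600 extra cost, 0.05 QALYs gained
ratio = icer(cost_new=600.0, cost_ref=0.0, qaly_new=0.05, qaly_ref=0.0)
print(f"ICER = £{ratio:,.0f}/QALY -> cost-effective: {ratio <= THRESHOLD}")
```

The comparison of SP2 "up to £39 800/QALY" follows the same logic with the threshold varied: a device remains the preferred option for any willingness-to-pay value at which its ICER against the next-best alternative is still below the threshold.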

    Large carnivore expansion in Europe is associated with human population density and land cover changes

    Aim: The recent recovery of large carnivores in Europe has been explained as resulting from a decrease in human persecution driven by widespread rural land abandonment, paralleled by an increase in forest cover and the consequent increase in the availability of shelter and prey. We investigated whether land cover and human population density changes are related to the relative probability of occurrence of three European large carnivores: the grey wolf (Canis lupus), the Eurasian lynx (Lynx lynx) and the brown bear (Ursus arctos). Location: Europe, west of 64° longitude. Methods: We fitted multi-temporal species distribution models using >50,000 occurrence points with time series of land cover, landscape configuration, protected areas, hunting regulations and human population density covering a 24-year period (1992–2015). Within this temporal window, we then predicted changes in habitat suitability for large carnivores throughout Europe. Results: Between 1992 and 2015, habitat suitability for the three species increased in Eastern Europe, the Balkans, the north-western Iberian Peninsula and northern Scandinavia, but showed mixed trends in Western and Southern Europe. These trends were primarily associated with increases in forest cover and decreases in human population density and, additionally, with decreases in the cover of mosaics of cropland and natural vegetation. Main conclusions: Recent land cover and human population changes appear to have altered the habitat suitability pattern for large carnivores in Europe, whereas protection level did not play a role. While the projected changes largely match the observed recovery of large carnivore populations, we found mismatches with the recent expansion of wolves in Central and Southern Europe, where factors not included in our models may have played a dominant role. This suggests that large carnivores' coexistence with humans in European landscapes is not limited by habitat availability but by other factors, such as human tolerance and favourable policy.
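The core of the multi-temporal approach described above is that a species distribution model fitted to occurrence data and environmental covariates at one point in time can be re-evaluated under later covariate values to track changes in habitat suitability. A minimal sketch with synthetic data and hypothetical covariates (forest cover and human density), not the authors' actual model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
# hypothetical covariates at sample locations, circa 1992
forest_1992 = rng.uniform(0, 1, n)        # forest cover fraction
humans_1992 = rng.lognormal(2, 1, n)      # people per km^2
X_1992 = np.column_stack([forest_1992, np.log1p(humans_1992)])

# synthetic presence (1) / background (0) labels: occurrence more likely
# in forested, sparsely populated cells (illustrative assumption)
logit = -1 + 3 * forest_1992 - 0.8 * np.log1p(humans_1992)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

sdm = LogisticRegression().fit(X_1992, y)

# multi-temporal prediction: re-evaluate suitability under 2015 covariates
forest_2015 = np.clip(forest_1992 + 0.1, 0, 1)   # forest cover increase
humans_2015 = humans_1992 * 0.9                  # population decrease
X_2015 = np.column_stack([forest_2015, np.log1p(humans_2015)])

suit_1992 = sdm.predict_proba(X_1992)[:, 1]
suit_2015 = sdm.predict_proba(X_2015)[:, 1]
print(f"mean habitat suitability: 1992={suit_1992.mean():.2f}, "
      f"2015={suit_2015.mean():.2f}")
```

With more forest and fewer people, the fitted model predicts higher mean suitability in 2015, mirroring the direction of the trends the abstract reports for Eastern Europe and northern Scandinavia.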

    Grid-Adapted FUN3D Computations for the Second High Lift Prediction Workshop

    Contributions of the unstructured Reynolds-averaged Navier-Stokes code FUN3D to the 2nd AIAA CFD High Lift Prediction Workshop are described, and detailed comparisons are made with experimental data. Using workshop-supplied grids, results for the clean wing configuration are compared with results from the structured code CFL3D. With the same turbulence model, both codes agree reasonably well in terms of total forces and moments, and both similarly over-predict maximum lift compared to experiment. By including more representative geometry features, such as slat and flap brackets and slat pressure tube bundles, FUN3D captures the general effects of Reynolds number variation but under-predicts maximum lift on workshop-supplied grids relative to the experimental data, due to excessive separation. However, when output-based, off-body grid adaptation is employed in FUN3D, results improve considerably. In particular, when the geometry includes both the brackets and the pressure tube bundles, grid adaptation yields a more accurate prediction of lift near stall in comparison with the wind-tunnel data. Furthermore, a rotation-corrected turbulence model shows improved pressure predictions on the outboard span when adapted grids are used.

    Can forest management based on natural disturbances maintain ecological resilience?

    Given the increasingly global stresses on forests, many ecologists argue that managers must maintain ecological resilience: the capacity of ecosystems to absorb disturbances without undergoing fundamental change. In this review we ask: can the emerging paradigm of natural-disturbance-based management (NDBM) maintain ecological resilience in managed forests? Applying resilience theory requires careful articulation of the ecosystem state under consideration, the disturbances and stresses that affect the persistence of possible alternative states, and the spatial and temporal scales of management relevance. Implementing NDBM while maintaining resilience means recognizing that (i) biodiversity is important for long-term ecosystem persistence, (ii) natural disturbances play a critical role in generating structural and compositional heterogeneity at multiple scales, and (iii) traditional management tends to produce forests more homogeneous than naturally disturbed ones and, by constraining the variation of key environmental processes, increases the likelihood of unexpected catastrophic change. NDBM may maintain resilience if silvicultural strategies retain the structures and processes that perpetuate desired states while reducing those that enhance the resilience of undesirable states. Such strategies require an understanding of harvesting impacts on slow ecosystem processes, such as seed-bank or nutrient dynamics, which in the long term can lead to ecological surprises by altering the forest's capacity to reorganize after disturbance.

    Modeling factors influencing the demand for emergency department services in Ontario: a comparison of methods

    Background: Emergency departments are medical treatment facilities designed to provide episodic care to patients suffering from acute injuries and illnesses, as well as to patients experiencing sporadic flare-ups of underlying chronic medical conditions that require immediate attention. Supply and demand for emergency department services vary across geographic regions and over time: some persons do not rely on the service at all, whereas others use it on repeated occasions. Increased wait times and crowding illustrate the need to investigate which factors are associated with increased frequency of emergency department utilization. Evidence from this study can help inform policy makers on the appropriate mix of supply- and demand-targeted health care policies needed to ensure that patients receive appropriate health care delivery in an efficient and cost-effective manner. The purpose of this report is to assess the factors driving increased demand for emergency department services in Ontario. We assess how utilization rates vary with the severity of patient presentation in the emergency department, and we are specifically interested in the impact that access to primary care physicians has on demand. Additionally, we investigate these trends using a series of regression models for count outcomes that have yet to be employed in emergency medical research. Methods: Data on the frequency of emergency department visits for respondents of the Canadian Community Health Survey (CCHS) during the study interval (2003-2005) were obtained from the National Ambulatory Care Reporting System (NACRS). Patients' emergency department utilizations were linked with CCHS data, which provide individual-level medical, socio-demographic, psychological and behavioral information for investigating predictors of increased utilization. Six multiple regression models for count data were fitted to assess the influence of predictors on demand: Poisson, negative binomial, zero-inflated Poisson, zero-inflated negative binomial, hurdle Poisson, and hurdle negative binomial. Competing models were compared using the Vuong test statistic. Results: CCHS cycle 2.1 respondents were a roughly equal mix of males (50.4%) and females (49.6%). The majority (86.2%) were young to middle-aged adults (20-64 years), living in predominantly urban environments (85.9%), with mid-to-high household incomes (92.2%) and at least a high-school diploma (84.1%). Many participants reported no chronic disease (51.9%), fell into a small number (0-5) of ambulatory diagnostic groups (62.3%), and perceived their health status as good or excellent (88.1%); however, 68.2% were projected to have high Resource Utilization Band levels of health resource utilization. These characteristics were largely stable for CCHS cycle 3.1 respondents. Factors influencing demand for emergency department services varied with the severity of triage scores at initial presentation. For example, although a non-significant predictor in high-severity cases, access to a primary care physician was a statistically significant predictor of both the odds (OR: 0.69; 95% CI: 0.63-0.75) and the rate (RR: 0.57; 95% CI: 0.50-0.66) of emergency department utilization in low-severity cases. Conclusion: Using a theoretically appropriate hurdle negative binomial regression model, this study illustrates that access to a primary care physician is an important predictor of both the odds and the rate of emergency department utilization in Ontario. Restructuring primary care services to increase access for undersupplied populations may reduce emergency department utilization rates by approximately 43% for low-severity triage cases.