    How to make complexity look simple? Conveying ecosystems restoration complexity for socio-economic research and public engagement

    Ecosystem degradation represents one of the major global challenges of the present time, threatening people’s livelihoods and well-being worldwide. Ecosystem restoration therefore seems no longer an option, but an imperative. Restoration challenges are such that a dialogue has begun on the need to re-shape restoration as a science. A critical aspect of that reshaping process is the acceptance that restoration science and practice need to be coupled with socio-economic research and public engagement. This inescapably means conveying complex ecosystem information in a way that is accessible to the wider public. In this paper we take up this challenge with the ultimate aim of contributing to making a step change in science’s contribution to ecosystem restoration practice. Using peatlands as a paradigmatically complex ecosystem, we put in place a transdisciplinary process to articulate a description of the processes and outcomes of restoration that can be widely understood by the public. We provide evidence of the usefulness of the process and tools in addressing four key challenges relevant to the restoration of any complex ecosystem: (1) how to represent restoration outcomes; (2) how to establish a restoration reference; (3) how to cope with varying restoration time-lags; and (4) how to define spatial units for restoration. This evidence includes the way the process resulted in the creation of materials that are now being used by restoration practitioners for communication with the public and in other research contexts. Our main contribution is of an epistemological nature: while ecosystem services-based approaches have enhanced the integration of academic disciplines and non-specialist knowledge, this has so far followed only one direction (from the biophysical underpinning to the description of ecosystem services and their appreciation by the public). We propose that it is the mix of approaches and epistemological directions (including from the public to the biophysical parameters) that will make a definitive contribution to restoration practice.

    Multiperspective analysis of erosion tolerance

    Erosion tolerance is the most multidisciplinary field of soil erosion research. Scientists have shown a limited ability to adequately analyze the long list of variables that influence definitions of soil loss tolerance. To do so, the perspectives on erosion held by farmers, environmentalists, society, and politicians have to be considered simultaneously. Partial and biased definitions of erosion tolerance may explain not only the polemic nature of the currently suggested values but also, in part, the non-adoption of the desired levels of erosion control. To move towards a solution, considerable changes would have to occur in how this topic is investigated, especially among scientists, who would have to change methods and strategies and extend the perspective of research beyond the boundaries of physical processes and the frontiers of academia. More effective integration and communication with society and farmers, to learn about their perspective on erosion, and a multidisciplinary approach integrating soil, social, economic, and environmental sciences are essential for improved erosion tolerance definitions. In the opinion of the authors, soil erosion research is not moving in this direction, and a better understanding of erosion tolerance is not to be expected in the near future.

    Description and performance of track and primary-vertex reconstruction with the CMS tracker

    A description is provided of the software algorithms developed for the CMS tracker both for reconstructing charged-particle trajectories in proton-proton interactions and for using the resulting tracks to estimate the positions of the LHC luminous region and individual primary-interaction vertices. Despite the very hostile environment at the LHC, the performance obtained with these algorithms is found to be excellent. For tt̄ events under typical 2011 pileup conditions, the average track-reconstruction efficiency for promptly-produced charged particles with transverse momenta of pT > 0.9 GeV is 94% for pseudorapidities of |η| < 0.9 and 85% for 0.9 < |η| < 2.5. The inefficiency is caused mainly by hadrons that undergo nuclear interactions in the tracker material. For isolated muons, the corresponding efficiencies are essentially 100%. For isolated muons of pT = 100 GeV emitted at |η| < 1.4, the resolutions are approximately 2.8% in pT and, respectively, 10 μm and 30 μm in the transverse and longitudinal impact parameters. The position resolution achieved for reconstructed primary vertices that correspond to interesting pp collisions is 10–12 μm in each of the three spatial dimensions. The tracking and vertexing software is fast and flexible, and easily adaptable to other functions, such as fast tracking for the trigger, or dedicated tracking for electrons that takes into account bremsstrahlung.

    Red swamp crayfish: biology, ecology and invasion - an overview

    Measurement of vector boson production cross sections and their ratios using pp collisions at √s = 13.6 TeV with the ATLAS detector

    Abstract available from publisher's website

    Combination of searches for heavy spin-1 resonances using 139 fb−1 of proton-proton collision data at √s = 13 TeV with the ATLAS detector

    A combination of searches for new heavy spin-1 resonances decaying into different pairings of W, Z, or Higgs bosons, as well as directly into leptons or quarks, is presented. The data sample used corresponds to 139 fb−1 of proton-proton collisions at √s = 13 TeV collected during 2015–2018 with the ATLAS detector at the CERN Large Hadron Collider. Analyses selecting quark pairs (qq, bb, tt̄, and tb) or third-generation leptons (τν and ττ) are included in this kind of combination for the first time. A simplified model predicting a spin-1 heavy vector-boson triplet is used. Cross-section limits are set at the 95% confidence level and are compared with predictions for the benchmark model. These limits are also expressed in terms of constraints on couplings of the heavy vector-boson triplet to quarks, leptons, and the Higgs boson. The complementarity of the various analyses increases the sensitivity to new physics, and the resulting constraints are stronger than those from any individual analysis considered. The data exclude a heavy vector-boson triplet with mass below 5.8 TeV in a weakly coupled scenario, below 4.4 TeV in a strongly coupled scenario, and up to 1.5 TeV in the case of production via vector-boson fusion.