Improvement and evaluation of simulated global biogenic soil NO emissions in an AC-GCM
Biogenic NO emissions from soils (SNOx) play important direct and indirect roles in tropospheric chemistry. The most widely applied algorithm to calculate SNOx in global models was published 15 years ago by Yienger and Levy (1995) and was based on very few measurements. Since then, numerous new measurements have been published, which we used to build a compilation of worldwide field measurements covering the period from 1978 to 2010. Recently, several satellite-based top-down approaches, which recalculated the different sources of NOx (fossil fuel, biomass burning, soil and lightning), have shown an underestimation of SNOx by the algorithm of Yienger and Levy (1995). Nevertheless, to our knowledge no general improvements of this algorithm, besides suggested scalings of the total source magnitude, have yet been published. Here we present major improvements to the algorithm, which should help to optimize the representation of SNOx in atmospheric-chemistry global climate models, without modifying the underlying principles or mathematical equations. The changes include: (1) using a new landcover map, with twice the number of landcover classes, and using annually varying fertilizer application rates; (2) adopting a fraction of 1.0% for the applied fertilizer lost as NO, based on our compilation of measurements; (3) using the volumetric soil moisture to distinguish between the wet and dry states; and (4) adjusting the emission factors to reproduce the measured emissions in our compilation (based on either their geometric or arithmetic mean values). These steps lead to increased global annual SNOx, and our total above-canopy SNOx source of 8.6 Tg yr⁻¹ (using the geometric mean) ends up being close to one of the satellite-based top-down approaches (8.9 Tg yr⁻¹). The above-canopy SNOx source using the arithmetic mean is 27.6 Tg yr⁻¹, which is higher than all previous estimates, but compares better with a regional top-down study in eastern China. This suggests that both top-down and bottom-up approaches will be needed in future attempts to provide a better calculation of SNOx.
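To make items (2)-(4) concrete, the following minimal sketch shows how a Yienger & Levy-style scheme could switch between wet and dry emission factors using volumetric soil moisture, derive an emission factor from the geometric (or arithmetic) mean of compiled flux measurements, and add a 1.0% fertilizer-loss term. The function names, the soil-moisture threshold, and all numerical values are illustrative placeholders, not the actual parameters of the published algorithm.

```python
import numpy as np

# Illustrative sketch only (not the model code of the improved SNOx algorithm).

WET_THRESHOLD = 0.30  # hypothetical volumetric soil moisture (m3 m-3) separating wet and dry states


def emission_factor_from_measurements(fluxes_ng_n_m2_s, use_geometric_mean=True):
    """Derive a per-class emission factor from a compilation of measured NO fluxes."""
    fluxes = np.asarray(fluxes_ng_n_m2_s, dtype=float)
    if use_geometric_mean:
        return float(np.exp(np.mean(np.log(fluxes))))  # geometric mean
    return float(np.mean(fluxes))                      # arithmetic mean


def soil_no_flux(soil_moisture, ef_wet, ef_dry, fertilizer_n=0.0, fert_loss_fraction=0.01):
    """Toy soil NO flux: wet/dry emission factor plus 1.0% of applied fertilizer N lost as NO."""
    base = ef_wet if soil_moisture >= WET_THRESHOLD else ef_dry
    return base + fert_loss_fraction * fertilizer_n


# Example with made-up measurements (ng N m-2 s-1) for one land-cover class
ef = emission_factor_from_measurements([0.4, 1.2, 2.5, 0.8])
print(soil_no_flux(soil_moisture=0.22, ef_wet=0.5 * ef, ef_dry=ef, fertilizer_n=3.0))
```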
The Kompetenzteam Forschungsdaten (Research Data Competence Team) at JGU – A Cooperative Service
To offer researchers at Johannes Gutenberg University Mainz (JGU) a broad range of research data management (RDM) services, the Kompetenzteam Forschungsdaten was founded in the summer of 2018. For this purpose, the services and competencies of several JGU units were pooled: the Research and Technology Transfer office (FT), the University Library (UB), the Data Center (ZDV), and the Mainz Center for Digitality in the Humanities and Cultural Studies (mainzed). A total of five staff members from these units are currently active in the Kompetenzteam Forschungsdaten, with different focus areas and to varying extents.
Consulting service and workflow: The FT office was identified as the natural first point of contact for researchers on RDM at JGU, since researchers already turn to it for support with third-party funding proposals and can thus receive RDM advice before a project even starts. FT offers, for example, advice on RDM basics, organizational questions, and funders' requirements concerning RDM and open data. Depending on need and topic, specialized support is then provided by the partners in the competence team, e.g. at ZDV, UB, or mainzed. This close cooperation allows the team to offer comprehensive RDM expertise.
Training offerings: Introductory RDM training is offered both through university-internal continuing education programs (staff development, the General Doctoral Program, the ZDV training program) and on request, e.g. for research training groups or collaborative research centers. Specialized training, such as on digital methods in the humanities or on metadata, is held exclusively on request, since this allows the specific needs of individual groups to be addressed more effectively.
Technical infrastructure for publishing and archiving research data: The technical RDM infrastructure at JGU is currently being built up. The UB's existing publication repository, based on DSpace, is being extended into a repository for research data. The goal is to publish and permanently provide citable, freely reusable research data, to link them with publications, and to distribute the metadata to overarching discovery systems. At ZDV, an iRODS-based research data archive is being developed for JGU researchers; it is designed to store large volumes of data permanently and to integrate directly into project workflows. Data can be exported from iRODS via an API and published through the UB repository, as sketched below.
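As a rough illustration of the export step, the following sketch retrieves a data object from an iRODS archive with the python-irodsclient library before handing it to a separate repository ingest step. The host, zone, paths, credentials, and the deposit step are hypothetical placeholders and do not describe the actual JGU setup.

```python
# Hedged sketch: fetching a dataset from an iRODS archive prior to repository deposit.
from irods.session import iRODSSession

IRODS_PATH = "/exampleZone/home/project42/results/dataset_v1.zip"   # hypothetical path
LOCAL_PATH = "/tmp/dataset_v1.zip"

with iRODSSession(host="irods.example.org", port=1247,
                  user="alice", password="secret", zone="exampleZone") as session:
    obj = session.data_objects.get(IRODS_PATH, LOCAL_PATH)  # download the data object
    print(f"Fetched {obj.name}, {obj.size} bytes")

# A separate, repository-specific step (e.g. a DSpace REST ingest) would then publish
# LOCAL_PATH together with its descriptive metadata.
```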
Benchmarking carbon fluxes of the ISIMIP2a biome models
The purpose of this study is to evaluate the eight ISIMIP2a biome models against independent estimates of long-term net carbon fluxes (i.e. Net Biome Productivity, NBP) over terrestrial ecosystems for the recent four decades (1971–2010). We evaluate modeled global NBP against (1) the updated global residual land sink (RLS) plus land use emissions (E_LUC) from the Global Carbon Project (GCP; Le Quéré et al 2015), presented as R + L in this study, and (2) the land CO2 fluxes from two atmospheric inversion systems, Jena CarboScope s81_v3.8 and CAMS v15r2, referred to as F_Jena and F_CAMS respectively. The model ensemble-mean NBP (which includes seven models with land-use change) is higher than but within the uncertainty of R + L, while the simulated positive NBP trend over the last 30 years is lower than that from R + L and from the two inversion systems. The ISIMIP2a biome models capture the interannual variation of global net terrestrial ecosystem carbon fluxes well. Tropical NBP represents 31 ± 17% of global total NBP during the past decades, and the year-to-year variation of tropical NBP contributes most of the interannual variation of global NBP. According to the models, increasing Net Primary Productivity (NPP) was the main cause of the generally increasing NBP. Significant global NBP anomalies from the long-term mean between the two phases of El Niño Southern Oscillation (ENSO) events are simulated by all models (p < 0.05), which is consistent with the R + L estimate (p = 0.06), and are also mainly attributed to NPP anomalies rather than to changes in heterotrophic respiration (Rh). The global NPP and NBP anomalies during ENSO events are dominated by their anomalies in tropical regions impacted by tropical climate variability. Multiple regressions between the interannual variations of R + L, F_Jena and F_CAMS and tropical climate variations reveal a significant negative response of global net terrestrial ecosystem carbon fluxes to tropical mean annual temperature variation, and a non-significant response to tropical annual precipitation variation. In the models, by contrast, tropical precipitation is the more important driver, suggesting that some models do not capture the roles of precipitation and temperature changes adequately.
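The multiple-regression diagnostic described in the last two sentences can be sketched as follows: regress detrended interannual flux variations on tropical temperature and precipitation anomalies. The arrays below are random placeholders, not the R + L, Jena CarboScope, or CAMS data, and the coefficients are purely illustrative.

```python
import numpy as np

# Sketch of a multiple regression of flux anomalies on tropical climate anomalies.
rng = np.random.default_rng(0)
years = np.arange(1971, 2011)
temp_anom = rng.normal(size=years.size)      # tropical mean annual temperature anomaly (K)
precip_anom = rng.normal(size=years.size)    # tropical annual precipitation anomaly (mm)
nbp_anom = -0.8 * temp_anom + 0.1 * precip_anom + rng.normal(scale=0.5, size=years.size)

# Ordinary least squares: nbp_anom ~ a*T + b*P + c
X = np.column_stack([temp_anom, precip_anom, np.ones_like(temp_anom)])
a, b, c = np.linalg.lstsq(X, nbp_anom, rcond=None)[0]
print(f"temperature sensitivity a = {a:.2f}, precipitation sensitivity b = {b:.2f}")
```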
Tree mortality submodels drive simulated long-term forest dynamics: assessing 15 models from the stand to global scale
Models are pivotal for assessing future forest dynamics under the impacts of changing climate and management practices, incorporating representations of tree growth, mortality, and regeneration. Quantitative studies on the importance of mortality submodels are scarce. We evaluated 15 dynamic vegetation models (DVMs) regarding their sensitivity to different formulations of tree mortality under different degrees of climate change. The set of models comprised eight DVMs at the stand scale, three at the landscape scale, and four typically applied at the continental to global scale. Some incorporate empirically derived mortality submodels, others are based on experimental data, and still others rest on theoretical reasoning. Each DVM was run with at least two alternative mortality submodels. Model behavior was evaluated against empirical time series data, and then the models were subjected to different scenarios of climate change. Most DVMs matched empirical data quite well, irrespective of the mortality submodel that was used. However, mortality submodels that performed in a very similar manner against past data often led to sharply different trajectories of forest dynamics under future climate change. Most DVMs featured high sensitivity to the mortality submodel, with deviations of basal area and stem numbers on the order of 10–40% per century under current climate and 20–170% under climate change. The sensitivity of a given DVM to scenarios of climate change, however, was typically lower by a factor of two to three. We conclude that (1) mortality is one of the most uncertain processes when it comes to assessing forest response to climate change, and (2) more data and a better process understanding of tree mortality are needed to improve the robustness of simulated future forest dynamics. Our study highlights that comparing several alternative mortality formulations in DVMs provides valuable insights into the effects of process uncertainties on simulated future forest dynamics.
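The core finding, that mortality formulations which behave alike under past climate can diverge sharply under warming, can be illustrated with a toy stand-level simulation. This is not any of the 15 DVMs reviewed above; the growth rate, mortality rates, and warming scenario are invented placeholders.

```python
import numpy as np

# Toy illustration: two mortality submodels, identical historically, diverge under warming.

def simulate(basal_area, temp_anomalies, mortality_fn, growth=0.03):
    """Advance stand basal area one year at a time: relative growth minus mortality."""
    for t in temp_anomalies:
        basal_area *= 1.0 + growth - mortality_fn(t)
    return basal_area

mort_constant = lambda t: 0.015                              # climate-insensitive background rate
mort_climate = lambda t: 0.015 + 0.01 * max(0.0, t - 1.0)    # extra mortality above +1 K warming

historical = np.zeros(100)              # no warming: both submodels give the same trajectory
warming = np.linspace(0.0, 4.0, 100)    # +4 K over a century (placeholder scenario)

for name, fn in [("constant", mort_constant), ("climate-sensitive", mort_climate)]:
    print(f"{name:17s}: historical {simulate(30.0, historical, fn):5.1f} m2/ha,"
          f" warming {simulate(30.0, warming, fn):5.1f} m2/ha")
```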
Forest responses to last-millennium hydroclimate variability are governed by spatial variations in ecosystem sensitivity
Forecasts of future forest change are governed by ecosystem sensitivity to climate change, but ecosystem model projections are under-constrained by data at multidecadal and longer timescales. Here, we quantify ecosystem sensitivity to centennial-scale hydroclimate variability by comparing dendroclimatic and pollen-inferred reconstructions of drought, forest composition and biomass for the last millennium with five ecosystem model simulations. In both observations and models, spatial patterns in ecosystem responses to hydroclimate variability are strongly governed by ecosystem sensitivity rather than climate exposure. Ecosystem sensitivity was higher in models than observations and highest in simpler models. Model–data comparisons suggest that interactions among biodiversity, demography and ecophysiology processes dampen the sensitivity of forest composition and biomass to climate variability and change. Integrating ecosystem models with observations from timescales beyond the instrumental record can help us better understand and forecast the mechanisms regulating forest sensitivity to climate variability in a complex and changing world.
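One way to read the sensitivity-versus-exposure distinction is as follows: per site, exposure is the variability of the hydroclimate forcing, sensitivity is the regression slope of the ecosystem response on that forcing, and the response magnitude is roughly their product. The sketch below uses random placeholder data, not the pollen, tree-ring, or model output used in the study.

```python
import numpy as np

# Sketch of a sensitivity-vs-exposure decomposition with synthetic site data.
rng = np.random.default_rng(1)
n_sites, n_years = 20, 1000

site_exposure = rng.uniform(0.5, 2.0, n_sites)[:, None]        # forcing std dev per site
drought = rng.normal(size=(n_sites, n_years)) * site_exposure   # hydroclimate anomaly series
true_sensitivity = rng.uniform(0.1, 1.0, n_sites)[:, None]
biomass_anom = true_sensitivity * drought + rng.normal(scale=0.3, size=(n_sites, n_years))

exposure = drought.std(axis=1)                                             # climate exposure
sensitivity = (drought * biomass_anom).mean(axis=1) / drought.var(axis=1)  # OLS slope per site
response = biomass_anom.std(axis=1)

# Which factor organizes the spatial pattern of responses?
print("corr(response, sensitivity):", round(np.corrcoef(response, sensitivity)[0, 1], 2))
print("corr(response, exposure):   ", round(np.corrcoef(response, exposure)[0, 1], 2))
```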
Disciplinary law in the civil services: "unification", "harmonisation" or "distanciation". On the law of 26 April 2016 on the ethics, rights and obligations of civil servants
The production of tt̄, W+bb̄ and W+cc̄ is studied in the forward region of proton–proton collisions collected at a centre-of-mass energy of 8 TeV by the LHCb experiment, corresponding to an integrated luminosity of 1.98 ± 0.02 fb⁻¹. The W bosons are reconstructed in the decays W → ℓν, where ℓ denotes a muon or electron, while the b and c quarks are reconstructed as jets. All measured cross-sections are in agreement with next-to-leading-order Standard Model predictions.
Multidifferential study of identified charged hadron distributions in Z-tagged jets in proton-proton collisions at √s = 13 TeV
Jet fragmentation functions are measured for the first time in proton-proton collisions for charged pions, kaons, and protons within jets recoiling against a Z boson. The charged-hadron distributions are studied longitudinally and transversely to the jet direction for jets with transverse momentum above 20 GeV and within the pseudorapidity range covered by LHCb. The data sample was collected with the LHCb experiment at a center-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 1.64 fb⁻¹. Triple differential distributions as a function of the hadron longitudinal momentum fraction, hadron transverse momentum, and jet transverse momentum are also measured for the first time. This helps constrain transverse-momentum-dependent fragmentation functions. Differences in the shapes and magnitudes of the measured distributions for the different hadron species provide insights into the hadronization process for jets predominantly initiated by light quarks. All figures and tables, along with machine-readable versions and any supplementary material and additional information, are available at https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-013.html (LHCb public pages).
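For orientation, the longitudinal momentum fraction and the hadron momentum transverse to the jet axis used in such fragmentation measurements are commonly defined as below; these are the generic definitions and may differ in detail from the conventions adopted in the paper.

```latex
% Standard jet-fragmentation variables (generic definitions, not copied from the paper):
% z   - fraction of the jet momentum carried by the hadron along the jet axis
% j_T - hadron momentum transverse to the jet axis
\[
  z = \frac{\vec{p}_{\mathrm{jet}} \cdot \vec{p}_{h}}{|\vec{p}_{\mathrm{jet}}|^{2}},
  \qquad
  j_{T} = \frac{|\vec{p}_{\mathrm{jet}} \times \vec{p}_{h}|}{|\vec{p}_{\mathrm{jet}}|}.
\]
```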
Study of the decay
The decay is studied in proton-proton collisions using data corresponding to an integrated luminosity of 5 fb⁻¹ collected by the LHCb experiment. In this system, the state observed at the BaBar and Belle experiments is resolved into two narrower states, whose masses and widths are measured; the first uncertainties are statistical and the second systematic. The results are consistent with a previous LHCb measurement using a prompt sample. Evidence of a new state is found, with its local significance, mass, and width reported. In addition, evidence of a new decay mode is found, with its significance reported. The relative branching fraction with respect to the reference decay is measured, where the first uncertainty is statistical, the second systematic and the third originates from the branching fractions of charm hadron decays. All figures and tables, along with any supplementary material and additional information, are available at https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-028.html (LHCb public pages).
Measurement of the ratios of branching fractions R(D*) and R(D0)
The ratios of branching fractions are measured, assuming isospin symmetry, using a sample of proton-proton collision data corresponding to 3.0 fb⁻¹ of integrated luminosity recorded by the LHCb experiment during 2011 and 2012. The tau lepton is identified via its muonic decay mode. The measured values and their correlation are reported, where the first uncertainty is statistical and the second is systematic. The results are consistent with the current average of these quantities and are at a combined 1.9 standard deviations from the predictions based on lepton flavor universality in the Standard Model. All figures and tables, along with any supplementary material and additional information, are available at https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-039.html (LHCb public pages).
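The notion of a "combined 1.9 standard deviations" for two correlated measurements can be illustrated with a standard covariance chi-square construction. The sketch below uses invented placeholder values for the measurements, predictions, uncertainties, and correlation, not the values reported in the paper.

```python
import numpy as np
from scipy.stats import chi2, norm

# Hedged sketch: combining two correlated measurements' deviation from predictions
# into a single z-value via a chi-square with the full covariance matrix.
measured = np.array([0.30, 0.40])      # hypothetical ratio measurements
predicted = np.array([0.26, 0.34])     # hypothetical Standard Model predictions
sigma = np.array([0.03, 0.05])         # hypothetical total uncertainties
rho = -0.10                            # hypothetical correlation between the two

cov = np.array([[sigma[0] ** 2, rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1] ** 2]])
diff = measured - predicted
chisq = diff @ np.linalg.inv(cov) @ diff
p_value = chi2.sf(chisq, df=2)
print(f"combined deviation ~ {norm.isf(p_value):.1f} standard deviations")
```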
Observation of the B0 → ρ0ρ0 decay from an amplitude analysis of B0 → (π+π−)(π+π−) decays
Proton–proton collision data recorded in 2011 and 2012 by the LHCb experiment, corresponding to an integrated luminosity of 3.0 fb⁻¹, are analysed to search for the charmless B0 → ρ0ρ0 decay. More than 600 B0 → (π+π−)(π+π−) signal decays are selected and used to perform an amplitude analysis, under the assumption of no CP violation in the decay, from which the B0 → ρ0ρ0 decay is observed for the first time with 7.1 standard deviations significance. The fraction of B0 → ρ0ρ0 decays yielding a longitudinally polarised final state is measured to be fL = 0.745 (+0.048/−0.058) (stat) ± 0.034 (syst). The B0 → ρ0ρ0 branching fraction, using the B0 → ϕK*(892)0 decay as reference, is also reported as B(B0 → ρ0ρ0) = (0.94 ± 0.17 (stat) ± 0.09 (syst) ± 0.06 (BF)) × 10⁻⁶.
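The "using the B0 → ϕK*(892)0 decay as reference" step follows the generic form of a relative branching-fraction measurement; the schematic below is not the paper's exact expression, and the efficiency and branching-fraction factors are only indicated generically.

```latex
% Generic relative branching-fraction normalization (schematic, assumed form):
\[
  \mathcal{B}(B^0 \to \rho^0\rho^0)
  = \frac{N_{\rho^0\rho^0}}{N_{\mathrm{ref}}}
    \times \frac{\varepsilon_{\mathrm{ref}}}{\varepsilon_{\rho^0\rho^0}}
    \times \mathcal{B}_{\mathrm{ref}},
\]
% where N are signal yields, \varepsilon total efficiencies, and B_ref collects the
% reference and intermediate branching fractions (the source of the (BF) uncertainty).
```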