27 research outputs found

    Automation of NLO QCD and EW corrections with Sherpa and Recola

    This publication presents the combination of the one-loop matrix-element generator Recola with the multipurpose Monte Carlo program Sherpa. Since both programs are highly automated, the resulting Sherpa+Recola framework allows for the computation of, in principle, any Standard Model process at both NLO QCD and EW accuracy. To illustrate this, three representative LHC processes have been computed at NLO QCD and EW: vector-boson production in association with jets, off-shell Z-boson pair production, and the production of a top-quark pair in association with a Higgs boson. In addition to fixed-order computations, when considering QCD corrections, all functionalities of Sherpa, i.e. particle decays, QCD parton showers, hadronisation, underlying events, etc., can be used in combination with Recola. This is demonstrated by the merging and matching of one-loop QCD matrix elements for Drell-Yan production in association with jets to the parton shower. The implementation is fully automated, making it a practical tool for both experimentalists and theorists who want to use state-of-the-art predictions at NLO accuracy.
    Comment: 38 pages, 29 figures. Matches the published version (few typos corrected).
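    To give a flavour of what such an automated setup looks like in practice, a schematic Sherpa 2.x run-card fragment selecting an external one-loop provider might read as follows. This sketch is not taken from the paper: the process codes, the `LOOPGEN` tag, and the exact option names are indicative only, and the authoritative syntax is the Sherpa manual for the version in use.

```
(run){
  EVENTS 10000;
  ME_SIGNAL_GENERATOR Comix Amegic LOOPGEN;
  LOOPGEN:=Recola;
}(run)

(processes){
  Process 93 93 -> 11 -12 93{1};
  NLO_QCD_Mode MC@NLO;
  Loop_Generator LOOPGEN;
  Order (*,2);
  End process;
}(processes)
```

    Here `93` denotes the jet container, `11 -12` the e− Μ̄e final state, and `93{1}` allows up to one additional jet; the one-loop virtual amplitudes are delegated to Recola via the `Loop_Generator` setting.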

    MOSAiC goes O2A - Arctic Expedition Data Flow from Observations to Archives

    During the largest polar expedition in history, starting in September 2019, the German research icebreaker Polarstern spends a whole year drifting with the ice through the Arctic Ocean. The MOSAiC expedition takes the closest look ever at the Arctic, even throughout the polar winter, to gain fundamental insights and unique on-site data for a better understanding of global climate change. Hundreds of researchers from 20 countries are involved. Scientists all around the globe will use the in situ data both instantaneously, in near-real-time mode, and long afterwards, taking climate research to a new level. Proper data management, sampling strategies prepared beforehand, and monitoring of the actual data flow, as well as processing, analysis, and sharing of data during and long after the MOSAiC expedition, are therefore essential for scientific gain and progress. To prepare for that challenge we adapted and integrated the research data management framework O2A ("Data flow from Observations to Archives") to the needs of the MOSAiC expedition, both on board Polarstern and on land for data storage and access at the Alfred Wegener Institute Computing and Data Center in Bremerhaven, Germany. The O2A framework assembles a modular research infrastructure comprising a collection of tools and services. These components allow researchers to register all necessary sensor metadata beforehand, linked to automated data ingestion; to ensure and monitor the data flow; and to process, analyze, and publish data, turning uniquely gained Arctic data into scientific outcomes. The framework further allows for the integration of data obtained with discrete sampling devices into the data flow.
    These requirements led us to adapt the generic and cost-effective O2A framework to enable, control, and access the flow of sensor observations to archives in a cloud-like infrastructure on board Polarstern, and later on to land-based repositories for international availability. Major challenges for the MOSAiC-O2A data flow are (i) the increasing number and complexity of research platforms, devices, and sensors, (ii) the heterogeneous, interdisciplinary requirements towards, e.g., satellite data, sensor monitoring, in situ sample collection, quality assessment and control, processing, analysis, and visualization, and (iii) the demand for near-real-time analyses on board as well as on land with limited satellite bandwidth. The key modules of the O2A digital research infrastructure established by AWI implement the FAIR principles:
    - SENSORWeb: register sensor applications and sampling devices and capture controlled metadata before and alongside any measurements in the field.
    - Data ingest: allow researchers to feed data into storage systems and processing pipelines in a prepared and documented way, at best in controlled near-real-time data streams.
    - Dashboards: allow researchers to find and access data and to share and collaborate among partners.
    - Workspace: enable researchers to access and use data with research software on a cloud-based virtualized infrastructure that allows massive amounts of data to be analyzed on the spot.
    - Archiving and publishing: publish data via repositories and Digital Object Identifiers (DOIs).
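    The register-first, ingest-second discipline described above can be sketched in a few lines of Python. This is a hypothetical miniature, not the AWI SENSORWeb or ingest API (which are web services): it only illustrates the invariant that an observation is accepted into the stream when, and only when, its sensor metadata was registered beforehand.

```python
"""Minimal sketch of an O2A-style observation pipeline (hypothetical API)."""
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class SensorRegistry:
    """SENSORWeb-like step: register sensor metadata before any measurement."""
    sensors: dict = field(default_factory=dict)

    def register(self, sensor_id, platform, parameter, unit):
        self.sensors[sensor_id] = {
            "platform": platform, "parameter": parameter, "unit": unit,
        }


@dataclass
class IngestService:
    """Ingest step: accept only observations from registered sensors."""
    registry: SensorRegistry
    archive: list = field(default_factory=list)

    def ingest(self, sensor_id, value, timestamp=None):
        if sensor_id not in self.registry.sensors:
            raise KeyError(f"unregistered sensor: {sensor_id}")
        # Merge the controlled metadata into the archived record.
        record = {
            "sensor": sensor_id,
            "value": value,
            "time": timestamp or datetime.now(timezone.utc).isoformat(),
            **self.registry.sensors[sensor_id],
        }
        self.archive.append(record)  # stand-in for the near-real-time stream
        return record


registry = SensorRegistry()
registry.register("ps_met_01", "Polarstern", "air_temperature", "degC")
svc = IngestService(registry)
rec = svc.ingest("ps_met_01", -28.4)
print(rec["parameter"])  # air_temperature
```

    The design choice mirrored here is that metadata capture is decoupled from measurement: downstream dashboards and archives can trust every record to carry its platform, parameter, and unit.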

    Cleaner burning aviation fuels can reduce contrail cloudiness

    Contrail cirrus account for the major share of aviation's climate impact. Yet the links between jet fuel composition, contrail microphysics, and climate impact remain unresolved. Here we present unique observations from two DLR-NASA aircraft campaigns that measured exhaust and contrail characteristics of an Airbus A320 burning either standard jet fuels or low-aromatic sustainable aviation fuel blends. Our results show that soot particles can regulate the number of contrail cirrus ice crystals at current emission levels. We provide experimental evidence that burning low-aromatic sustainable aviation fuel can result in a 50 to 70% reduction in soot and ice number concentrations and an increase in ice crystal size. Reduced contrail ice numbers cause less energy deposition in the atmosphere and less warming. Meaningful reductions in aviation's climate impact could therefore be obtained from the widespread adoption of low-aromatic fuels, and from regulations to lower the maximum aromatic fuel content.

    A stepwise approach to integrate climate data analysis workflows into e-science infrastructures

    Large-scale climate data analysis cannot be confined to a single data archive. Rather, linking many different data archives delivers new insights into the fundamental processes of climate science. Infrastructures have therefore emerged that support the development of distributed climate data analysis workflows by providing integrated data and workflow management middleware. Two examples of this development are the ESGF infrastructure and the C3Grid. The ESGF data federation is based on an international effort to establish a distributed archive serving climate model data from large climate model intercomparison projects such as CMIP5. C3Grid was developed as a national initiative for a climate data processing and data access infrastructure in Germany, integrating national climate data centers with compute providers. While no support for processing workflows is currently integrated into the ESGF infrastructure (some development efforts are under way), the C3Grid offers several processing features with different diagnostic workflows. Experience in this context shows the complexity and fragility of efforts to provide operational workflows relying on a distributed e-science infrastructure composed of middleware services interacting with distributed compute and storage resources. Based on this experience we present a generic workflow integration process. The basic idea is to modularize the necessary steps for integrating workflows into the C3Grid e-science infrastructure. The presented workflows also use data sources integrated in the worldwide ESGF data federation.
    In a nutshell, the process is structured as follows:
    1) Local development and testing of data analysis components/workflows with example data.
    2) Exposition of the workflows as a processing web service based on the OGC WPS standard, using locally hosted data.
    3) Generalization of the workflows by explicit inclusion of data staging tasks (thus extension to externally hosted data), alongside persistent data identification (e.g. for provenance tracking of generated data products).
    4) Development of a workflow parametrization GUI.
    5) Integration of workflows and GUI into the C3Grid infrastructure.
    The key advantages of this process are:
    - early access to and testing of newly developed functionality
    - ease of access and reuse of analysis components (with a restricted set of data sources) independent of the C3Grid middleware, e.g. for testing and research
    - additional integration effort is spent only on well-tested workflows that have reached production status
    - clear interaction points between domain scientists ("workflow developers") and e-science experts ("infrastructure developers")
    The approach is illustrated by workflows developed by the Climate Service Center (CSC).
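    The stepwise idea, a locally tested analysis component that is later wrapped with explicit data staging and persistent identification, can be sketched as follows. All names here (the anomaly function, the staging stub, the handle-style PID) are illustrative assumptions, not C3Grid or ESGF APIs; a production version would replace the stub with calls to the actual staging service.

```python
"""Sketch of workflow modularization: steps 1 and 3 of the process above."""
import hashlib
from statistics import mean


def temperature_anomaly(series, baseline):
    """Step 1: the core analysis component, testable on local example data."""
    ref = mean(baseline)
    return [t - ref for t in series]


def stage_data(source_uri):
    """Step 3: an explicit, swappable data staging task (stub with example
    data; in production this would fetch from an ESGF node)."""
    return {"series": [14.1, 14.6, 15.0], "baseline": [13.8, 14.0, 14.2]}


def run_workflow(source_uri):
    """Wrap staging + analysis and attach a persistent ID for provenance.
    The hdl:-style PID scheme here is purely illustrative."""
    data = stage_data(source_uri)
    result = temperature_anomaly(data["series"], data["baseline"])
    pid = "hdl:example/" + hashlib.sha1(repr(result).encode()).hexdigest()[:8]
    return {"pid": pid, "anomaly": result}


out = run_workflow("esgf://example/cmip5/tas")
print(len(out["anomaly"]))  # 3
```

    Keeping the analysis function free of staging and middleware concerns is what allows the same component to be tested locally (step 1), exposed as a WPS process (step 2), and only then integrated into the full infrastructure (step 5).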

    Fixed-order and merged parton-shower predictions for WW and WWj production at the LHC including NLO QCD and EW corrections

    First, we present a combined analysis of pp → Ό+ΜΌe−Μ̄e and pp → Ό+ΜΌe−Μ̄e j at next-to-leading order, including both QCD and electroweak corrections. Second, we provide all-order predictions for pp → Ό+ΜΌe−Μ̄e + jets using merged parton-shower simulations that also include approximate EW effects. A fully inclusive sample for WW production is compared to the fixed-order computations for exclusive zero- and one-jet selections. The various higher-order effects are studied in detail at the level of cross sections and differential distributions for realistic experimental set-ups. Our study confirms that merged predictions are significantly more stable than the fixed-order ones, in particular regarding ratios between the two processes.

    Synchrotron microbeam irradiation induces neutrophil infiltration, thrombocyte attachment and selective vascular damage in vivo.

    Our goal was to visualize the vascular damage and acute inflammatory response to micro- and minibeam irradiation in vivo. Microbeam (MRT) and minibeam (MBRT) radiation therapies are tumor treatment approaches of potential clinical relevance, both consisting of parallel X-ray beams and allowing the delivery of thousands of grays within tumors. We compared the effects of microbeams (25-100 Όm wide) and minibeams (200-800 Όm wide) on vasculature, inflammation, and surrounding tissue changes during zebrafish caudal fin regeneration in vivo. Microbeam irradiation triggered an acute inflammatory response restricted to the regenerating tissue. Six hours post irradiation (6 hpi), it was infiltrated by neutrophils, and fli1a(+) thrombocytes adhered to the vessel wall locally in the beam path. The mature tissue was not affected by microbeam irradiation. In contrast, minibeam irradiation efficiently damaged the immature tissue at 6 hpi and damaged both the mature and immature tissue at 48 hpi. We demonstrate that vascular damage, inflammatory processes, and cellular toxicity depend on the beam width and the stage of tissue maturation. Minibeam irradiation did not differentiate between mature and immature tissue. In contrast, all irradiation-induced effects of the microbeams were restricted to the rapidly growing immature tissue, indicating that microbeam irradiation could be a promising tumor treatment tool.