
    An integrated crisis communication framework for strategic crisis communication with the media: A case study on a financial services provider

    Organisations operating in an ever-changing business environment need sound crisis communication and management practices in place to survive. Despite this, organisational crises are often managed inefficiently, which can be ascribed to a failure to manage crises strategically (Kash & Darling 1998:180). This article explores the lack of strategic crisis communication processes for ensuring effective crisis communication with the media as a stakeholder group. It is argued that the media is one of the main influences on public opinion (Pollard & Hotho 2006:725), emphasising the need for the accurate distribution of information. The study focuses specifically on the financial industry, which is believed to be more sensitive, and thus more exposed, to media reporting because financial services providers manage people's money (Squier 2009). A strategic crisis communication process with the media is therefore proposed, facilitated through an integrated crisis communication framework. The framework combines Integrated Communication (IC) literature, with emphasis on Grunig's theory of communication excellence, to build sustainable media relationships through two-way communication, and proposes a crisis communication process with proactive, reactive and post-evaluative stages, thereby moving away from crisis communication as a predominantly reactive function.

    Global carbon budget 2013

    Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates, consistency within and among components, as well as methodology and data limitations. CO2 emissions from fossil-fuel combustion and cement production (EFF) are based on energy statistics, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated for the first time in this budget with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models forced by observed climate, CO2 and land-cover change (some including nitrogen–carbon interactions). All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. For the last decade available (2003–2012), EFF was 8.6 ± 0.4 GtC yr⁻¹, ELUC 0.9 ± 0.5 GtC yr⁻¹, GATM 4.3 ± 0.1 GtC yr⁻¹, SOCEAN 2.5 ± 0.5 GtC yr⁻¹, and SLAND 2.8 ± 0.8 GtC yr⁻¹. For the year 2012 alone, EFF grew to 9.7 ± 0.5 GtC yr⁻¹, 2.2% above 2011, reflecting a continued growth trend in these emissions; GATM was 5.1 ± 0.2 GtC yr⁻¹, SOCEAN was 2.9 ± 0.5 GtC yr⁻¹, and, assuming an ELUC of 1.0 ± 0.5 GtC yr⁻¹ (based on the 2001–2010 average), SLAND was 2.7 ± 0.9 GtC yr⁻¹. GATM was high in 2012 compared to the 2003–2012 average, almost entirely reflecting the high EFF. The global atmospheric CO2 concentration reached 392.52 ± 0.10 ppm averaged over 2012. We estimate that EFF will increase by 2.1% (1.1–3.1%) to 9.9 ± 0.5 GtC in 2013, 61% above emissions in 1990, based on projections of world gross domestic product and recent changes in the carbon intensity of the economy. With this projection, cumulative emissions of CO2 will reach about 535 ± 55 GtC for 1870–2013, about 70% from EFF (390 ± 20 GtC) and 30% from ELUC (145 ± 50 GtC).
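    The budget closes as EFF + ELUC = GATM + SOCEAN + SLAND, so SLAND is obtained by difference. Below is a minimal Python sketch of that bookkeeping for the 2012 figures quoted above; combining the component uncertainties in quadrature is a simplifying assumption made here for illustration, not the budget's own uncertainty treatment.

        import math

        # 2012 values from the abstract, in GtC yr⁻¹, as (value, ±1σ) pairs
        E_FF    = (9.7, 0.5)   # fossil-fuel and cement emissions
        E_LUC   = (1.0, 0.5)   # land-use change emissions (2001–2010 average)
        G_ATM   = (5.1, 0.2)   # atmospheric growth rate
        S_OCEAN = (2.9, 0.5)   # ocean sink

        # Residual land sink: S_LAND = E_FF + E_LUC - G_ATM - S_OCEAN
        s_land = E_FF[0] + E_LUC[0] - G_ATM[0] - S_OCEAN[0]

        # Quadrature sum assumes independent component errors (illustrative only)
        sigma = math.sqrt(sum(u ** 2 for _, u in (E_FF, E_LUC, G_ATM, S_OCEAN)))

        print(f"S_LAND = {s_land:.1f} ± {sigma:.1f} GtC yr⁻¹")  # 2.7 ± 0.9 GtC yr⁻¹

    Reassuringly, the difference reproduces the SLAND value reported above for 2012.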

    Volume I. Introduction to DUNE

    The preponderance of matter over antimatter in the early universe, the dynamics of the supernovae that produced the heavy elements necessary for life, and whether protons eventually decay—these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our universe, its current state, and its eventual fate. The Deep Underground Neutrino Experiment (DUNE) is an international world-class experiment dedicated to addressing these questions as it searches for leptonic charge-parity symmetry violation, stands ready to capture supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the standard model. The DUNE far detector technical design report (TDR) describes the DUNE physics program and the technical designs of the single- and dual-phase DUNE liquid argon TPC far detector modules. This TDR is intended to justify the technical choices for the far detector that flow down from the high-level physics goals through requirements at all levels of the Project. Volume I contains an executive summary that introduces the DUNE science program, the far detector and the strategy for its modular designs, and the organization and management of the Project. The remainder of Volume I provides more detail on the science program that drives the choice of detector technologies and on the technologies themselves. It also introduces the designs for the DUNE near detector and the DUNE computing model, for which DUNE is planning design reports. Volume II of this TDR describes DUNE's physics program in detail. Volume III describes the technical coordination required for the far detector design, construction, installation, and integration, and its organizational structure. Volume IV describes the single-phase far detector technology. A planned Volume V will describe the dual-phase technology.

    The DZHK research platform: maximisation of scientific value by enabling access to health data and biological samples collected in cardiovascular clinical studies

    The German Centre for Cardiovascular Research (DZHK) is one of the German Centres for Health Research and aims to conduct early and guideline-relevant clinical studies to develop new therapies and diagnostics that impact the lives of people with cardiovascular disease. To this end, DZHK members designed a collaboratively organised and integrated research platform connecting all sites and partners. The overarching objectives of the research platform are the standardisation of prospective data and biological sample collections across all studies and the development of sustainable, centrally standardised storage in compliance with general legal regulations and the FAIR principles. The main elements of the DZHK infrastructure are web-based central units for data management, LIMS, IDMS, and a transfer office, embedded in a framework consisting of the DZHK Use and Access Policy and the Ethics and Data Protection Concept. This framework is characterised by a modular design that allows a high degree of standardisation across all studies; for studies that require even tighter criteria, additional quality levels are defined. The Public Open Data strategy is a further important focus of the DZHK. The DZHK operates as one legal entity holding all rights of data and biological sample usage, according to the DZHK Use and Access Policy. All DZHK studies collect a basic set of data and biosamples, accompanied by study-specific clinical and imaging data and biobanking. The DZHK infrastructure was built by scientists with a focus on the needs of scientists conducting clinical studies, enabling the interdisciplinary and multiple use of data and biological samples by researchers inside and outside the DZHK. So far, 27 DZHK studies have recruited more than 11,200 participants suffering from major cardiovascular disorders such as myocardial infarction or heart failure. Currently, researchers can apply for data and samples from five DZHK studies via the DZHK Heart Bank.
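    As an illustration of what such a standardised basic data set might look like (purely hypothetical field names; this is not the actual DZHK schema), a per-participant record linking an IDMS pseudonym to LIMS sample identifiers could be sketched in Python as follows.

        from dataclasses import dataclass, field
        from datetime import date

        # Hypothetical "basic set" record shared by all studies; the field
        # names are illustrative assumptions, not the DZHK data model.
        @dataclass
        class BasicDataSetRecord:
            study_id: str                 # study identifier, e.g. an acronym
            pseudonym: str                # participant pseudonym from the IDMS
            enrolment_date: date
            diagnosis: str                # e.g. "myocardial infarction"
            biosample_ids: list[str] = field(default_factory=list)  # LIMS IDs

        record = BasicDataSetRecord(
            study_id="EXAMPLE-STUDY",
            pseudonym="PSN-000123",
            enrolment_date=date(2020, 5, 14),
            diagnosis="heart failure",
            biosample_ids=["SMP-0001", "SMP-0002"],
        )

    Keeping the pseudonym (identity management) separate from the sample identifiers (laboratory information management) mirrors the platform's split into dedicated central units.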

    Deep Underground Neutrino Experiment (DUNE), far detector technical design report, volume III: DUNE far detector technical coordination

    The preponderance of matter over antimatter in the early universe, the dynamics of the supernovae that produced the heavy elements necessary for life, and whether protons eventually decay—these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our universe, its current state, and its eventual fate. The Deep Underground Neutrino Experiment (DUNE) is an international world-class experiment dedicated to addressing these questions as it searches for leptonic charge-parity symmetry violation, stands ready to capture supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the standard model. The DUNE far detector technical design report (TDR) describes the DUNE physics program and the technical designs of the single- and dual-phase DUNE liquid argon TPC far detector modules. Volume III of this TDR describes how the activities required to design, construct, fabricate, install, and commission the DUNE far detector modules are organized and managed. This volume details the organizational structures that will carry out and/or oversee the planned far detector activities safely, successfully, on time, and on budget. It presents overviews of the facilities, supporting infrastructure, and detectors for context, and it outlines the project-related functions and methodologies used by the DUNE technical coordination organization, focusing on the areas of integration engineering, technical reviews, quality assurance and control, and safety oversight. Because of its more advanced stage of development, functional examples presented in this volume focus primarily on the single-phase (SP) detector module.

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented as an end-to-end set of GPU-optimized algorithms, written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version: the simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
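    As a flavour of this approach (a minimal sketch with a made-up pixel-response formula, not the simulator's actual microphysics), a Numba CUDA kernel that assigns one GPU thread per pixel might look like the following.

        import numpy as np
        from numba import cuda

        @cuda.jit
        def induced_current(charge, distance, current):
            # One GPU thread per pixel; the 1/(1+d)^2 falloff is a
            # placeholder response model, chosen only for illustration.
            i = cuda.grid(1)
            if i < current.size:
                current[i] = charge[i] / (1.0 + distance[i]) ** 2

        n_pixels = 1000                          # ~10^3 pixels, as quoted above
        charge = np.random.rand(n_pixels).astype(np.float32)
        distance = np.random.rand(n_pixels).astype(np.float32)
        current = np.zeros(n_pixels, dtype=np.float32)

        threads_per_block = 128
        blocks = (n_pixels + threads_per_block - 1) // threads_per_block
        induced_current[blocks, threads_per_block](charge, distance, current)

    Because each pixel is independent, the kernel can keep the GPU busy with one thread per channel, which is the kind of parallelism behind the large speed-up the abstract reports.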