10 research outputs found

    Framing Transparency in the U.S. – Cross-media Analysis of the Debate on WikiLeaks

    Some scholars have argued that the novelty of WikiLeaks for transparency lies in its use of new technologies, making it an example of the start of a new technological information era. Cull (2011), for instance, has argued that WikiLeaks exemplifies a “shift in power” made possible by “the technological revolution” that “has given one individual the communication power that was the monopoly of the nation state in the previous century” (pp. 2-3). As Bunz (2011) further outlines, WikiLeaks shows how information from one can be sent to many through the “digitalisation of knowledge” (pp. 139-140), whereby it has become easy to transport a great amount of information using minimal space. Furthermore, it demonstrates that the dissection of data has become easy, as programs help us order and analyse information. It is also an example of how the Internet has made it possible for anyone to publish and access information at any time (ibid.). In this chapter, we suggest that technology alone is insufficient to create different transparency standards and to change the way politics is conducted. In line with Florini (2002), we believe that “transparency is a choice, encouraged by changing attitudes about what constitutes appropriate behavior” (p. 13). Thus, new technology must be accompanied by a change in attitudes, as “without a norm of transparency, technology will continue to protect private information as well as ferret it out” (Florini, 2002, p. 15). Studying the debate triggered by WikiLeaks presents the opportunity to examine whether its revelations have strengthened transparency in public perception, or whether WikiLeaks is no more than an example of new technological means without any real impact on the discursive boundary between publicity and secrecy.

    Ensemble-based stochastic permeability and flow simulation of a sparsely sampled hard-rock aquifer supported by high performance computing

    Calibrating the heterogeneous permeability distribution of hard-rock aquifers based on sparse data is challenging but crucial for obtaining meaningful groundwater flow models. This study demonstrates the applicability of stochastic sampling of the prior permeability distribution and Metropolis sampling of the posterior permeability distribution using typical production data and measurements available in the context of groundwater abstraction. The case study is the Hastenrather Graben groundwater abstraction site near Aachen, Germany. A three-dimensional numerical flow model for the Carboniferous hard-rock aquifer is presented. Monte Carlo simulations are performed to generate 1,000 realizations of the heterogeneous hard-rock permeability field, applying Sequential Gaussian Simulation based on nine log-permeability values. Forward simulation of flow during a production test for each realization yields the prior ensemble of model states, which is verified against observation data from four wells. The computationally expensive ensemble simulations were performed in parallel with the simulation code SHEMAT-Suite on the high-performance computer JURECA. Applying a Metropolis sampler based on the misfit between drawdown simulations and observations results in a posterior ensemble comprising 251 realizations. The posterior mean log-permeability is −11.67 with an uncertainty of 0.83. The corresponding average posterior uncertainty of the drawdown simulation is 1.1 m. Even though some sources of uncertainty (e.g. scenario uncertainty) remain unquantified, this study is an important step towards a complete uncertainty quantification for a sparsely sampled hard-rock aquifer. Further, it provides a real-case application of stochastic hydrogeological approaches, demonstrating how to accomplish uncertainty quantification of subsurface flow models in practice.
    Funding: Helmholtz-Gemeinschaft, http://dx.doi.org/10.13039/50110000165
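
    The workflow described in this abstract (prior sampling, forward flow simulation, and a Metropolis-type acceptance step driven by the drawdown misfit) can be illustrated with a minimal sketch. The Python sketch below is an assumption-laden toy, not the SHEMAT-Suite/JURECA implementation: the prior sampler, the forward model and all numbers (observations, sigma) are invented placeholders standing in for Sequential Gaussian Simulation and the three-dimensional flow model.

    import numpy as np

    rng = np.random.default_rng(42)

    n_realizations = 1000      # size of the prior ensemble
    sigma = 1.0                # assumed observation-error standard deviation [m]

    # Hypothetical "observed" drawdown at the four observation wells [m]
    observations = np.array([2.3, 1.8, 2.9, 1.1])

    def sample_prior_log_permeability():
        # Stand-in for Sequential Gaussian Simulation conditioned on the nine
        # measured log-permeability values (here: a simple Gaussian prior).
        return rng.normal(loc=-11.5, scale=1.0)

    def simulate_drawdown(log_k):
        # Stand-in for the forward simulation of the production test; returns
        # drawdown at the four wells. Higher permeability -> smaller drawdown.
        return observations * np.exp(-0.5 * (log_k + 11.5))

    def misfit(simulated):
        # Sum-of-squares misfit between simulated and observed drawdown.
        return float(np.sum((simulated - observations) ** 2))

    # Metropolis acceptance over the prior ensemble: a candidate realization is
    # accepted with probability min(1, exp(-(phi_new - phi_last) / (2*sigma**2)))
    # relative to the last accepted realization.
    posterior, phi_last = [], None
    for _ in range(n_realizations):
        log_k = sample_prior_log_permeability()
        phi = misfit(simulate_drawdown(log_k))
        if phi_last is None or rng.random() < np.exp(-(phi - phi_last) / (2 * sigma**2)):
            posterior.append(log_k)
            phi_last = phi

    post = np.array(posterior)
    print(f"accepted {post.size} of {n_realizations} realizations")
    print(f"posterior mean log-permeability: {post.mean():.2f} +/- {post.std():.2f}")

    In the study itself, the forward simulations were run in parallel on JURECA and the acceptance step reduced the 1,000 prior realizations to a posterior ensemble of 251; the toy above merely mirrors that accept/reject structure.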

    Lung Nodules Missed in Initial Staging of Breast Cancer Patients in PET/MRI—Clinically Relevant?

    Purpose: To evaluate the clinical relevance of lung nodules missed at the initial staging of breast cancer patients in [18F]FDG-PET/MRI compared with CT. Methods: A total of 152 patients underwent an initial whole-body [18F]FDG-PET/MRI and a thoracoabdominal CT for staging. The presence, size, shape and location of each lung nodule in [18F]FDG-PET/MRI were noted. The reference standard was established by taking the initial CT and follow-up imaging into account (a two-step approach) to identify clinically relevant lung nodules. Patient-based and lesion-based data analyses were performed. Results: In the patient-based analysis, no patient with clinically relevant lung nodules was missed with MRI VIBE, while 1/84 female patients was missed with MRI HASTE (1%). The lesion-based analysis revealed 4/96 (4%, VIBE) and 8/138 (6%, HASTE) missed clinically relevant lung nodules. The average size of missed lung nodules was 3.2 mm ± 1.2 mm (VIBE) and 3.6 mm ± 1.4 mm (HASTE), and the predominant location was the left lower quadrant, close to the hilum. Conclusion: All patients with newly diagnosed breast cancer and clinically relevant lung nodules were detected at the initial [18F]FDG-PET/MRI staging. However, owing to the lower sensitivity in detecting lung nodules, a small proportion of clinically relevant lung nodules was missed. Thus, a supplemental low-dose chest CT after neoadjuvant therapy should be considered as a backup.
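
    The apparent tension between the patient-based result (no patients missed with VIBE) and the lesion-based result (4/96 nodules missed with VIBE) comes down to how detections are aggregated. A minimal sketch follows, assuming that a patient counts as missed only if none of that patient's clinically relevant nodules was detected; this aggregation rule is an assumption, and the records below are invented toy data, not study data.

    from collections import defaultdict

    # (patient_id, detected_on_PET_MRI) for clinically relevant nodules
    lesions = [
        ("P01", True), ("P01", False),   # one of two nodules missed
        ("P02", True),
        ("P03", False), ("P03", True),   # one of two nodules missed
    ]

    # Lesion-based analysis: count individual nodules that were missed
    missed_lesions = sum(1 for _, detected in lesions if not detected)
    print(f"lesion-based: {missed_lesions}/{len(lesions)} nodules missed")

    # Patient-based analysis: a patient is missed only if every one of their
    # clinically relevant nodules went undetected
    per_patient = defaultdict(list)
    for pid, detected in lesions:
        per_patient[pid].append(detected)
    missed_patients = sum(1 for flags in per_patient.values() if not any(flags))
    print(f"patient-based: {missed_patients}/{len(per_patient)} patients missed")

    With this toy data the lesion-based count is 2/5 missed while the patient-based count is 0/3, mirroring how a study can miss individual nodules yet still flag every affected patient.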