
    The prophylactic role of intravenous and long-term oral acyclovir after allogeneic bone marrow transplantation.

    Eighty-two patients were randomly allocated to receive intravenous acyclovir 5 mg kg⁻¹ t.d.s. for 23 days followed by oral acyclovir 800 mg 6-hourly for 6 months, or matching placebos, after allogeneic bone marrow transplantation. Herpes simplex and varicella zoster virus infections were significantly reduced during the period of administration of acyclovir. No reduction in cytomegalovirus infection was demonstrated. The drug was not toxic.

    Dynamics of power in contemporary media policy-making

    Despite the growing interest in the organization and regulation of media industries, there is relatively little public discussion of the material processes through which media policy is developed. At a time of considerable change in the global media environment, new actors and new paradigms are emerging that are set to shift the balance of power between public and private interests in the policy-making process. This article focuses on some core challenges to the pluralist conception of public policy-making that still dominates today and considers whether key aspects of UK and American media policy-making can be said to be competitive, accessible, transparent or rational. Based on interviews with a wide range of ‘stakeholders’, the article assesses the power dynamics that underlie media policy-making and argues that the process is skewed by the taken-for-granted domination of market ideology.

    Towards a fair comparison of statistical and dynamical downscaling in the framework of the EURO-CORDEX initiative

    Both statistical and dynamical downscaling methods are well-established techniques to bridge the gap between the coarse information produced by global circulation models and the regional-to-local scales required by the climate change Impacts, Adaptation, and Vulnerability (IAV) communities. A number of studies have analyzed the relative merits of each technique by inter-comparing their performance in reproducing the observed climate, as given by a number of climatic indices (e.g. mean values, percentiles, spells). However, in this paper we stress that fair comparisons should be based on indices that are not affected by the calibration towards the observed climate used for some of the methods. We focus on precipitation (over continental Spain) and consider the output of eight Regional Climate Models (RCMs) from the EURO-CORDEX initiative at 0.44° resolution and five Statistical Downscaling Methods (SDMs) (analog resampling, weather typing and generalized linear models) trained using the Spain044 observational gridded dataset on exactly the same RCM grid. The performance of these models is inter-compared in terms of several standard indices (mean precipitation, 90th percentile on wet days, maximum precipitation amount and maximum number of consecutive dry days), taking into account the parameters involved in the SDM training phase. It is shown that not only the directly affected indices should be carefully analyzed, but also those indirectly influenced (e.g. percentile-based indices for precipitation), which are more difficult to identify. We also analyze how simple transformations (e.g. linear scaling) could be applied to the outputs of the uncalibrated methods in order to put SDMs and RCMs on an equal footing, and thus perform a fairer comparison.

    We acknowledge the World Climate Research Programme’s Working Group on Regional Climate, and the Working Group on Coupled Modelling, former coordinating body of CORDEX and responsible panel for CMIP5. We also thank the climate modeling groups (listed in Table 1 of this paper) for producing and making available their model output. We also acknowledge the Earth System Grid Federation infrastructure and AEMET and the University of Cantabria for the Spain02 dataset (available at http://www.meteo.unican.es/en/datasets/spain02). All the statistical downscaling experiments have been computed using the MeteoLab software (http://www.meteo.unican.es/software/meteolab), which is an open-source Matlab toolbox for statistical downscaling. This work has been partially supported by the CORWES (CGL2010-22158-C02) and EXTREMBLES (CGL2010-21869) projects funded by the Spanish R&D programme. AC thanks the Spanish Ministry of Economy and Competitiveness for the funding provided within the FPI programme (BES-2011-047612 and EEBB-I-13-06354), JMG acknowledges the support from the SPECS project (FP7-ENV-2012-308378) and JF is grateful to the EUPORIAS project (FP7-ENV-2012-308291). We also thank three anonymous referees for their useful comments that helped to improve the original manuscript.
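    For concreteness, the small Python sketch below illustrates the kind of indices the abstract refers to (mean precipitation, 90th percentile on wet days, maximum daily amount, maximum number of consecutive dry days) and the simple linear-scaling transformation mentioned for putting SDMs and RCMs on an equal footing. This is not the MeteoLab code used in the study; the 1 mm/day wet-day threshold and the array-based interface are assumptions made for illustration.

    ```python
    # Illustrative sketch only; the study used the MeteoLab toolbox, not this code.
    # Assumes daily precipitation series as 1-D arrays in mm/day and a 1 mm/day
    # wet-day threshold (a common convention, assumed here).
    import numpy as np

    WET_DAY_THRESHOLD = 1.0  # mm/day

    def precipitation_indices(pr):
        """Mean precipitation, 90th percentile on wet days, maximum daily amount
        and maximum number of consecutive dry days (CDD)."""
        pr = np.asarray(pr, dtype=float)
        wet = pr[pr >= WET_DAY_THRESHOLD]
        cdd = run = 0
        for is_dry in pr < WET_DAY_THRESHOLD:   # longest run of dry days
            run = run + 1 if is_dry else 0
            cdd = max(cdd, run)
        return {
            "mean": float(pr.mean()),
            "p90_wet": float(np.percentile(wet, 90)) if wet.size else float("nan"),
            "max": float(pr.max()),
            "cdd": cdd,
        }

    def linear_scaling(model_pr, obs_pr):
        """Rescale model precipitation so its mean matches the observed mean --
        the simple 'linear scaling' transformation mentioned in the abstract."""
        model_pr = np.asarray(model_pr, dtype=float)
        return model_pr * (np.mean(obs_pr) / np.mean(model_pr))
    ```

    Under this kind of scaling the mean is matched to the observations by construction, while percentile-based indices are only indirectly affected, which is the distinction between directly and indirectly calibrated indices that the paper highlights.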

    Natural hazards in Australia: heatwaves

    As part of a special issue on natural hazards, this paper reviews the current state of scientific knowledge of Australian heatwaves. Over recent years, progress has been made in understanding both the causes of and changes to heatwaves. Relationships between atmospheric heatwaves and large-scale and synoptic variability have been identified, with increasing trends in heatwave intensity, frequency and duration projected to continue throughout the 21st century. However, more research is required to further our understanding of the dynamical interactions of atmospheric heatwaves, particularly with the land surface. Research into marine heatwaves is still in its infancy, with little known about driving mechanisms, and observed and future changes. In order to address these knowledge gaps, recommendations include: focusing on a comprehensive assessment of atmospheric heatwave dynamics; understanding links with droughts; working towards a unified measurement framework; and investigating observed and future trends in marine heatwaves. Such work requires comprehensive and long-term collaboration activities. However, benefits will extend to the international community, thus addressing global grand challenges surrounding these extreme events.
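    As an illustration of the measurement question raised above, the sketch below implements one common (but not universal) atmospheric heatwave definition: at least three consecutive days with daily maximum temperature above a chosen threshold, such as a calendar-day 90th percentile. The definition, threshold and function names are assumptions for illustration, not the unified framework the paper calls for.

    ```python
    # Illustrative sketch of one common heatwave definition: runs of at least
    # `min_duration` consecutive days with tmax above a threshold. The threshold
    # (e.g. a calendar-day 90th percentile) is supplied by the caller; nothing
    # here reproduces the specific metrics recommended in the paper.
    import numpy as np

    def heatwave_events(tmax, threshold, min_duration=3):
        """Return (frequency, durations, mean exceedances) for hot spells."""
        tmax = np.asarray(tmax, dtype=float)
        thr = np.broadcast_to(np.asarray(threshold, dtype=float), tmax.shape)
        hot = tmax > thr
        events, start = [], None
        for i, is_hot in enumerate(hot):
            if is_hot and start is None:
                start = i                      # a hot spell begins
            elif not is_hot and start is not None:
                if i - start >= min_duration:  # keep spells long enough to count
                    events.append((start, i))
                start = None
        if start is not None and len(hot) - start >= min_duration:
            events.append((start, len(hot)))   # spell running to the end of the record
        durations = [end - s for s, end in events]
        intensities = [float(np.mean(tmax[s:end] - thr[s:end])) for s, end in events]
        return len(events), durations, intensities
    ```

    Frequency, duration and intensity of events computed this way are exactly the three quantities for which the review reports projected increases over the 21st century.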

    Analogue switch-off: Multi-channel viewing by 'the reluctant 50%'

    The paper is concerned with the switch-off of analogue television and explores key issues about technology, audiences and communications policy. The main argument is that the characteristics of the 'second 50%' are very different from those of the first half of households that have chosen to adopt digital, and that concerns are as much about content as cost. The paper reports a small, largely qualitative, study of households where analogue television has been switched off – the only place in the UK where this has happened, as a Department for Culture, Media and Sport (DCMS) trial. Qualitative data on the transformation of audience behaviour with the arrival of digital is contextualised by an analysis of British government policy on analogue switch-off. The paper compares and contrasts the discourse of digital TV with viewing expectations and experiences. It reflects on choice, viewing behaviour and the shaping of technology and raises critical questions about government policies on analogue switch-off.

    The Influence of Software Process Maturity and Customer Error Reporting on Software Release and Pricing

    Software producers are making greater use of customer error reporting to discover defects and improve the quality of their products. We study how software development differences among producers (e.g., varying levels of process maturity) and software class and functionality differences (e.g., operating system versus productivity software) affect how these producers coordinate software release timing and pricing to optimally harness error reporting contributions from users. In settings where prices are fixed, we characterize the optimal release time and demonstrate why in some cases it can actually be preferable to delay release when customer error reporting rates increase. The manner in which a firm's optimal release time responds to increases in software functionality critically hinges on whether the added functionality enhances or dilutes user error reporting; in both cases, the effect of added functionality on release timing can go in either direction, depending on both firm and product market characteristics. For example, when processing costs are relatively large compared with goodwill costs, firms with lower process maturity will release earlier when per-module error reporting contributions become diluted and release later when these contributions become enhanced. We also examine how a firm adapts price with changes in error reporting levels and software functionality, and finally, we provide implications of how beta testing influences release timing. This paper was accepted by Lorin Hitt, information systems.
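    To make the underlying trade-off concrete, the toy sketch below minimizes a hypothetical release-time cost in which residual defects (found by customers after release, incurring goodwill and error-report processing costs) decline with additional in-house testing, while delaying release carries its own cost. The functional forms, parameter values and names are invented for illustration and are not the model analyzed in the paper.

    ```python
    # Toy illustration of the release-timing trade-off described in the abstract:
    # releasing earlier leaves more defects to be found by customers (goodwill plus
    # error-report processing costs), while releasing later delays revenue.
    # All forms and parameters are hypothetical, not the paper's model.
    import numpy as np

    def total_cost(t_release, defects0=100.0, fix_rate=0.05,
                   goodwill=2.0, processing=0.5, delay_cost=4.0):
        """Expected cost of releasing at time t_release (arbitrary units)."""
        residual = defects0 * np.exp(-fix_rate * t_release)  # defects left at release
        return (goodwill + processing) * residual + delay_cost * t_release

    # Crude grid search for the cost-minimizing release time under these assumptions.
    times = np.linspace(0, 100, 1001)
    best = times[np.argmin([total_cost(t) for t in times])]
    print(f"illustrative optimal release time: {best:.1f}")
    ```

    In this toy setting, raising the per-defect processing or goodwill cost pushes the optimal release later, while raising the delay cost pulls it earlier; the paper's contribution is to characterize when each effect dominates for firms of different process maturity.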