
    A Model of Regulatory Burden in Technology Diffusion: The Case of Plant-Derived Vaccines.

    Plant-derived vaccines may soon displace conventional vaccines. Assuming there are no major technological barriers undermining the feasibility of this innovative technology, it is worthwhile to build quantitative models of the regulatory burden of producing and diffusing plant-derived vaccines in industrialized and developing countries. A dynamic simulation model of technology diffusion, together with the data to populate it, has been developed for studying regulatory barriers in the diffusion of plant-derived vaccines. The role of regulatory burden is evaluated for a variety of scenarios in which plant-derived vaccines are produced and diffused. The model relates the innovative and conventional vaccine technologies and captures the effects of uptake of the innovative technology on mortality and morbidity. This case study demonstrates how dynamic simulation models can be used to assess the long-term potential impact of novel technologies in terms of a variety of socio-economic indicators.
    Keywords: dynamic simulation model; plant-derived vaccines; regulatory burden; technology transfer; vaccines
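
    The abstract does not give the model's equations; the following is a minimal, purely illustrative sketch of a dynamic diffusion simulation with a regulatory-burden parameter, in the spirit of the model described. The Bass-style adoption dynamics and every parameter name and value are assumptions made here, not details taken from the paper.

```python
import numpy as np

def simulate_diffusion(years=30, p=0.01, q=0.35, approval_lag=8, burden=0.5):
    """Toy Bass-style diffusion of an innovative vaccine technology.

    p            -- coefficient of innovation (external influence)
    q            -- coefficient of imitation (word-of-mouth)
    approval_lag -- years before regulators allow any uptake
    burden       -- fractional slow-down of adoption due to regulatory cost
    All parameters are illustrative, not taken from the paper.
    """
    share = np.zeros(years)          # market share of plant-derived vaccines
    for t in range(1, years):
        if t < approval_lag:         # no diffusion before regulatory approval
            continue
        F = share[t - 1]
        # Bass hazard, damped by the regulatory-burden factor
        dF = (1.0 - burden) * (p + q * F) * (1.0 - F)
        share[t] = min(1.0, F + dF)
    return share

# Compare a low-burden and a high-burden regulatory scenario
low = simulate_diffusion(burden=0.2)
high = simulate_diffusion(burden=0.6)
print(f"share after 30 years: low burden {low[-1]:.2f}, high burden {high[-1]:.2f}")
```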

    Statistical Properties of DLAs and sub-DLAs

    Quasar absorbers provide a powerful observational tool with which to probe both galaxies and the intergalactic medium up to high redshift. We present a study of the evolution of the column density distribution, f(N,z), and of the total neutral hydrogen mass in high-column-density quasar absorbers, using data from a recent high-redshift survey for damped Lyman-alpha (DLA) and Lyman limit system (LLS) absorbers. Whilst in the redshift range 2 to 3.5 about 90% of the neutral HI mass is in DLAs, we find that at z>3.5 this fraction drops to only 55%, and that the remaining 'missing' mass fraction of the neutral gas lies in sub-DLAs with N(HI) in the range 10^{19} to 2 x 10^{20} cm^{-2}.
    Comment: 6 pages, 4 figures, in "Chemical Enrichment of Intracluster and Intergalactic Medium", Proceedings of the Vulcano Workshop, May 14-18, 200
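
    For reference, the mass fractions quoted above follow from the first moment of the column density distribution. The expressions below are the standard definitions (with X the absorption distance and N_min the survey's lower column-density limit, here taken to be the sub-DLA threshold of ~10^{19} cm^{-2} as an assumption); they are not equations reproduced from this contribution:

```latex
\Omega_{\rm HI}(z) \;\propto\; \int_{N_{\rm min}}^{\infty} N\, f(N,X)\, \mathrm{d}N ,
\qquad
\frac{M_{\rm HI}^{\rm DLA}}{M_{\rm HI}^{\rm total}} \;=\;
  \frac{\int_{2\times 10^{20}}^{\infty} N\, f(N,X)\, \mathrm{d}N}
       {\int_{N_{\rm min}}^{\infty} N\, f(N,X)\, \mathrm{d}N},
\qquad N \text{ in } \mathrm{cm}^{-2}.
```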

    Pathways toward Zero-Carbon Electricity Required for Climate Stabilization

    World Bank Policy Research Working Paper 7075
    This paper covers three policy-relevant aspects of the carbon content of electricity that are well established among integrated assessment models but under-discussed in the policy debate. First, climate stabilization at any level from 2°C to 3°C requires electricity to be almost carbon-free by the end of the century. As such, the question for policy makers is not whether to decarbonize electricity but when to do it. Second, decarbonization of electricity is still possible and required if some of the key zero-carbon technologies, such as nuclear power or carbon capture and storage, turn out to be unavailable. Third, progressive decarbonization of electricity is part of every country's cost-effective means of contributing to climate stabilization. In addition, this paper provides cost-effective pathways of the carbon content of electricity, computed from the results of AMPERE, a recent integrated assessment model comparison study. These pathways may be used to benchmark existing decarbonization targets, such as those set by the European Energy Roadmap or the Clean Power Plan in the United States, or to inform new policies in other countries. These pathways can also be used to assess the desirable uptake rates of electrification technologies, such as electric and plug-in hybrid vehicles, electric stoves and heat pumps, or industrial electric furnaces.
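
    As a concrete illustration of what benchmarking against such pathways involves, the snippet below computes the carbon content of a hypothetical generation mix in gCO2/kWh and compares it with a stylised benchmark. The emission factors, mix shares, and benchmark value are rough illustrative numbers, not figures from AMPERE or from the paper.

```python
# Illustrative only: carbon content of electricity for a generation mix.
EMISSION_FACTORS = {          # gCO2 per kWh generated (rough textbook values)
    "coal": 1000, "gas": 500, "nuclear": 0, "hydro": 0, "wind": 0, "solar": 0,
}
mix_2030 = {"coal": 0.20, "gas": 0.30, "nuclear": 0.20,
            "hydro": 0.15, "wind": 0.10, "solar": 0.05}

carbon_content = sum(share * EMISSION_FACTORS[tech] for tech, share in mix_2030.items())
benchmark_2030 = 250          # hypothetical pathway value for 2030, gCO2/kWh
print(f"mix carbon content: {carbon_content:.0f} gCO2/kWh "
      f"({'above' if carbon_content > benchmark_2030 else 'below'} the benchmark)")
```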

    Path-tracing Monte Carlo Library for 3D Radiative Transfer in Highly Resolved Cloudy Atmospheres

    Interactions between clouds and radiation are at the root of many difficulties in numerically predicting future weather and climate and in retrieving the state of the atmosphere from remote sensing observations. The large range of issues related to these interactions, and in particular to three-dimensional interactions, motivated the development of accurate radiative tools able to compute all types of radiative metrics, from monochromatic, local and directional observables to integrated energetic quantities. In the continuity of this community effort, we propose here an open-source library for general use in Monte Carlo algorithms. This library is devoted to the acceleration of path-tracing in complex data, typically high-resolution large-domain grounds and clouds. The main algorithmic advances embedded in the library are those related to the construction and traversal of hierarchical grids that accelerate the tracing of paths through heterogeneous fields in null-collision (maximum cross-section) algorithms. We show that with these hierarchical grids, the computing time is only weakly sensitive to the refinement of the volumetric data. The library is tested with a rendering algorithm that produces synthetic images of cloud radiances. Two other examples are given as illustrations: one analyses the transmission of solar radiation under a cloud, together with its sensitivity to an optical parameter, and the other assesses a parametrization of the 3D radiative effects of clouds.
    Comment: Submitted to JAMES, revised and submitted again (this is v2)
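
    The library itself is not reproduced here, but the following minimal sketch illustrates the null-collision (maximum cross-section) sampling step that hierarchical grids are designed to accelerate. It assumes a single global majorant k_max and hypothetical function names, so it stands in for the idea rather than for the library's actual API.

```python
import math
import random

def sample_collision(k_field, k_max, x, direction):
    """Null-collision (maximum cross-section) sampling of the next real collision.

    k_field   -- callable giving the true extinction coefficient at a 3D position
    k_max     -- majorant, >= k_field everywhere; a single global value here,
                 whereas the library uses hierarchical grids for local majorants
    x         -- current position, list of 3 floats
    direction -- unit propagation direction, list of 3 floats
    """
    while True:
        # Tentative free path drawn in the homogeneous majorant medium
        step = -math.log(1.0 - random.random()) / k_max
        x = [xi + step * di for xi, di in zip(x, direction)]
        # Accept as a real collision with probability k(x)/k_max;
        # otherwise it is a null collision and the walk simply continues.
        if random.random() < k_field(x) / k_max:
            return x

def cloud(p):
    # Toy extinction field: optically thicker inside a slab between z = 0 and 1
    return 0.8 if 0.0 <= p[2] <= 1.0 else 0.05

hit = sample_collision(cloud, k_max=1.0, x=[0.0, 0.0, 0.0], direction=[0.0, 0.0, 1.0])
print("first real collision at z =", round(hit[2], 3))
```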

    Fully automated precision predictions for heavy neutrino production mechanisms at hadron colliders

    Motivated by TeV-scale neutrino mass models, we propose a systematic treatment of heavy neutrino (N) production at hadron colliders. Our simple and efficient modeling of the vector boson fusion (VBF) Wγ → Nl and Nl + nj signal definitions resolves collinear and soft divergences that have plagued past studies, and is applicable to other color-singlet processes, e.g., associated Higgs (Wh), sparticle (l~ ν~_l), and charged Higgs (h±h∓) production. We present, for the first time, a comparison of all leading N production modes, including both gluon fusion (GF) gg → Z/h → Nν_l (or ν̄_l) and VBF. We obtain fully differential results up to next-to-leading order (NLO) in QCD accuracy using a Monte Carlo tool chain linking FEYNRULES, NLOCT, and MADGRAPH5_AMC@NLO. Associated model files are publicly available. At the 14 TeV LHC, the leading-order GF rate is small and comparable to the NLO Nl + 1j rate; at a future 100 TeV Very Large Hadron Collider, GF dominates for mN = 300–1500 GeV, beyond which VBF takes the lead
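
    As an illustration of the kind of tool chain described, the sketch below drives MadGraph5_aMC@NLO from Python to generate a Drell-Yan-like Nl process at NLO in QCD. The model name SM_HeavyN_NLO, the particle label n1, and all paths are assumptions made for this sketch rather than details quoted from the paper.

```python
# Illustrative driver only: the paper states that the model files are public,
# but the model/particle names and directories below are assumed, not quoted.
import subprocess
import textwrap

mg5_script = textwrap.dedent("""\
    import model SM_HeavyN_NLO
    # Drell-Yan-like heavy neutrino + charged lepton production at NLO in QCD
    generate p p > n1 e+ [QCD]
    output heavyN_DY_nlo
    launch
""")

with open("run_heavyN.mg5", "w") as f:
    f.write(mg5_script)

# Assumes a local MadGraph5_aMC@NLO installation under ./MG5_aMC
subprocess.run(["./MG5_aMC/bin/mg5_aMC", "run_heavyN.mg5"], check=True)
```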

    On the precision of noise correlation interferometry.

    Long-duration, noisy-looking waveforms such as those obtained in randomly multiple-scattering and reverberant media are complex; they resist direct interpretation. Nevertheless, such waveforms are sensitive to small changes in the source of the waves or in the medium in which they propagate. Monitoring such waveforms, whether obtained directly or indirectly by noise correlation, is emerging as a technique for detecting changes in media. Interpretation of changes is in principle problematic; it is not always clear whether a change is due to sources or to the medium. Of particular interest is the detection of small changes in propagation speeds. An expression is derived here for the apparent, but illusory, waveform dilation due to a change of source. The expression permits changes in waveforms due to changes in wave speed to be distinguished with high precision from changes due to other reasons. The theory is successfully compared with analysis of a laboratory ultrasonic data set and a seismic data set from Parkfield, California.
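
    The derived expression itself is not given in the abstract; the sketch below only illustrates the standard 'stretching' measurement of apparent waveform dilation that such an expression helps interpret, using a synthetic trace and a hypothetical 0.1% dilation.

```python
import numpy as np

def apparent_dilation(ref, cur, t, epsilons):
    """Grid-search the dilation factor eps that best maps `cur` onto `ref`.

    In coda-wave monitoring, a dilation eps of the time axis is read as an
    apparent relative velocity change dv/v = -eps; the paper's expression
    quantifies when such a dilation is illusory (a source change rather
    than a medium change).
    """
    return max(epsilons,
               key=lambda eps: np.corrcoef(np.interp(t * (1.0 + eps), t, cur), ref)[0, 1])

# Synthetic example: a decaying coda-like trace and a copy dilated by 0.1 %
t = np.linspace(0.0, 10.0, 4001)
ref = np.sin(2.0 * np.pi * 5.0 * t) * np.exp(-0.3 * t)
cur = np.interp(t / 1.001, t, ref)            # travel times 0.1 % longer
eps_grid = np.linspace(-0.005, 0.005, 201)
eps_hat = apparent_dilation(ref, cur, t, eps_grid)
print(f"estimated dilation: {eps_hat:.4f}  (apparent dv/v = {-eps_hat:.4f})")
```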

    First Application of the Dakin-West Reaction to Fmoc Chemistry: Synthesis of the ketomethylene tripeptide Fmoc-Nα-Asp(tBu)-(R,S)Tyr(tBu)Ψ(CO-CH2)Gly-OH

    A practical synthesis of a tripeptide containing a ketomethylene isostere, suitably protected for introduction in Fmoc SPPS, has been carried out for the first time in Fmoc chemistry by using a modified Dakin-West reaction.
