82 research outputs found
Steering the Climate System: Using Inertia to Lower the Cost of Policy
Conventional wisdom holds that the efficient way to limit warming to a chosen level is to price carbon emissions at a rate that increases exponentially. We show that this “Hotelling” tax on carbon emissions is actually inefficient. The least-cost policy path takes advantage of the climate system’s inertia by growing more slowly than exponentially. Carbon dioxide temporarily overshoots the steady-state level consistent with the temperature limit, and the efficient carbon tax follows an inverse-U-shaped path. Economic models that assume exponentially increasing carbon taxes are overestimating the minimum cost of limiting warming, overestimating the efficient near-term carbon tax, and overvaluing technologies that mature sooner.
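The overshoot mechanism can be illustrated with a toy carbon-stock model. This is a minimal sketch, not the paper's model: the single uniform decay rate, the pre-industrial baseline, and the emission path are all illustrative assumptions.

```python
# Toy illustration of the overshoot logic: a carbon stock with slow
# natural uptake can temporarily exceed its long-run level while
# emissions are phased out, then relax back toward it.
def simulate_stock(emissions, m0=400.0, m_pre=280.0, decay=0.01):
    """Evolve a CO2 concentration (ppm) under yearly emissions (ppm/yr).

    Uptake is proportional to the excess over the pre-industrial level
    (a one-box assumption made purely for illustration).
    """
    m = m0
    path = [m]
    for e in emissions:
        m = m + e - decay * (m - m_pre)
        path.append(m)
    return path

# Hypothetical policy: emissions decline linearly to zero over 50 years.
emissions = [2.0 * max(0.0, 1 - t / 50) for t in range(200)]
path = simulate_stock(emissions)
peak = max(path)
print(f"peak: {peak:.0f} ppm, year-200 level: {path[-1]:.0f} ppm")
```

The concentration rises above its eventual resting level before natural uptake pulls it back down, which is the inertia the least-cost tax path exploits.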
Practical guidelines for rigor and reproducibility in preclinical and clinical studies on cardioprotection
The potential for ischemic preconditioning to reduce infarct size was first recognized more than 30 years ago. Despite extension of the concept to ischemic postconditioning and remote ischemic conditioning, and literally thousands of experimental studies in various species and models which identified a multitude of signaling steps, so far there is only a single and very recent study which has unequivocally translated cardioprotection to improved clinical outcome as the primary endpoint in patients. Many potential reasons for this disappointing lack of clinical translation of cardioprotection have been proposed, including lack of rigor and reproducibility in preclinical studies, and poor design and conduct of clinical trials. There is, however, universal agreement that robust preclinical data are a mandatory prerequisite to initiate a meaningful clinical trial. In this context, it is disconcerting that the CAESAR consortium (Consortium for preclinicAl assESsment of cARdioprotective therapies), in a highly standardized multi-center approach of preclinical studies, identified only ischemic preconditioning, but not nitrite or sildenafil given as adjuncts to reperfusion, as reducing infarct size. However, ischemic preconditioning—due to its very nature—can only be used in elective interventions, and not in acute myocardial infarction. Therefore, better strategies to identify robust and reproducible strategies of cardioprotection, which can subsequently be tested in clinical trials, must be developed. We refer to the recent guidelines for experimental models of myocardial ischemia and infarction, and now aim to provide practical guidelines to ensure rigor and reproducibility in preclinical and clinical studies on cardioprotection. In line with the above guideline, we define rigor as standardized state-of-the-art design, conduct, and reporting of a study, which is then a prerequisite for reproducibility, i.e. replication of results by another laboratory when performing exactly the same experiment.
Broadband Multi-wavelength Properties of M87 during the 2017 Event Horizon Telescope Campaign
Abstract: In 2017, the Event Horizon Telescope (EHT) Collaboration succeeded in capturing the first direct image of the center of the M87 galaxy. The asymmetric ring morphology and size are consistent with theoretical expectations for a weakly accreting supermassive black hole of mass ∼6.5 × 10⁹ M⊙. The EHTC also partnered with several international facilities in space and on the ground to arrange an extensive, quasi-simultaneous multi-wavelength campaign. This Letter presents the results and analysis of this campaign, as well as the multi-wavelength data as a legacy data repository. We captured M87 in a historically low state, and the core flux dominates over HST-1 at high energies, making it possible to combine core flux constraints with the more spatially precise very long baseline interferometry data. We present the most complete simultaneous multi-wavelength spectrum of the active nucleus to date, and discuss the complexity and caveats of combining data from different spatial scales into one broadband spectrum. We apply two heuristic, isotropic leptonic single-zone models to provide insight into the basic source properties, but conclude that a structured jet is necessary to explain M87’s spectrum. We can exclude that the simultaneous γ-ray emission is produced via inverse Compton emission in the same region producing the EHT mm-band emission, and further conclude that the γ-rays can only be produced in the inner jets (inward of HST-1) if there are strongly particle-dominated regions. Direct synchrotron emission from accelerated protons and secondaries cannot yet be excluded.
The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance
INTRODUCTION
Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.
RATIONALE
We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).
RESULTS
Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants.
CONCLUSION
Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.
Escape from Third-Best: Rating Emissions for Intensity Standards
An increasingly common type of environmental policy instrument regulates the carbon intensity of transportation and electricity markets. In order to extend the policy's scope beyond point-of-use emissions, regulators assign each potential fuel an emission intensity rating for use in calculating compliance. I show that welfare-maximizing ratings do not generally coincide with the best estimates of actual emissions. In fact, the regulator can achieve a higher level of welfare by properly selecting the emission ratings than possible by selecting only the level of the standard. Moreover, a fuel's optimal rating can actually decrease when its estimated emission intensity increases. Numerical simulations of the California Low-Carbon Fuel Standard suggest that when recent scientific information increased the estimated emissions from conventional ethanol, regulators should have lowered ethanol's rating (making it appear less emission-intensive) so that the fuel market would clear with a lower quantity.
Published online: 24 February 2016
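Rating-based compliance under an intensity standard can be sketched as simple credit accounting. The fuels, ratings, quantities, and standard level below are hypothetical illustrations, not the LCFS's actual parameters.

```python
# Sketch of intensity-standard accounting: each unit of energy sold
# generates credits if its rated intensity is below the standard and
# deficits if above; the market complies when net credits are >= 0.
STANDARD = 90.0  # gCO2e/MJ -- assumed standard level

# (name, rated intensity in gCO2e/MJ, energy sold in MJ) -- all hypothetical
fuels = [
    ("gasoline", 99.0, 1_000_000.0),
    ("ethanol",  75.0,   150_000.0),
]

credits = sum((STANDARD - rating) * mj for _, rating, mj in fuels)
avg_intensity = sum(rating * mj for _, rating, mj in fuels) / sum(
    mj for *_, mj in fuels
)
compliant = credits >= 0.0
print(f"net credits: {credits:,.0f} gCO2e, "
      f"average intensity: {avg_intensity:.1f} gCO2e/MJ")
```

Because compliance depends on the assigned rating rather than on true emissions, lowering a fuel's rating changes the market-clearing mix even when actual emissions are unchanged, which is the lever the paper analyzes.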
General equilibrium rebound from energy efficiency innovation
Energy efficiency improvements "rebound" when economic responses undercut their direct energy savings. I show that general equilibrium channels typically amplify rebound by making consumption goods cheaper but typically dampen rebound by increasing demand for non-energy inputs to production and by changing the size of the energy supply sector. Improvements in the efficiency of the energy supply sector generate especially large rebound because they make energy cheaper in all other sectors. Quantitatively, general equilibrium channels reduce rebound in U.S. consumption good sectors from 39% to 28% but increase rebound in the energy supply sector from 42% to 80%.
© 2020 Elsevier B.V. Published online: 8 April 2020
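Using the standard definition (rebound is the share of engineering energy savings eroded by behavioral and market responses), the quantitative difference between the abstract's sectors is large. The 100 PJ engineering-savings figure below is a made-up illustration.

```python
def net_savings(engineering_savings, rebound):
    """Energy actually saved after rebound, in the same units as the input."""
    return engineering_savings * (1.0 - rebound)

# A hypothetical innovation projected (on engineering grounds) to save 100 PJ,
# evaluated at the abstract's general-equilibrium rebound estimates:
for label, rebound in [("consumption good sectors", 0.28),
                       ("energy supply sector", 0.80)]:
    print(f"{label}: {net_savings(100.0, rebound):.0f} PJ actually saved")
```

At 28% rebound roughly 72 PJ of the projected savings survive; at 80% rebound only about 20 PJ do, which is why efficiency gains in energy supply deliver far less than their engineering estimates suggest.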
Green Expectations: Current Effects of Anticipated Carbon Pricing
I report evidence that an anticipated strengthening of environmental policy increased emissions. I find that the breakdown of the U.S. Senate's 2010 climate effort generated positive excess returns in coal futures markets. This response appears to be driven by an increase in coal storage. The proposed legislation aimed to reduce U.S. greenhouse gas emissions after 2013, but the legislative process itself may have increased emissions by over 12 million tons of carbon dioxide leading up to April 2010.
Published online: 17 July 2017
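The scale of the storage channel can be checked with back-of-envelope stoichiometry. The coal carbon content below is an assumed round number for illustration, not a figure from the paper.

```python
# Rough conversion from CO2 tonnage to coal tonnage: oxidizing carbon
# yields CO2 at a mass ratio of 44/12, and bituminous coal is roughly
# 70% carbon by mass (an assumption made for illustration).
CO2_PER_TON_COAL = 0.70 * 44.0 / 12.0  # ~2.57 t CO2 per t coal

excess_co2 = 12_000_000.0  # tons CO2, the figure from the abstract
implied_coal = excess_co2 / CO2_PER_TON_COAL
print(f"~{implied_coal / 1e6:.1f} million tons of coal")
```

Under these assumptions, the 12 million tons of CO2 correspond to shifting on the order of five million tons of coal through storage and combustion timing.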
Replication data for: Green Expectations: Current Effects of Anticipated Carbon Pricing
Review of Economics and Statistics: Forthcoming
Characterizing and responding to uncertainty in climate change
The development and analysis of climate policy proposals intertwine with the structure of knowledge and the possibility for changing it. Key questions concern the long-term interaction between policy, technology, infrastructure, and the earth system, but each of these components is deeply uncertain. This dissertation advances the description of knowledge about the climate system, the assessment of economic responses to climatic possibilities, and the development of policy that positions society to achieve long-term climate goals. It offers new paths to describing understanding of complex systems and to modeling optimal management under structural uncertainty.

The first chapter formalizes uncertainty about equilibrium climate change. Its hierarchical Bayes framework allows climate models to be incomplete and to share biases, and it shows how prior beliefs about models' completeness and independence interact with models' estimates of feedback strength to determine distributions for temperature change. When models might share biases, the results of additional models might tell us more about models' common structure than about the real-world processes they aim to represent. The most valuable information would then come not from related models but from alternate estimates that should carry a different set of unobservable biases. The possibility that models are wrong in common ways limits the degree to which models' estimates can narrow the probability distribution for feedback strength, which also limits our ability to rule out extreme climatic outcomes.

The second chapter empirically estimates a feedback that is especially difficult to model. Climate-carbon feedbacks (or carbon cycle feedbacks) describe the effect of temperature on carbon dioxide (CO2). If they are positive, then not only does anthropogenic CO2 cause warming via the greenhouse effect and earth system feedbacks, but this warming itself increases CO2 and so causes further warming. Previous empirical work estimated a stronger feedback than did coupled climate-carbon cycle models. However, those empirical estimates were probably biased upwards, while coupled models' estimates were primarily driven by a few ill-constrained parameters. This chapter attempts to obtain an unbiased estimate of climate-carbon feedback strength by using variations in summer radiation in the Arctic (i.e., variations in orbital forcing) to identify the effect of temperature on CO2 in 800 ky ice core records. It finds a range for climate-carbon feedbacks that is closer to coupled models' estimates than to previous empirical work. Since climate-carbon feedbacks are probably positive, temperature change projections tend to underestimate an emission path's consequences if they do not allow the carbon cycle to respond to changing temperatures.

The next three chapters assess economic responses to climate change in a policy-optimizing integrated assessment model, in games with long-lived investments into abatement capital, and in a cost-effectiveness model with multiple policy options stretching over long time horizons. The first of these chapters extends a well-known integrated assessment model to include the possibility of abrupt shifts in the climate system. It also changes the model's structure to make the decision-maker aware of uncertainty and of the possibility for learning over time, and it generalizes the welfare evaluation to reflect that uncertainty about temperature change is qualitatively unlike uncertainty about climate thresholds. It finds that tipping points can increase the near-term social cost of carbon by more than 50% when they raise climate sensitivity or make damages more convex. They have less of an effect when they increase the atmospheric lifetime of CO2 or the quantity of non-CO2 greenhouse gases. Allowing the policymaker to be differentially averse to consumption fluctuations over time and over risk increases the near-term social cost of carbon by 150%, with tipping point possibilities then increasing it by another 50%. The possibility of tipping points is more important for the social cost of carbon than is the ambiguity attitude the decision-maker uses in evaluating them.

The second of these climate economics chapters models the optimal emission tax when firms can adopt low-pollution technology that reduces abatement cost. The regulator anticipates this adoption but must set the tax before firms invest. In many cases, a linear emission tax cannot obtain both socially optimal investment and socially optimal emissions because the regulator either will set it inefficiently high to stimulate investment or will set it at an ex post optimal level that obtains inefficiently low investment. The difficulty is that an emission tax fixes both the incentive to invest and the incentive to abate, but these two goals rarely align perfectly when investment is lumpy. In contrast, tradable permit policies do not suffer this tension because the permit price responds automatically to realized investment. A numerical model then considers the ability of the regulator to select not only the level but also the duration of the tax. It shows that outcomes are still often socially inefficient. Further, the regulator will occasionally use a longer tax to obtain investment when firms expect their investments to lower the tax in the next period, but the cost of not being able to adjust the next period's tax limits the parameter space in which the longer tax is employed.

The fifth chapter constructs cost-effective dynamic policy portfolios of abatement, research and development (R&D), and negative emission technology deployment in order to achieve 21st century climate targets. It includes two types of stochastic technological change in a stylized numerical model and allows each type of technology to respond both to public R&D and to abatement policies. It compares worlds where negative emission technologies are and are not available, and it compares a world where the century's cumulative net emissions are constrained with a world in which threshold possibilities lead policy to constrain cumulative net emissions in each year during the century. It finds that R&D options are valuable and exercised but do not substitute for near-term abatement. The type of R&D undertaken depends on long-term emission goals because those determine the magnitude of future abatement. When the cumulative emission constraint is stringent, negative emission technologies substitute for near-term abatement and affect the type of R&D undertaken, but if threshold considerations eliminate the freedom to temporarily overshoot emission targets, negative emission technologies become less valuable. The availability of negative emission technologies provides a valuable option to partially undo previous emissions, but abatement also gains option value from increasing future flexibility to forgo reliance on negative emission technologies if the technology or climate prove problematic in the interim.

The concluding chapter directly connects uncertainty about climate change to uncertainty about the cost of achieving CO2 targets. It shows how beliefs about technology, temperature, and damages interact to affect the cost-effectiveness of climate targets. It finds that the speed with which damages increase at higher temperatures is the most important of these factors. Both 450 parts per million (ppm) and 550 ppm CO2 targets provide net benefits for quadratic damage functions that reduce annual output by less than the 1-2% estimated for 2.5°C of warming. Cubic damage functions support both CO2 targets even if 2.5°C of warming only reduces output by 0.2% or less. More convex damage functions significantly increase the range of damage functions that support these targets and decrease the importance of abatement cost uncertainty. In addition, because extreme feedback outcomes have little effect over the next decades, a thinner-tailed temperature distribution (resulting from optimistic prior beliefs about climate models' independence and biases) supports CO2 targets under slightly less severe damages than does the thicker-tailed distribution (resulting from skepticism about climate models' independence and biases). Emission reductions hedge against greater societal sensitivity to temperature increases while exposing society to the upside of positive technology surprises.

The epistemology of complex systems in an out-of-sample world is a key motif. This dissertation advances knowledge of climate change and understanding of policy design in settings with limited ability to predict future changes or responses. Further work should seek a more unified framework for describing and acting on knowledge of evolving complex systems.
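The feedback-to-temperature mapping behind the thick tails discussed above can be sketched in a few lines. The reference sensitivity and the feedback distribution below are illustrative assumptions in the spirit of standard feedback analyses, not the dissertation's estimates.

```python
import random

random.seed(0)
LAMBDA0 = 1.2  # assumed warming (C) per CO2 doubling with no feedbacks

def sensitivity(f):
    """Equilibrium warming per doubling given a total feedback factor f < 1."""
    return LAMBDA0 / (1.0 - f)

# Symmetric (normal) uncertainty in the feedback factor maps into a
# right-skewed, thick-tailed distribution for warming, because the
# amplification 1/(1-f) blows up as f approaches 1. The feedback draws
# are truncated at 0.95 purely to keep the toy example finite.
draws = sorted(sensitivity(min(random.gauss(0.65, 0.13), 0.95))
               for _ in range(100_000))
median = draws[len(draws) // 2]
p95 = draws[int(0.95 * len(draws))]
print(f"median: {median:.1f} C, 95th percentile: {p95:.1f} C")
```

The upper tail sits far above the median while the lower tail stays close to it, which is why narrowing beliefs about feedback strength (or about models' shared biases) matters so much for ruling out extreme outcomes.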