Managing the Transition to Climate Stabilization
This paper builds upon recent work by the US Climate Change Science Program (CCSP). Among its products, the CCSP developed new emission projections for the major man-made greenhouse gases, explored the effects of emission limits on the energy system, and calculated the costs of various stabilization constraints to the economy. This paper applies one of the models used for that analysis to explore the sensitivity of the results to three potentially critical factors: the stabilization level, the policy design, and the availability and costs of low- to zero-emitting technologies. The major determinant of costs is likely to be something over which we have little control: Mother Nature. The choice of stabilization level will reflect our understanding of the science of global climate change. We have little control over many of the key bio-geophysical processes which, to a major extent, will determine what constitutes dangerous anthropogenic interference with the climate system. We consider two limits on radiative forcing, corresponding to stabilizing CO2 concentrations at approximately 450 ppmv and 550 ppmv. These levels have been chosen because of the fundamentally different nature of the challenge posed by each. In the case of the lower concentration limit, emission reductions will be required virtually immediately and annual GDP losses to the US could approach 5%. With the higher concentration limit, the pressure for a sharp reduction in near-term emissions is not as great. This offers some potential to reduce GDP losses. Indeed, we find that, depending upon the concentration limit, implementing market mechanisms which take advantage of 'where' and 'when' flexibility can markedly reduce GDP losses, perhaps by as much as an order of magnitude. However, for a variety of reasons, our ability to realize such savings may be compromised. One possible impediment relates to the proximity to the target: if the limit is imminent, flexibility will be greatly reduced. The nature of the coalition and our willingness to permit 'borrowing' emission rights from the future will also affect the magnitude of the potential savings. As a result, the reduction in GDP losses from where and when flexibility may turn out to be only a small fraction of what has been previously estimated. Fortunately, the biggest opportunity for managing costs may come from something over which we do have considerable control. We find that investments in climate-friendly technologies can reduce GDP losses to the US by a factor of two or more. At present, we have insufficient economically competitive substitutes for high-carbon-emitting technologies. The development of low- to zero-emitting alternatives will require both a sustained commitment on the part of the public sector upstream in the R&D chain and incentives for the private sector to bring the necessary technologies to the marketplace. Aside from helping to assure that environmental goals are met in an economically efficient manner, climate policy can also serve as an enabler of new technologies. By recognizing the acute shortage of low-cost substitutes, the long lead times required for development and deployment, and the market failures that impede technological progress, climate policy can play an important role in reducing the long-term costs of the transition.
Inter- and intra-specimen variability masks reliable temperature control on shell Mg/Ca ratios in laboratory- and field-cultured Mytilus edulis and Pecten maximus (Bivalvia).
The Mg/Ca ratio of biogenic calcite is commonly seen as a valuable palaeo-proxy for reconstructing past ocean temperatures. The temperature dependence of Mg/Ca ratios in bivalve calcite has been the subject of contradictory observations. The palaeoceanographic use of a geochemical proxy is dependent on initial, rigorous calibration and validation of relationships between the proxy and the ambient environmental variable to be reconstructed. Shell Mg/Ca ratio data are reported for the calcite of two bivalve species, Mytilus edulis (common mussel) and Pecten maximus (king scallop), which were grown in laboratory culturing experiments at controlled and constant aquarium seawater temperatures over a range from 10 to 20 °C. Furthermore, Mg/Ca ratio data of laboratory- and field-grown M. edulis specimens were compared. Only a weak, albeit significant, shell Mg/Ca ratio-temperature relationship was observed in the two bivalve species: M. edulis (r2 = 0.37, p < 0.001 for laboratory-cultured specimens and r2 = 0.50, p < 0.001 for field-cultured specimens) and P. maximus (r2 = 0.21, p < 0.001 for laboratory-cultured specimens only). In the two species, shell Mg/Ca ratios were not found to be controlled by shell growth rate or salinity. The Mg/Ca ratios in the shells exhibited a large degree of variability among and within species and individuals. The results suggest that the use of bivalve calcite Mg/Ca ratios as a temperature proxy is limited, at least in the species studied to date. Such limitations are most likely due to the presence of physiological effects on Mg incorporation in bivalve calcite. The utilization is further limited by the great variability both within and among shells of the same species that were precipitated under the same ambient conditions.
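The calibration step described above amounts to an ordinary least-squares regression of shell Mg/Ca against the controlled culture temperature, with the quoted r2 and p values summarising the strength of the fit. A minimal sketch of that step in Python, using made-up placeholder values rather than the study's data:

```python
# Minimal calibration sketch: regress shell Mg/Ca on culture temperature.
# The values below are hypothetical placeholders, not data from the study.
import numpy as np
from scipy import stats

temperature = np.array([10.0, 12.5, 15.0, 17.5, 20.0, 10.0, 15.0, 20.0])  # °C
mg_ca = np.array([8.1, 9.4, 10.2, 11.8, 12.5, 9.0, 11.1, 13.3])           # mmol/mol

result = stats.linregress(temperature, mg_ca)

print(f"slope     = {result.slope:.3f} mmol/mol per °C")
print(f"intercept = {result.intercept:.3f} mmol/mol")
print(f"r^2       = {result.rvalue**2:.2f}")  # compare with the r2 values quoted above
print(f"p-value   = {result.pvalue:.3g}")
```

A low r2 with a significant p-value, as reported for both species, indicates a real but weak relationship: much of the shell-to-shell and within-shell variability is not explained by temperature, which is the basis for the paper's cautious conclusion about the proxy.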
Arcadia, a software development environment research project
The research objectives of the Arcadia project are two-fold: discovery and development of environment architecture principles and creation of novel software development tools, particularly powerful analysis tools, which will function within an environment built upon these architectural principles.

Work in the architecture area is concerned with providing the framework to support integration while also supporting the often conflicting goal of extensibility. Thus, this area of research is directed toward achieving external integration by providing a consistent, uniform user interface, while still admitting customization and addition of new tools and interface functions. In an effort to also attain internal integration, research is aimed at developing mechanisms for structuring and managing the tools and data objects that populate a software development environment, while facilitating the insertion of new kinds of tools and new classes of objects.

The unifying theme of work in the tools area is support for effective analysis at every stage of a software development project. Research is directed toward tools suitable for analyzing pre-implementation descriptions of software, software itself, and towards the production of testing and debugging tools. In many cases, these tools are specifically tailored for applicability to concurrent, distributed, or real-time software systems.

The initial focus of Arcadia research is on creating a prototype environment, embodying the architectural principles, which supports Ada software development. This prototype environment is itself being developed in Ada.

Arcadia is being developed by a consortium of researchers from the University of California at Irvine, the University of Colorado at Boulder, the University of Massachusetts at Amherst, TRW, Incremental Systems Corporation, and The Aerospace Corporation. This paper delineates the research objectives and describes the approaches being taken, the organization of the research endeavor, and the current status of the work.
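The internal-integration goal described above, managing a population of tools and typed data objects while admitting new kinds of both, is essentially an extensibility problem. Below is a minimal, hypothetical Python sketch of such a tool-and-object registry; it is illustrative only and does not reflect the actual Arcadia interfaces, which were developed in Ada.

```python
# Hypothetical sketch of an extensible environment registry: new data objects
# and new analysis tools can be added without modifying existing code.
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class Environment:
    """Holds named data objects and the tools registered to operate on them."""
    objects: Dict[str, object] = field(default_factory=dict)
    tools: Dict[str, Callable[[object], str]] = field(default_factory=dict)

    def add_object(self, name: str, obj: object) -> None:
        # Internal integration point: any new class of data object is admitted.
        self.objects[name] = obj

    def register_tool(self, name: str, tool: Callable[[object], str]) -> None:
        # Extensibility point: any new analysis tool plugs in here.
        self.tools[name] = tool

    def run(self, tool_name: str, object_name: str) -> str:
        # Apply a registered tool to a registered object.
        return self.tools[tool_name](self.objects[object_name])


# Usage: register a trivial "analysis" tool and apply it to a source object.
env = Environment()
env.add_object("hello.adb", "procedure Hello is begin null; end Hello;")
env.register_tool("line_count", lambda src: f"{len(str(src).splitlines())} line(s)")
print(env.run("line_count", "hello.adb"))
```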
An examination of potential controls on shell Mn/Ca ratios in the calcite of the bivalve Mytilus edulis
Mg/Ca within the shell of the bivalve Pecten maximus: a new palaeotemperature proxy and implications for bivalve shell geochemical proxies
Next generation software environments: principles, problems, and research directions
The past decade has seen a burgeoning of research and development in software environments. Conferences have been devoted to the topic of practical environments, journal papers produced, and commercial systems sold. Given all the activity, one might expect a great deal of consensus on issues, approaches, and techniques. This is not the case, however. Indeed, the term "environment" is still used in a variety of conflicting ways. Nevertheless, substantial progress has been made and we are at least nearing consensus on many critical issues.

The purpose of this paper is to characterize environments, describe several important principles that have emerged in the last decade or so, note current open problems, and describe some approaches to these problems, with particular emphasis on the activities of one large-scale research program, the Arcadia project. Consideration is also given to two related topics: empirical evaluation and technology transition. That is, how can environments and their constituents be evaluated, and how can new developments be moved effectively into the production sector?
Weak Coupling Phase Diagram of the Two Chain Hubbard Model
We present a general method for determining the phase diagram of systems of a finite number of one-dimensional Hubbard-like systems coupled by single-particle hopping with weak interactions. The technique is illustrated by detailed calculations for the two-chain Hubbard model, providing the first controlled results for arbitrary doping and inter-chain hopping. Of nine possible states which could occur in such a spin ladder, we find seven at weak coupling. We discuss the conditions under which the model can be regarded as a one-dimensional analog of a superconductor.

Comment: 5 pages, self-unpacking uuencoded compressed postscript file. Also available on the WWW at http://rheims.itp.ucsb.edu/~balents/index.htm
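For orientation, the two-chain (two-leg ladder) Hubbard model referred to above is conventionally written in the following standard form; this is the textbook Hamiltonian, assumed here for illustration, and the paper's own notation and conventions may differ.

```latex
% Standard two-chain Hubbard Hamiltonian; \ell = 1, 2 labels the chains.
H = -t \sum_{i,\ell,\sigma} \left( c^{\dagger}_{i,\ell,\sigma} c_{i+1,\ell,\sigma} + \mathrm{h.c.} \right)
    - t_{\perp} \sum_{i,\sigma} \left( c^{\dagger}_{i,1,\sigma} c_{i,2,\sigma} + \mathrm{h.c.} \right)
    + U \sum_{i,\ell} n_{i,\ell,\uparrow}\, n_{i,\ell,\downarrow}
```

Here t and t_perp are the intra- and inter-chain hopping amplitudes and U is the on-site interaction; "weak coupling" refers to the regime in which U is small compared with the bandwidth set by the hopping terms.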
Disease characteristics and treatment of patients with diabetes mellitus attending government health services in Indonesia, Peru, Romania and South Africa.
OBJECTIVE: To describe the characteristics and management of diabetes mellitus (DM) patients from low- and middle-income countries (LMIC). METHODS: We systematically characterized consecutive DM patients attending public health services in urban settings in Indonesia, Peru, Romania and South Africa, collecting data on DM treatment history, complications, drug treatment, obesity, HbA1c, and cardiovascular risk profile, and assessing treatment gaps against relevant national guidelines. RESULTS: Patients (median age 59 years, 62.9% female) mostly had type 2 diabetes (96%), half for >5 years (48.6%). Obesity (45.5%) and central obesity (females 84.8%; males 62.7%) were common. The median HbA1c was 8.7% (72 mmol/mol), ranging from 7.7% (61 mmol/mol; Peru) to 10.4% (90 mmol/mol; South Africa). Antidiabetes treatment included metformin (62.6%), insulin (37.8%), and other oral glucose-lowering drugs (34.8%). Disease complications included eyesight problems (50.4%), eGFR <60 ml/min (18.9%), heart disease (16.5%), and proteinuria (14.7%). Many had an elevated cardiovascular risk, with elevated blood pressure (36%), elevated LDL (71.0%), and smoking (13%), but few were taking antihypertensive drugs (47.1%), statins (28.5%) and aspirin (30.0%) when indicated. Few patients on insulin (8.0%), statins (8.4%) and antihypertensives (39.5%) reached treatment targets according to national guidelines. There were large differences between countries in terms of disease profile and medication use. CONCLUSION: DM patients in government clinics in four LMIC with considerable growth of DM have insufficient glycemic control, frequent macrovascular and other complications, and insufficient preventive measures for cardiovascular disease. These findings underline the need to identify treatment barriers and secure optimal DM care in such settings.
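The paired percent and mmol/mol HbA1c values quoted above are consistent with the standard NGSP-to-IFCC master equation, NGSP(%) = 0.09148 x IFCC(mmol/mol) + 2.152, rearranged to solve for the IFCC value. A minimal sketch of that conversion in Python (the function name is ours, not from the paper):

```python
# Convert HbA1c from NGSP percent to IFCC mmol/mol using the standard
# master equation NGSP(%) = 0.09148 * IFCC(mmol/mol) + 2.152, rearranged.
def hba1c_percent_to_mmol_per_mol(ngsp_percent: float) -> float:
    return (ngsp_percent - 2.152) / 0.09148

for pct in (7.7, 8.7, 10.4):  # the median values quoted in the abstract
    print(f"{pct}% -> {hba1c_percent_to_mmol_per_mol(pct):.0f} mmol/mol")
# Prints 61, 72 and 90 mmol/mol, matching the figures reported above.
```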
The antisaccade task as an index of sustained goal activation in working memory: modulation by nicotine
The antisaccade task provides a laboratory analogue of situations in which execution of the correct behavioural response requires the suppression of a more prepotent or habitual response. Errors (failures to inhibit a reflexive prosaccade towards a sudden-onset target) are significantly increased in patients with damage to the dorsolateral prefrontal cortex and patients with schizophrenia. Recent models of antisaccade performance suggest that errors are more likely to occur when the intention to initiate an antisaccade is insufficiently activated within working memory. Nicotine has been shown to enhance specific working memory processes in healthy adults. MATERIALS AND METHODS: We explored the effect of nicotine on antisaccade performance in a large sample (N = 44) of young adult smokers. Minimally abstinent participants attended two test sessions and were asked to smoke one of their own cigarettes between baseline and retest during one session only. RESULTS AND CONCLUSION: Nicotine reduced antisaccade errors and correct antisaccade latencies when delivered before optimum performance levels were achieved, suggesting that nicotine supports the activation of intentions in working memory during task performance. The implications of this research for current theoretical accounts of antisaccade performance, and for interpreting the increased rate of antisaccade errors found in some psychiatric patient groups, are discussed.
The role of technology for achieving climate policy objectives: Overview of the EMF 27 study on global technology and climate policy strategies