160 research outputs found

    Framework Proposal for a US Upstream Greenhouse Gas Tax with WTO-Compliant Border Adjustments

    Discussions regarding policies to limit greenhouse gas (GHG) emissions have been ongoing for decades, and GHG policies of various types have been implemented for years in many countries. In practice, countries that adopt GHG policies utilize a portfolio that typically includes a mix of standards, subsidies, mandates and price-based policies, each directed at particular economic sectors. In view of obvious inefficiencies and lack of synergies resulting from the portfolio approach, economists and many others have convincingly argued that setting a price on carbon—and other GHG emissions—using an economy-wide, upstream GHG tax would be the most effective and efficient policy to address GHG emissions. Its effectiveness stems from being able to cover all emissions from production and use of fossil fuels by applying the tax on producers of coal, oil, and gas resources at the mine mouth and wellhead before they are combusted, rather than dealing with actual emissions from millions of individual sources and actors throughout the economy. Its efficiency stems from allowing markets, rather than the political process, to identify and implement the most cost-effective steps to reduce emissions through decisions that affect current operations and purchases, and through decisions now about investment, research and development to invent and deploy more effective solutions to reduce future GHG emissions. Myriad issues must be addressed to design and approve legislation to implement an upstream, economy-wide GHG tax. This report does not address that galaxy of challenges and opportunities. Rather, assuming that an upstream GHG tax could be implemented, the report addresses the challenge of border adjustments for exports and imports in the context of a domestic upstream GHG tax, as described below. 
The domestic GHG tax could cause energy-intensive industries to shift production to countries without comparable pricing, resulting in “leakage” of GHG emissions that the domestic tax aims to prevent. By shifting production from the United States, the tax would also disadvantage domestic manufacturers, their employees, and the communities where they operate. Hence the call by many to introduce border adjustments: imposing equivalent GHG pricing on imported products from energy-intensive, trade-exposed (EITE) industries, and rebating the impact of the upstream tax on the cost of products exported by domestic producers. However, doing this has raised concerns about consistency with the rules of the World Trade Organization (WTO). Here we propose a Framework for a US climate policy with border adjustments that are compatible with US obligations under WTO agreements. It is based on an upstream tax on GHG emissions, with rebates for exports and charges on imports of products from EITE industries. A companion Compendium (forthcoming) provides additional details on implementing border adjustments, with specific recommendations for 35 EITE industries. The proposed border measures are designed in a non-discriminatory fashion, with the intent and effect of reducing global GHG emissions. Therefore, the border adjustments proposed as part of the Framework will not give rise to any valid claims of WTO violations. Even if such claims were raised, a strong defense could be made under the exceptions to the WTO rules.
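
The two-part mechanism the Framework describes can be sketched in a few lines of code. The following is a minimal illustration only, not the report's method: the tax rate, emissions factors, fuel units, and function names are all hypothetical.

```python
# Illustrative sketch: an upstream GHG tax applied at the mine mouth or
# wellhead, plus a symmetric border adjustment for EITE products.
# All numbers below are invented for illustration.

TAX_PER_TONNE_CO2E = 50.0  # hypothetical tax rate, USD per tonne CO2e

# Hypothetical upstream emissions factors (tonnes CO2e per fuel unit)
EMISSIONS_FACTOR = {
    "coal_tonne": 2.4,
    "oil_barrel": 0.43,
    "gas_mcf": 0.055,
}

def upstream_tax(fuel: str, quantity: float) -> float:
    """Tax collected from the producer before the fuel is combusted."""
    return EMISSIONS_FACTOR[fuel] * quantity * TAX_PER_TONNE_CO2E

def border_adjustment(embodied_co2e: float, direction: str) -> float:
    """Charge imports and rebate exports by the embodied GHG cost."""
    charge = embodied_co2e * TAX_PER_TONNE_CO2E
    return charge if direction == "import" else -charge
```

Because the same rate applies to the embodied emissions of imports and of domestic production, and exports are rebated at that rate, the adjustment neither penalizes nor favors foreign producers, which is the non-discrimination property the Framework relies on.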

    The effect of functional roles on perceived group efficiency during computer-supported collaborative learning

    In this article, the effect of functional roles on group performance and collaboration during computer-supported collaborative learning (CSCL) is investigated. In particular, the need to triangulate multiple methods is emphasised: Likert-scale evaluation questions, quantitative content analysis of e-mail communication, and qualitative analysis of open-ended questions were used. A comparison of forty-one questionnaire observations, distributed over thirteen groups in two research conditions – groups with prescribed functional roles (n = 7, N = 18) and nonrole groups (n = 6, N = 23) – revealed no main effect for performance (grade). Principal axis factoring of the Likert scales revealed a latent variable that was interpreted as perceived group efficiency (PGE). Multilevel modelling (MLM) yielded a positive marginal effect of PGE: most groups in the role condition report a higher degree of PGE than nonrole groups. Content analysis of the e-mail communication of all groups in both conditions (role n = 7, N = 25; nonrole n = 6, N = 26) revealed that students in role groups contribute more ‘coordination’-focussed statements. Finally, results from cross-case matrices of student responses to open-ended questions support the observed marginal effect that most role groups report a higher degree of perceived group efficiency than nonrole groups.

    Content analysis: What are they talking about?

    Quantitative content analysis is increasingly used to surpass surface-level analyses in Computer-Supported Collaborative Learning (e.g., counting messages), but critical reflection on accepted practice has generally not been reported. A review of CSCL conference proceedings revealed a general vagueness in definitions of units of analysis. In general, arguments for choosing a unit were lacking, and decisions made while developing the content analysis procedures were not made explicit. In this article, it is illustrated that the currently accepted practices concerning the ‘unit of meaning’ are not generally applicable to quantitative content analysis of electronic communication. Such analysis is affected by ‘unit boundary overlap’ and by contextual constraints having to do with the technology used. The analysis of e-mail communication required a different unit of analysis and segmentation procedure. This procedure proved to be reliable, and the subsequent coding of these units for quantitative analysis yielded satisfactory reliabilities. These findings have implications and yield recommendations for current content analysis practice in CSCL research.
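
The coding reliabilities such studies report are typically chance-corrected agreement statistics computed over the segmented units. As a hedged sketch (the coding categories and labels below are invented, and the article does not specify this exact statistic), Cohen's kappa for two coders can be computed as:

```python
# Cohen's kappa: chance-corrected agreement between two coders who each
# assigned a category to every segmented unit. Example labels invented.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed proportion of units on which the coders agree
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Agreement expected by chance, from each coder's marginal counts
    count_a, count_b = Counter(coder_a), Counter(coder_b)
    p_expected = sum(count_a[c] * count_b[c]
                     for c in set(count_a) | set(count_b)) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Two coders labeling six units of e-mail communication:
a = ["coord", "task", "task", "social", "coord", "task"]
b = ["coord", "task", "social", "social", "coord", "task"]
kappa = cohens_kappa(a, b)  # ≈ 0.75
```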

    Principles of International Law Relevant for Consideration in the Design and Implementation of Trade-Related Climate Measures and Policies. Report of an International Legal Expert Group.

    The report offers independent guidance for governments and stakeholders by eminent legal experts on principles of international law relevant for consideration in the design and implementation of trade-related climate measures and policies. The report reviews a set of recognized principles of international law that the expert group deems especially relevant, including: Sovereignty; Prevention; Cooperation; Prohibition of Arbitrary & Unjustifiable Discrimination; Sustainable Development, Equity, & CBDR-RC; and Transparency & Consultation. The vision driving this report is that shared understandings on such principles could help foster dialogue and international cooperation on the design and implementation of trade-related climate measures and policies in the context of sustainable development priorities. According to the expert group, trade-related climate measures and policies should be approached as legal hybrids: their rationale, design, and the debates about them draw from different areas of international law relating to the environment, climate, international trade, and general international law. The principles are analysed in a way that presents them as cumulative and simultaneously applicable, in a mutually supportive and coherent manner, giving full effect to all relevant parts of international law, insofar as possible.

    Context-aware modeling of neuronal morphologies

    © 2014 Torben-Nielsen and De Schutter. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). Neuronal morphologies are pivotal for brain functioning: physical overlap between dendrites and axons constrains the circuit topology, and the precise shape and composition of dendrites determine the integration of inputs to produce an output signal. At the same time, morphologies are highly diverse and variable. The variance, presumably, originates from neurons developing in a densely packed brain substrate where they interact (e.g., through repulsion or attraction) with other actors in this substrate. However, when studying neurons, their context is never part of the analysis and they are treated as if they existed in isolation. Here we argue that to fully understand neuronal morphology and its variance it is important to consider neurons in relation to each other and to other actors in the surrounding brain substrate, i.e., their context. We propose a context-aware computational framework, NeuroMaC, in which large numbers of neurons can be grown simultaneously according to growth rules expressed in terms of interactions between the developing neuron and the surrounding brain substrate. As a proof of principle, we demonstrate that by using NeuroMaC we can generate accurate virtual morphologies of distinct classes both in isolation and as part of neuronal forests. Accuracy is validated against population statistics of experimentally reconstructed morphologies. We show that context-aware generation of neurons can explain characteristics of variation. Indeed, plausible variation is an inherent property of the morphologies generated by context-aware rules. We speculate about the applicability of this framework to investigate morphologies and circuits, to classify healthy and pathological morphologies, and to generate large quantities of morphologies for large-scale modeling.
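
The kind of interaction-driven growth rule described above can be caricatured in a few lines. This is a toy two-dimensional sketch under invented weights and step sizes, not NeuroMaC's actual algorithm: a growth cone advances under attraction toward a target region and inverse-square repulsion from structures already present in the substrate.

```python
# Toy context-aware growth step: attraction to a target plus repulsion
# from neighboring structures. All weights and geometry are invented.
import math

def unit(v):
    """Normalize a 2-D vector; return the zero vector unchanged."""
    n = math.hypot(*v)
    return (v[0] / n, v[1] / n) if n > 0 else (0.0, 0.0)

def grow_step(pos, target, neighbors, w_attract=1.0, w_repel=0.5, step=1.0):
    """One elongation step: attraction toward `target` combined with
    inverse-square repulsion from each point in `neighbors`."""
    ax, ay = unit((target[0] - pos[0], target[1] - pos[1]))
    rx = ry = 0.0
    for qx, qy in neighbors:
        dx, dy = pos[0] - qx, pos[1] - qy
        d = math.hypot(dx, dy) or 1e-9
        rx += dx / (d * d)
        ry += dy / (d * d)
    ux, uy = unit((w_attract * ax + w_repel * rx,
                   w_attract * ay + w_repel * ry))
    return (pos[0] + step * ux, pos[1] + step * uy)

# Grow toward (10, 0) while avoiding an obstacle near the path at (5, 0.5):
p = (0.0, 0.0)
for _ in range(5):
    p = grow_step(p, (10.0, 0.0), [(5.0, 0.5)])
```

Running many cones with different random seeds or substrate contents yields different but plausible trajectories, which is the sense in which variation falls out of context-aware rules rather than being imposed.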

    New genetic loci implicated in fasting glucose homeostasis and their impact on type 2 diabetes risk.

    Levels of circulating glucose are tightly regulated. To identify new loci influencing glycemic traits, we performed meta-analyses of 21 genome-wide association studies informative for fasting glucose, fasting insulin and indices of beta-cell function (HOMA-B) and insulin resistance (HOMA-IR) in up to 46,186 nondiabetic participants. Follow-up of 25 loci in up to 76,558 additional subjects identified 16 loci associated with fasting glucose and HOMA-B and two loci associated with fasting insulin and HOMA-IR. These include nine loci newly associated with fasting glucose (in or near ADCY5, MADD, ADRA2A, CRY2, FADS1, GLIS3, SLC2A2, PROX1 and C2CD4B) and one influencing fasting insulin and HOMA-IR (near IGF1). We also demonstrated association of ADCY5, PROX1, GCK, GCKR and DGKB-TMEM195 with type 2 diabetes. Within these loci, likely biological candidate genes influence signal transduction, cell proliferation, development, glucose-sensing and circadian regulation. Our results demonstrate that genetic studies of glycemic traits can identify type 2 diabetes risk loci, as well as loci containing gene variants that are associated with a modest elevation in glucose levels but are not associated with overt diabetes.
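
HOMA-B and HOMA-IR are simple closed-form indices derived from the fasting measurements the study uses. The widely used homeostasis model assessment approximation formulas can be computed as follows; the example patient values are invented:

```python
def homa_ir(glucose_mmol_l: float, insulin_uU_ml: float) -> float:
    """Homeostasis model assessment of insulin resistance:
    fasting glucose (mmol/L) x fasting insulin (uU/mL) / 22.5."""
    return glucose_mmol_l * insulin_uU_ml / 22.5

def homa_b(glucose_mmol_l: float, insulin_uU_ml: float) -> float:
    """Homeostasis model assessment of beta-cell function (%):
    20 x fasting insulin (uU/mL) / (fasting glucose (mmol/L) - 3.5)."""
    return 20.0 * insulin_uU_ml / (glucose_mmol_l - 3.5)

# Invented example: fasting glucose 5.0 mmol/L, fasting insulin 9 uU/mL
ir = homa_ir(5.0, 9.0)  # 2.0
b = homa_b(5.0, 9.0)    # 120.0
```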

    Translating Pharmacogenomics: Challenges on the Road to the Clinic

    Pharmacogenomics is one of the first clinical applications of the postgenomic era. It promises personalized medicine rather than the established “one size fits all” approach to drugs and dosages. The expected reduction in trial and error should ultimately lead to more efficient and safer drug therapy. In recent years, commercially available pharmacogenomic tests have been approved by the Food and Drug Administration (FDA), but their application in patient care remains very limited. More generally, the implementation of pharmacogenomics in routine clinical practice presents significant challenges. This article presents specific clinical examples of such challenges and discusses how obstacles to the implementation of pharmacogenomic testing can be addressed.