
    Growth form and leaf habit drive contrasting effects of Arctic amplification in long-lived woody species

    Current global change is inducing heterogeneous warming trends worldwide, with faster rates at higher latitudes in the Northern Hemisphere. Consequently, tundra vegetation is experiencing an increase in growth rate and an uneven but expanding distribution. Yet the drivers of this heterogeneity in woody species' responses are still unclear. Here, applying a retrospective approach and focusing on long-term responses, we aim to gain insight into the growth trends and climate sensitivity of long-lived woody species belonging to different functional types with contrasting growth forms and leaf habits (shrub vs. tree and deciduous vs. evergreen). A total of 530 samples from 7 species (common juniper, dwarf birch, woolly willow, Norway spruce, lodgepole pine, rowan, and downy birch) were collected at 10 sites across Iceland. We modelled growth trends and contrasted yearly ring-width measurements, filtered into high- and low-frequency components, against precipitation, land-surface and sea-surface temperature records (1967-2018). Shrubs and trees showed divergent growth trends: shrubs closely tracked the recent warming, whereas trees, especially broadleaved species, showed strong fluctuations but no long-term growth trend. Secondary growth, particularly its high-frequency component, was positively correlated with summer temperature for most species. By contrast, growth responses to sea-surface temperature, especially in the low-frequency component, diverged strongly between growth forms, with a strong positive association for shrubs and a negative one for trees. Within comparable vegetation assemblages, long-lived woody species can thus show contrasting responses to similar climatic conditions. Given the predominant role of oceanic masses in shaping climate patterns in the Arctic and Low Arctic, further investigation is needed to deepen our understanding of the complex interplay between coastal tundra ecosystems and land- and sea-surface temperature dynamics.
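
    A minimal sketch of this kind of analysis (not the authors' code): a ring-width series is split into low- and high-frequency components with a simple centred moving average, and each component is correlated with summer temperature. The placeholder data, the 15-year window, and the use of a Pearson correlation are assumptions for illustration only.

        import numpy as np
        from scipy.stats import pearsonr

        years = np.arange(1967, 2019)
        ring_width = np.random.default_rng(0).gamma(2.0, 0.5, size=years.size)    # placeholder series
        summer_temp = np.random.default_rng(1).normal(9.0, 1.0, size=years.size)  # placeholder series

        def moving_average(x, window=15):
            """Centred moving average; edges are padded by reflection."""
            pad = window // 2
            padded = np.pad(x, pad, mode="reflect")
            kernel = np.ones(window) / window
            return np.convolve(padded, kernel, mode="valid")

        low_freq = moving_average(ring_width)   # decadal-scale trend
        high_freq = ring_width - low_freq       # year-to-year variability

        for name, component in [("low", low_freq), ("high", high_freq)]:
            r, p = pearsonr(component, summer_temp)
            print(f"{name}-frequency vs. summer temperature: r = {r:.2f}, p = {p:.3f}")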

    Health-related quality of life of patients with implantable cardioverter defibrillators compared with that of pacemaker recipients

    AIMS: Studies indicate a poorer quality of life (QoL) for implantable cardioverter defibrillator (ICD) patients than for the general population. However, studies comparing the QoL of ICD patients with that of patients with other implantable cardiac devices are scarce. We hypothesized that ICD patients had a poorer QoL than pacemaker patients. METHODS AND RESULTS: All ICD patients living in Iceland at the beginning of 2002 (44 subjects) and a comparison group of 81 randomly selected patients with pacemakers were invited to participate. The Icelandic Quality of Life Questionnaire (IQL), the General Health Questionnaire (GHQ), the Beck Anxiety Inventory (BAI), and the Beck Depression Inventory (BDI) were administered to measure QoL, psychiatric distress, and symptoms of anxiety and depression. The ICD and pacemaker groups did not differ on IQL, BAI, BDI, or GHQ scores. As a group, ICD patients were more fearful of death (P = 0.056) and expressed more concerns about returning to work (P = 0.072), although these items fell just short of statistical significance. CONCLUSION: Contrary to our expectations, ICD patients had a QoL comparable to that of pacemaker recipients and were not more likely to suffer from anxiety, depression, or general psychiatric distress. These findings are encouraging in view of expanding ICD indications.

    Nominal GDP Targeting and the Zero Lower Bound: Should We Abandon Inflation Targeting?

    I compare nominal GDP level targeting to flexible inflation targeting in a small New Keynesian model subject to the zero lower bound on nominal policy rates. First, I study the performance of optimal discretionary policies. I find that, for a standard calibration, inflation targeting under discretion leaves the economy open to a deflationary trap. Nominal GDP level targeting under discretion, by contrast, provides a firm nominal anchor to the economy. Second, I study simple policy rules and the role of smoothing in the rules. With smoothing, a Taylor-type rule performs as well as a nominal GDP level rule. These results suggest that inflation targeting should not be abandoned. Still, it can be improved significantly by using policy-rate smoothing to anchor inflation firmly.
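
    As a rough illustration of the kind of rule studied (the paper's exact specification is not given in the abstract), a Taylor-type rule with interest-rate smoothing can be truncated at the zero lower bound as sketched below; all coefficient values here are assumed, not taken from the paper.

        def policy_rate(i_prev, inflation, output_gap,
                        r_star=0.01, pi_star=0.02,
                        rho=0.8, phi_pi=1.5, phi_y=0.5):
            """Nominal rate implied by a smoothed Taylor-type rule, floored at zero."""
            desired = r_star + pi_star + phi_pi * (inflation - pi_star) + phi_y * output_gap
            smoothed = rho * i_prev + (1.0 - rho) * desired
            return max(0.0, smoothed)  # zero lower bound

        # Example: a deep recession pushes the desired rate below zero, so the ZLB binds.
        print(policy_rate(i_prev=0.005, inflation=-0.01, output_gap=-0.05))  # 0.0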

    Greece’s Three-Act Tragedy: A Simple Model of Grexit vs. Staying Afloat inside the Single Currency Area

    Against the backdrop of the Greek three-act tragedy, we present a theoretical framework for studying Greece’s recent debt and currency crisis. The model is built on two essential blocks: first, erratic macroeconomic policymaking in Greece is described using a stochastic regime-switching model; second, the euro area governments’ responses to uncertain macroeconomic policies in Greece are considered. The model’s mechanism and assumptions allow either for a Grexit from the euro area or, conversely, for the avoidance of Greece’s default against its creditors. The model also offers useful guidance for understanding the key drivers of the long-winded negotiations between the Syriza government and the euro area governments.
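
    A minimal sketch of the regime-switching idea, under assumed parameters (the abstract does not report the model's transition probabilities or policy variables): fiscal policy alternates between a "disciplined" and an "erratic" regime according to a two-state Markov chain.

        import numpy as np

        P = np.array([[0.9, 0.1],    # from disciplined: stay / switch to erratic
                      [0.3, 0.7]])   # from erratic:     switch back / stay
        deficit_by_regime = {0: 0.02, 1: 0.08}  # assumed primary deficit, share of GDP

        rng = np.random.default_rng(42)
        regime, path = 0, []
        for _ in range(20):
            regime = rng.choice(2, p=P[regime])
            path.append(deficit_by_regime[regime])

        print(path)  # one simulated 20-year deficit path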

    What Fiscal Policy is Effective at Zero Interest Rates?

    Tax cuts can deepen a recession if the short-term nominal interest rate is zero, according to a standard New Keynesian business cycle model. An example of a contractionary tax cut is a reduction in taxes on wages. This tax cut deepens a recession because it increases deflationary pressures. Another example is a cut in capital taxes. This tax cut deepens a recession because it encourages people to save instead of spend at a time when more spending is needed. Fiscal policies aimed directly at stimulating aggregate demand work better. These policies include (1) a temporary increase in government spending and (2) tax cuts aimed directly at stimulating aggregate demand rather than aggregate supply, such as an investment tax credit or a cut in sales taxes. The results are specific to an environment in which the interest rate is close to zero, as observed in large parts of the world today.

    Modeling Historic Rangeland Management and Grazing Pressures in Landscapes of Settlement

    Defining historic grazing pressures and rangeland management is vital if early landscape threshold crossing and long-term trajectories of landscape change are to be properly understood. In this paper we use a new environmental simulation model, Búmodel, to assess two contrasting historical grazing landscapes in Mývatnssveit, Iceland, for two key periods: the colonization period (Landnám, ca. A.D. 872–1000) and the early eighteenth century A.D. Results suggest that there were spatial and temporal variations in productivity and grazing pressure within and between historic grazing areas, and indicate that land degradation was not an inevitable consequence of the livestock grazing introduced with settlement. The results also demonstrate the significance of grazing and livestock management strategies in preventing overgrazing, particularly under cooler climatic conditions. The model enables detailed consideration of historic grazing management scenarios and their associated landscape pressures.
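
    Since the abstract does not describe Búmodel's internals, the following is only a generic sketch of the sort of grazing-pressure calculation such a model might perform: annual fodder demand of a herd relative to the utilisable biomass a grazing area produces. All parameter values are invented for illustration.

        def grazing_pressure(n_livestock, demand_per_head_kg, area_ha, productivity_kg_per_ha):
            """Ratio of annual fodder demand to annual utilisable biomass (>1 = overgrazing risk)."""
            demand = n_livestock * demand_per_head_kg
            supply = area_ha * productivity_kg_per_ha
            return demand / supply

        # Example: a cooler summer cuts productivity and pushes the same herd past capacity.
        print(grazing_pressure(120, 1500.0, 400.0, 600.0))  # warm year -> 0.75
        print(grazing_pressure(120, 1500.0, 400.0, 400.0))  # cool year -> 1.125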

    The sequences of 150,119 genomes in the UK Biobank

    Detailed knowledge of how diversity in the sequence of the human genome affects phenotypic diversity depends on a comprehensive and reliable characterization of both sequences and phenotypic variation. Over the past decade, insights into this relationship have been obtained from whole-exome sequencing or whole-genome sequencing of large cohorts with rich phenotypic data (1,2). Here we describe the analysis of whole-genome sequencing of 150,119 individuals from the UK Biobank (3). This constitutes a set of high-quality variants, including 585,040,410 single-nucleotide polymorphisms, representing 7.0% of all possible human single-nucleotide polymorphisms, and 58,707,036 indels. This large set of variants allows us to characterize selection based on sequence variation within a population through a depletion rank score of windows along the genome. Depletion rank analysis shows that coding exons represent a small fraction of regions in the genome subject to strong sequence conservation. We define three cohorts within the UK Biobank: a large British Irish cohort, a smaller African cohort and a South Asian cohort. A haplotype reference panel is provided that allows reliable imputation of most variants carried by three or more sequenced individuals. We identified 895,055 structural variants and 2,536,688 microsatellites, groups of variants typically excluded from large-scale whole-genome sequencing studies. Using this formidable new resource, we provide several examples of trait associations for rare variants with large effects not found previously through studies based on whole-exome sequencing and/or imputation.
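
    The exact definition of the depletion rank score is not given in the abstract; the sketch below shows one simple way such a score could be computed, ranking fixed-size genomic windows by how few variants they carry relative to other windows. The window counts are placeholder values.

        import numpy as np

        def depletion_rank(variant_counts_per_window):
            """Percentile rank of each window's variant count: low rank = strongly depleted."""
            counts = np.asarray(variant_counts_per_window, dtype=float)
            ranks = counts.argsort().argsort()            # 0 = window with the fewest variants
            return 100.0 * ranks / (len(counts) - 1)      # scale to 0-100

        # Example: windows 3 and 0 carry the fewest variants and get the lowest ranks.
        print(depletion_rank([12, 85, 40, 9, 60]))  # [25. 100. 50. 0. 75.]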

    The Macroeconomic Effects of the Euro Area's Fiscal Consolidation 2011-2013: A Simulation-Based Approach

    We simulate the Euro Area's fiscal consolidation between 2011 and 2013 by employing two DSGE models used by the ECB and the European Commission, respectively. The cumulative multiplier amounts to 0.7 and 1.0 in the baseline, but increases to 1.3 with a reasonably calibrated financial accelerator and a crisis-related increase in the share of liquidity-constrained households. In the latter scenario, fiscal consolidation would be largely responsible for the decline in the output gap from 2011 to 2013. Postponing the fiscal consolidation to a period of unconstrained monetary policy (until after the economic recovery) would have avoided most of these losses.
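
    For illustration only, a cumulative multiplier of the kind reported can be computed as the accumulated output loss divided by the accumulated fiscal consolidation over the same horizon; the yearly figures below are made up and are not the paper's simulation output.

        def cumulative_multiplier(output_losses, fiscal_impulses):
            """Sum of output changes divided by sum of fiscal impulses over the same horizon."""
            return sum(output_losses) / sum(fiscal_impulses)

        # Example: three years of consolidation (percent of GDP) and a simulated output response.
        consolidation = [1.0, 1.5, 1.0]        # discretionary tightening per year
        output_response = [-0.8, -1.6, -1.1]   # deviation of output from baseline
        print(abs(cumulative_multiplier(output_response, consolidation)))  # ~1.0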