Examination of change factor methodologies for climate change impact assessment
Citation: Anandhi, A., Frei, A., Pierson, D. C., Schneiderman, E. M., Zion, M. S., Lounsbury, D., and Matonse, A. H. (2011), Examination of change factor methodologies for climate change impact assessment, Water Resour. Res., 47, W03501, doi:10.1029/2010WR009104.

A variety of methods are available to estimate values of meteorological variables at future times and at spatial scales that are appropriate for local climate change impact assessment. One commonly used method is Change Factor Methodology (CFM), sometimes referred to as delta change factor methodology. Although more sophisticated methods exist, CFM is still widely applicable and used in impact analysis studies. While there are a number of different ways by which change factors (CFs) can be calculated and used to estimate future climate scenarios, there are no clear guidelines available in the literature to decide which methodologies are most suitable for different applications. In this study several categories of CFM (additive versus multiplicative and single versus multiple) for a number of climate variables are compared and contrasted. The study employs several theoretical case studies, as well as a real example from the Cannonsville watershed, which supplies water to New York City, USA. Results show that in cases when the frequency distribution of Global Climate Model (GCM) baseline climate is close to the frequency distribution of observed climate, or when the frequency distribution of GCM future climate is close to the frequency distribution of GCM baseline climate, additive and multiplicative single CFMs provide comparable results. Two options to guide the choice of CFM are suggested. The first option is a detailed methodological analysis for choosing the most appropriate CFM. The second option is a default method for use under circumstances in which a detailed methodological analysis is too cumbersome.
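The additive and multiplicative single change factors contrasted in the abstract can be sketched as follows. This is a minimal illustration of the general technique, not the paper's implementation; array values and function names are invented for the example.

```python
import numpy as np

def additive_cf(obs, gcm_baseline, gcm_future):
    """Future scenario = observed + (mean GCM future - mean GCM baseline).
    An additive CF is typically used for temperature-like variables."""
    cf = np.mean(gcm_future) - np.mean(gcm_baseline)
    return obs + cf

def multiplicative_cf(obs, gcm_baseline, gcm_future):
    """Future scenario = observed * (mean GCM future / mean GCM baseline).
    A multiplicative CF is typically used for precipitation-like variables."""
    cf = np.mean(gcm_future) / np.mean(gcm_baseline)
    return obs * cf

obs = np.array([10.0, 12.0, 8.0])       # observed series
base = np.array([9.0, 11.0, 10.0])      # GCM baseline (mean 10.0)
future = np.array([11.0, 13.0, 12.0])   # GCM future (mean 12.0)
print(additive_cf(obs, base, future))        # each value shifted by +2.0
print(multiplicative_cf(obs, base, future))  # each value scaled by 1.2
```

When the GCM baseline and observed distributions are similar, the shift (+2.0) and the scaling (×1.2) move the observed series by comparable amounts, which is the regime in which the abstract reports that the two single CFMs agree.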
Molybdenum evidence for expansive sulfidic water masses in ~750 Ma oceans
The Ediacaran appearance of large animals, including motile bilaterians, is commonly hypothesized to reflect a physiologically enabling increase in atmospheric and oceanic oxygen abundances (pO2). To date, direct evidence for low oxygen in pre-Ediacaran oceans has focused on chemical signatures in the rock record that reflect conditions in local basins, but this approach is both biased to constrain only shallower basins and statistically limited when we seek to follow the evolution of mean ocean chemical state through time. Because the abundance and isotopic composition of molybdenum (Mo) in organic-rich euxinic sediments can vary in response to changes in global redox conditions, Mo geochemistry provides independent constraints on the global evolution of well-oxygenated environments. Here, we establish a theoretical framework to assess the global marine Mo cycle in the past from the abundance and isotope composition of ancient seawater. Further, we investigate the ~750 Ma Walcott Member of the Chuar Group, Grand Canyon, which accumulated in a rift basin with open connection to the ocean. Iron speciation data from upper Walcott shales indicate that local bottom waters were anoxic and sulfidic, consistent with their high organic content (up to 20 wt.%). Similar facies in Phanerozoic successions contain high concentrations of redox-sensitive metals, but in the Walcott Member, abundances of Mo and U, as well as Mo/TOC (~0.5 ppm/wt.%), are low. δ98Mo values also fall well below modern equivalents (0.99 ± 0.13‰ versus ~2.35‰ today). These signatures are consistent with model predictions in which sulfidic waters cover ~1–4% of the global seafloor, corresponding to a ~20–80 fold increase compared to the modern ocean. Therefore, our results suggest globally expansive sulfidic water masses in mid-Neoproterozoic oceans, bridging a nearly 700 million-year gap in previous Mo data.
We propose that anoxic and sulfidic (euxinic) conditions governed Mo cycling in the oceans even as ferruginous subsurface waters re-appeared 800–750 Ma, and we interpret this anoxic ocean state to reflect a markedly lower atmospheric and oceanic O2 level, consistent with the hypothesis that pO2 acted as an evolutionary barrier to the emergence of large motile bilaterian animals prior to the Ediacaran Period.
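The direction of the δ98Mo signal can be illustrated with a generic steady-state isotope mass balance. This is a rough sketch of the standard two-sink reasoning, not the study's actual model; the input value and fractionation offset below are illustrative placeholders.

```python
def seawater_d98mo(f_eux, d98_input=0.7, delta_other=2.0):
    """Seawater δ98Mo (‰) under a two-sink steady state, assuming euxinic
    sinks capture Mo with no fractionation (δ_eux ≈ δ_seawater) while all
    other sinks are offset from seawater by delta_other:
        d98_input = f_eux * d_sw + (1 - f_eux) * (d_sw - delta_other)
    Solving for d_sw gives the expression returned below."""
    return d98_input + (1.0 - f_eux) * delta_other

# A small euxinic sink fraction (modern-like ocean) keeps seawater δ98Mo heavy;
# expanding the euxinic sink pulls seawater δ98Mo down toward the input value.
print(seawater_d98mo(f_eux=0.05))  # ≈ 2.6
print(seawater_d98mo(f_eux=0.6))   # ≈ 1.5
```

This captures why low Walcott δ98Mo values (relative to the modern ~2.35‰) point toward a substantially larger global euxinic sink, in line with the model predictions quoted in the abstract.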
FEMA: fast and efficient mixed-effects algorithm for large sample whole-brain imaging data
The linear mixed-effects model (LME) is a versatile approach to account for dependence among observations. Many large-scale neuroimaging datasets with complex designs have increased the need for LME; however, LME has seldom been used in whole-brain imaging analyses due to its heavy computational requirements. In this paper, we introduce a fast and efficient mixed-effects algorithm (FEMA) that makes whole-brain vertex-wise, voxel-wise, and connectome-wide LME analyses in large samples possible. We validate FEMA with extensive simulations, showing that the estimates of the fixed effects are equivalent to standard maximum likelihood estimates but obtained with orders of magnitude improvement in computational speed. We demonstrate the applicability of FEMA by studying the cross-sectional and longitudinal effects of age on region-of-interest level and vertex-wise cortical thickness, as well as connectome-wide functional connectivity values derived from resting state functional MRI, using longitudinal imaging data from the Adolescent Brain Cognitive Development (ABCD) Study release 4.0. Our analyses reveal distinct spatial patterns for the annualized changes in vertex-wise cortical thickness and connectome-wide connectivity values in early adolescence, highlighting a critical time of brain maturation. The simulations and application to real data show that FEMA enables advanced investigation of the relationships between large numbers of neuroimaging metrics and variables of interest while considering complex study designs, including repeated measures and family structures, in a fast and efficient manner. The source code for FEMA is available at https://github.com/cmig-research-group/cmig_tools/
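The kind of model FEMA accelerates can be sketched with the generalized least squares (GLS) step for fixed effects in a random-intercept LME. This is a generic textbook illustration on simulated data, assuming the random-effect variances are known; it is not FEMA's actual algorithm, which estimates these components efficiently at scale.

```python
import numpy as np

rng = np.random.default_rng(0)
n_groups, per_group = 50, 4            # e.g. 50 families, 4 members each
n = n_groups * per_group
groups = np.repeat(np.arange(n_groups), per_group)

# Simulate y = X @ beta + group random intercept + noise
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
sigma_b, sigma_e = 1.0, 0.5
b = rng.normal(scale=sigma_b, size=n_groups)
y = X @ beta_true + b[groups] + rng.normal(scale=sigma_e, size=n)

# Random-intercept design matrix Z (n x n_groups), one column per group
Z = np.zeros((n, n_groups))
Z[np.arange(n), groups] = 1.0

# Marginal covariance V and the GLS fixed-effect estimate
V = sigma_e**2 * np.eye(n) + sigma_b**2 * (Z @ Z.T)
Vinv = np.linalg.inv(V)
beta_gls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
print(beta_gls)  # close to beta_true = [1.0, 2.0]
```

Repeating this solve for hundreds of thousands of vertices or connectome edges is what makes naive whole-brain LME expensive, which is the bottleneck the abstract describes FEMA as addressing.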
In search of the authentic nation: landscape and national identity in Canada and Switzerland
While the study of nationalism and national identity has flourished in the last decade, little attention has been devoted to the conditions under which natural environments acquire significance in definitions of nationhood. This article examines the identity-forming role of landscape depictions in two polyethnic nation-states: Canada and Switzerland. Two types of geographical national identity are identified. The first – what we call the 'nationalisation of nature' – portrays particular landscapes as expressions of national authenticity. The second pattern – what we refer to as the 'naturalisation of the nation' – rests upon a notion of geographical determinism that depicts specific landscapes as forces capable of determining national identity. The authors offer two reasons why the second pattern came to prevail in the cases under consideration: (1) the affinity between wild landscape and the Romantic ideal of pure, rugged nature, and (2) a divergence between the nationalist ideal of ethnic homogeneity and the polyethnic composition of the two societies under consideration.
Variation in antibiotic treatment for diabetic patients with serious foot infections: A retrospective observational study
Background: Diabetic foot infections are common, serious, and diverse. There is uncertainty about optimal antibiotic treatment, and probably substantial variation in practice. Our aim was to document whether this is the case: a finding that would raise questions about the comparative cost-effectiveness of different regimens and also open the possibility of examining costs and outcomes to determine which should be preferred.

Methods: We used the Veterans Health Administration (VA) Diabetes Epidemiology Cohorts (DEpiC) database to conduct a retrospective observational study of hospitalized patients with diabetic foot infections. DEpiC contains computerized VA and Medicare patient-level data for VA patients with diabetes since 1998, including demographics, ICD-9-CM diagnostic codes, antibiotics prescribed, and VA facility. We identified all patients with ICD-9-CM codes for cellulitis/abscess of the foot and then sub-grouped them according to whether they had cellulitis/abscess plus codes for gangrene, osteomyelitis, skin ulcer, or none of these. For each facility, we determined: 1) the proportion of patients treated with an antibiotic and the initial route of administration; 2) the first antibiotic regimen prescribed for each patient, defined as treatment with the same antibiotic, or combination of antibiotics, for at least 5 continuous days; and 3) the antibacterial spectrum of the first regimen.

Results: We identified 3,792 patients with cellulitis/abscess of the foot either alone (16.4%), or with ulcer (32.6%), osteomyelitis (19.0%), or gangrene (32.0%). Antibiotics were prescribed for 98.9%. At least 5 continuous days of treatment with an unchanged regimen of one or more antibiotics was prescribed for 59.3%. The means and (ranges) across facilities of the three most common regimens were: 16.4% (22.8%); 15.7% (36.1%); and 10.8% (50.5%). The range of variation across facilities proved substantially greater than that across the different categories of foot infection. We found similar variation in the spectrum of the antibiotic regimen.

Conclusions: The large variations in regimen appear to reflect differences in facility practice styles rather than case mix. It is unlikely that all regimens are equally cost-effective. Our methods make possible evaluation of many regimens across many facilities, and can be applied in further studies to determine which antibiotic regimens should be preferred.