
    Influences of major storm events on backbarrier salt marsh change: Masonboro Island, Southeastern North Carolina

    The purpose of this study was to use a Geographic Information System to investigate how storm events affect backbarrier salt marshes on Masonboro Island, North Carolina. Aerial photographs from 1938 to 2002 were examined, chosen specifically to capture the condition of the island and marshes before and/or after major storm events. The photographs were manually digitized and categorized by degree of marsh fragmentation. The areas of total marsh, high-, medium-, and low-fragmentation marsh, and total island/sound polygons were calculated for each year in the study and compared as percentages. Change detection was performed for each time period and for long-term change. Results showed that storm overwash did influence marsh change, although most changes were not statistically significant. Gains in marsh area accompanied overwash through infilling of open-water areas and recolonization of washover deposits. Losses in marsh area were due to erosion by storms and inlet dynamics, burial, and degradation of marsh from a lack of inorganic sediment. Overall, the changes that occurred appear to be the natural responses of backbarrier marshes behind an undeveloped barrier island impacted by sea-level rise and storm events.
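    A minimal sketch of the area-percentage and change-detection step described above, assuming the digitized polygons for each survey year sit in shapefiles with a fragmentation-class attribute; the file names, field name, and projection below are placeholders, not details from the study:

```python
# Hypothetical sketch: per-class area percentages for two survey years and the
# change between them, assuming digitized polygons carry a "class" attribute
# (e.g. high/medium/low fragmentation, island, sound).
import geopandas as gpd

def class_percentages(path):
    """Return each class's share (%) of the total mapped area for one year."""
    gdf = gpd.read_file(path).to_crs(epsg=32618)  # project to a metric CRS so areas are in m^2
    areas = gdf.dissolve(by="class").geometry.area
    return areas / areas.sum() * 100.0

# File names are placeholders, not from the study.
pct_1938 = class_percentages("marsh_1938.shp")
pct_2002 = class_percentages("marsh_2002.shp")

# Simple change detection: percentage-point difference per class,
# treating a class absent in one year as 0% of the mapped area.
change = pct_2002.subtract(pct_1938, fill_value=0.0)
print(change.sort_values())
```

    Dissolving by class before measuring area mirrors the study's comparison of class totals as percentages of the mapped area, so gains and losses appear as percentage-point shifts between years.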

    Composition of the early Oligocene ocean from coral stable isotope and elemental chemistry

    A sectioned and polished specimen of the coral Archohelia vicksburgensis from the early Oligocene Byram Formation (∼30 Ma) near Vicksburg, Mississippi, reveals 12 prominent annual growth bands. Stable oxygen isotopic compositions of 77 growth-band-parallel microsamples of original aragonite exhibit well-constrained fluctuations that range between −2.0 and −4.8‰. Variation in the δ¹⁸O of coral carbonate reflects seasonal variation in temperature ranging from 12 to 24 °C about a mean of 18 °C. These values are consistent with those derived from a bivalve and a fish otolith from the same unit, each using independently derived palaeotemperature equations. Mg/Ca and Sr/Ca ratios were determined for 40 additional samples spanning five of the 12 annual bands. Palaeotemperatures calculated using elemental-ratio thermometers calibrated on modern corals are consistently lower; the mean temperature from Mg/Ca ratios is 12.5 ± 1 °C, while that from Sr/Ca is 5.8 ± 2.2 °C. Assuming that the δ¹⁸O-derived temperatures are correct, relationships between temperature and elemental ratio for corals growing in today's ocean can be used to estimate Oligocene palaeoseawater Mg/Ca and Sr/Ca ratios. Calculations indicate that early Oligocene seawater Mg/Ca was ∼81% (4.2 mol mol⁻¹) and Sr/Ca ∼109% (9.9 mmol mol⁻¹) of modern values. Oligocene seawater with this degree of Mg depletion and Sr enrichment is in good agreement with that expected during the Palaeogene transition from ‘calcite’ to ‘aragonite’ seas. The lower Oligocene Mg/Ca probably reflects a decrease toward the present day in sea-floor hydrothermal activity and a concomitant decrease in the scavenging of magnesium from seawater. The elevated Sr/Ca ratio may record lesser amounts of Oligocene aragonite precipitation and a correspondingly lower flux of strontium into the sedimentary carbonate reservoir than today.
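    A hedged sketch of the back-calculation logic: if a modern Mg/Ca thermometer is assumed to scale proportionally with the seawater Mg/Ca ratio, the palaeoseawater ratio can be solved for by forcing the elemental-ratio temperature to agree with the δ¹⁸O temperature. The calibration coefficients below are placeholders rather than the calibration used in the paper; only the approximate modern seawater Mg/Ca (~5.2 mol mol⁻¹) and the 12.5 °C versus 18 °C mismatch are taken from general knowledge and the abstract:

```python
# Illustrative sketch (placeholder calibration, not the paper's): infer
# palaeoseawater Mg/Ca by assuming the coral ratio scales linearly with the
# seawater ratio at a given temperature.
MODERN_SW_MGCA = 5.2          # mol/mol, approximate modern seawater Mg/Ca
A, B = 1.0, 0.15              # placeholder linear calibration: coral Mg/Ca = A + B * T(degC)

def coral_mgca_expected(temp_c, sw_mgca=MODERN_SW_MGCA):
    """Coral Mg/Ca predicted by the placeholder modern calibration,
    scaled by seawater Mg/Ca relative to today."""
    return (A + B * temp_c) * (sw_mgca / MODERN_SW_MGCA)

def infer_seawater_mgca(coral_mgca_measured, temp_d18o):
    """Seawater Mg/Ca required for the measured coral ratio to agree with
    the delta-18-O temperature, assuming proportional scaling."""
    return MODERN_SW_MGCA * coral_mgca_measured / coral_mgca_expected(temp_d18o)

# Made-up example: a sample whose elemental ratio reads "too cold" (12.5 degC)
# under the modern calibration, while delta-18-O says 18 degC.
measured = coral_mgca_expected(12.5)
print(infer_seawater_mgca(measured, 18.0))  # a lower-than-modern seawater Mg/Ca, as in the study
```

    The same reasoning, with a Sr/Ca thermometer that reads anomalously cold, pushes the inferred seawater Sr/Ca above the modern value.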

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
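    A small illustrative sketch of how a "posterior probability of harm" is read off posterior draws of an odds ratio. The normal posterior below is a stand-in chosen to roughly mirror the reported ARB result, not the trial's actual covariate-adjusted bayesian cumulative logistic model:

```python
# Illustrative only: summarise a treatment effect from posterior draws of log(OR).
import numpy as np

rng = np.random.default_rng(0)
# Placeholder posterior for log(OR), centred a little below zero (OR around 0.76).
log_or_draws = rng.normal(loc=np.log(0.76), scale=0.16, size=100_000)

or_median = np.exp(np.median(log_or_draws))
cri_95 = np.exp(np.percentile(log_or_draws, [2.5, 97.5]))   # 95% credible interval
p_harm = np.mean(log_or_draws < 0.0)                        # P(OR < 1), i.e. worse outcomes

print(f"median OR {or_median:.2f}, 95% CrI {cri_95[0]:.2f}-{cri_95[1]:.2f}, P(harm) {p_harm:.1%}")
```

    With the posterior mass centred near OR 0.76 and this spread, roughly 95% of draws fall below OR 1, which is how statements such as "the posterior probability that the intervention worsened organ support–free days was 95.4%" arise.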

    The earliest evidence for anatomically modern humans in northwestern Europe

    The earliest anatomically modern humans in Europe are thought to have appeared around 43,000–42,000 calendar years before present (43–42 kyr cal BP), by association with Aurignacian sites and lithic assemblages assumed to have been made by modern humans rather than by Neanderthals. However, the actual physical evidence for modern humans is extremely rare, and direct dates reach no farther back than about 41–39 kyr cal BP, leaving a gap. Here we show, using stratigraphic, chronological and archaeological data, that a fragment of human maxilla from the Kent’s Cavern site, UK, dates to the earlier period. The maxilla (KC4), which was excavated in 1927, was initially diagnosed as Upper Palaeolithic modern human [1]. In 1989, it was directly radiocarbon dated by accelerator mass spectrometry to 36.4–34.7 kyr cal BP [2]. Using a Bayesian analysis of new ultrafiltered bone collagen dates in an ordered stratigraphic sequence at the site, we show that this date is a considerable underestimate. Instead, KC4 dates to 44.2–41.5 kyr cal BP. This makes it older than any other equivalently dated modern human specimen and directly contemporary with the latest European Neanderthals, thus making its taxonomic attribution crucial. We also show that in 13 dental traits KC4 possesses modern human rather than Neanderthal characteristics; three other traits show Neanderthal affinities and a further seven are ambiguous. KC4 therefore represents the oldest known anatomically modern human fossil in northwestern Europe, fills a key gap between the earliest dated Aurignacian remains and the earliest human skeletal remains, and demonstrates the wide and rapid dispersal of early modern humans across Europe more than 40 kyr ago.
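    A toy sketch of why the ordered stratigraphic prior matters in this kind of Bayesian dating model: constraining each layer to be older than the one above it trims the tails of the individual age estimates. Real analyses (for example in OxCal) also fold in the radiocarbon calibration curve and the actual ultrafiltered dates, both omitted here; every number below is illustrative:

```python
# Toy demonstration of a stratigraphic ordering constraint on uncertain ages.
import numpy as np

rng = np.random.default_rng(1)
# Unmodelled age estimates (kyr cal BP) for three stratified samples,
# listed from the stratigraphically youngest layer to the oldest.
means = np.array([41.0, 42.5, 43.5])
sds = np.array([1.5, 1.5, 1.5])

draws = rng.normal(means, sds, size=(500_000, 3))
# Keep only draws consistent with the stratigraphy: each layer older than the one above.
ordered = draws[(draws[:, 0] < draws[:, 1]) & (draws[:, 1] < draws[:, 2])]

for i, label in enumerate(["upper layer", "middle layer", "lower layer"]):
    lo, hi = np.percentile(ordered[:, i], [2.5, 97.5])
    raw_lo, raw_hi = means[i] - 1.96 * sds[i], means[i] + 1.96 * sds[i]
    print(f"{label}: {lo:.1f}-{hi:.1f} kyr cal BP (unmodelled {raw_lo:.1f}-{raw_hi:.1f})")
```

    The modelled intervals come out narrower than the unmodelled ones, which is the mechanism by which the sequence model can shift and tighten the age attributed to a single horizon.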

    Pre-extinction Demographic Stability and Genomic Signatures of Adaptation in the Woolly Rhinoceros

    Ancient DNA has significantly improved our understanding of the evolution and population history of extinct megafauna. However, few studies have used complete ancient genomes to examine species responses to climate change prior to extinction. The woolly rhinoceros (Coelodonta antiquitatis) was a cold-adapted megaherbivore widely distributed across northern Eurasia during the Late Pleistocene and became extinct approximately 14 thousand years before present (ka BP). While humans and climate change have been proposed as potential causes of extinction [1-3], knowledge is limited on how the woolly rhinoceros was impacted by human arrival and climatic fluctuations [2]. Here, we use one complete nuclear genome and 14 mitogenomes to investigate the demographic history of woolly rhinoceros leading up to its extinction. Unlike other northern megafauna, the effective population size of woolly rhinoceros likely increased at 29.7 ka BP and subsequently remained stable until close to the species’ extinction. Analysis of the nuclear genome from a ∼18.5-ka-old specimen did not indicate any increased inbreeding or reduced genetic diversity, suggesting that the population size remained steady for more than 13 ka following the arrival of humans [4]. The population contraction leading to extinction of the woolly rhinoceros may have thus been sudden and mostly driven by rapid warming in the Bølling-Allerød interstadial. Furthermore, we identify woolly rhinoceros-specific adaptations to arctic climate, similar to those of the woolly mammoth. This study highlights how species respond differently to climatic fluctuations and further illustrates the potential of palaeogenomics to study the evolutionary history of extinct species.
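    An illustrative sketch, not the paper's pipeline, of the kind of summary behind the "no increased inbreeding or reduced diversity" statement: heterozygosity counted in genomic windows, with near-empty windows flagged as candidate runs of homozygosity. The toy genotype data and thresholds below are assumptions made purely for illustration:

```python
# Toy windowed-heterozygosity scan on simulated genotype positions.
import numpy as np

WINDOW = 1_000_000  # 1 Mb windows

def windowed_het(positions, is_het, chrom_length):
    """Heterozygous sites per Mb in consecutive windows along one scaffold."""
    edges = np.arange(0, chrom_length + WINDOW, WINDOW)
    counts, _ = np.histogram(positions[is_het], bins=edges)
    return counts / (WINDOW / 1e6)

# Toy data: a 30 Mb scaffold with uniformly scattered variant sites.
rng = np.random.default_rng(2)
pos = np.sort(rng.integers(0, 30_000_000, size=60_000))
het = rng.random(pos.size) < 0.5   # roughly half the sites heterozygous

het_per_mb = windowed_het(pos, het, 30_000_000)
roh_like = het_per_mb < 0.1 * het_per_mb.mean()   # windows nearly devoid of heterozygosity
print(f"mean het/Mb {het_per_mb.mean():.0f}; ROH-like windows: {roh_like.sum()}")
```

    In a genome from an inbred or diversity-depleted population, many long windows would fall below such a threshold; the abstract reports no such signal in the ∼18.5-ka-old specimen.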