16 research outputs found

    Coastal sedimentation across North America doubled in the 20th century despite river dams

    The proliferation of dams since 1950 promoted sediment deposition in reservoirs, which is thought to be starving the coast of sediment and decreasing the resilience of communities to storms and sea-level rise. Diminished river loads measured upstream from the coast, however, should not be assumed to propagate seaward. Here, we show that century-long records of sediment mass accumulation rates (g cm−2 yr−1) and sediment accumulation rates (cm yr−1) more than doubled after 1950 in coastal depocenters around North America. Sediment sources downstream of dams compensate for the river sediment lost to impoundments. Sediment is accumulating in coastal depocenters at a rate that matches or exceeds relative sea-level rise, apart from rapidly subsiding Texas and Louisiana, where water depths are increasing and intertidal areas are disappearing. Assuming no feedbacks, accelerating global sea-level rise will eventually surpass current sediment accumulation rates, underscoring the need to include coastal-sediment management in habitat-restoration projects.
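The comparison at the heart of this abstract, whether accumulating sediment keeps pace with relative sea-level rise, reduces to a unit conversion and an inequality. A minimal sketch, with all numerical values illustrative (not taken from the paper) and the standard conversion SAR = MAR / dry bulk density assumed:

```python
# Hedged sketch: relate mass accumulation rate (MAR, g cm^-2 yr^-1) to
# sediment accumulation rate (SAR, cm yr^-1) via dry bulk density, then
# compare SAR to relative sea-level rise (RSLR). All numbers illustrative.

def sar_from_mar(mar_g_cm2_yr, dry_bulk_density_g_cm3):
    """SAR = MAR / dry bulk density (cm yr^-1)."""
    return mar_g_cm2_yr / dry_bulk_density_g_cm3

def keeping_pace(sar_cm_yr, rslr_cm_yr):
    """True if sediment accumulation matches or exceeds RSLR."""
    return sar_cm_yr >= rslr_cm_yr

# Illustrative values: MAR of 0.5 g cm^-2 yr^-1, dry bulk density of
# 1.0 g cm^-3, RSLR of 0.3 cm yr^-1.
sar = sar_from_mar(0.5, 1.0)
print(sar, keeping_pace(sar, 0.3))  # 0.5 True
```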

    Anthropogenic impacts on tidal creek sedimentation since 1900

    Land cover and land use around the margins of estuaries have shifted since 1950 at many sites in North America due to development pressures from higher population densities. Small coastal watersheds are ubiquitous along estuarine margins, and most of this coastal land-cover change occurred in these tidal-creek watersheds. A change in land cover could modify the contribution of sediments from tidal-creek watersheds to downstream areas and affect estuarine habitats that rely on sediments to persist or are adversely impacted by sediment loading. The resilience of wetlands to accelerating relative sea-level rise depends, in part, on the supply of lithogenic sediment to support accretion and maintain elevation; subtidal habitats such as oyster reefs and seagrass beds, however, are stressed under conditions of high turbidity and sedimentation. Here we compare sediment accumulation rates before and after 1950 using 210Pb in 12 tidal creeks across two distinct regions in North Carolina: one region of low-relief tidal-creek watersheds where land-cover change since 1959 was dominated by fluctuations in forest, silviculture, and agriculture, and another region of relatively high-relief tidal-creek watersheds where land-use change was dominated by increasing suburban development. At eight of the creeks, mass accumulation rates (g cm−2 yr−1) measured at the outlet of the creeks increased contemporaneously with the largest shift in land cover, within the resolution of the land-cover data set (~5 years). All but two creek sites experienced a doubling or more in sediment accumulation rates (cm yr−1) after 1950, and most sites experienced sediment accumulation rates that exceeded the rate of local relative sea-level rise, suggesting that there is an excess of sediment being delivered to these tidal creeks and that they may slowly be infilling.
    After 1950, land cover within one creek watershed changed little, as did mass accumulation rates at the coring location, and another creek coring site did not record an increase in mass accumulation rates at the creek outlet despite a massive increase in development in the watershed that included the construction of retention ponds. These abundant tidal-creek watersheds have little relief, area, and flow, but they are impacted by changes in land cover more, in terms of percent area, than their larger riverine counterparts, and downstream areas are highly connected to their associated watersheds. This work expands the scientific understanding of connectivity between lower-coastal-plain watersheds and estuaries and provides important information for coastal zone managers seeking to balance development pressures and environmental protections.
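The 210Pb dating underlying these rate comparisons rests on the isotope's radioactive decay (half-life 22.3 yr). A minimal sketch of the constant-initial-concentration (CIC) form of the method, one of several standard 210Pb models and assumed here for illustration (the paper does not state which model it uses; activity values are invented):

```python
import math

PB210_HALF_LIFE_YR = 22.3
LAMBDA = math.log(2) / PB210_HALF_LIFE_YR  # decay constant, yr^-1

def cic_age(activity_surface, activity_at_depth):
    """Age of a sediment layer under the CIC model: t = ln(A0/Az) / lambda."""
    return math.log(activity_surface / activity_at_depth) / LAMBDA

def accumulation_rate(depth_cm, activity_surface, activity_at_depth):
    """Mean sediment accumulation rate (cm yr^-1) down to depth_cm."""
    return depth_cm / cic_age(activity_surface, activity_at_depth)

# Illustrative profile: excess 210Pb activity falls from 10 to 2.5 dpm g^-1
# at 20 cm depth -- two half-lives, so the layer is ~44.6 years old.
age = cic_age(10.0, 2.5)
rate = accumulation_rate(20.0, 10.0, 2.5)
```

A profile like this, with the 1950s horizon located by the computed ages, is what allows pre- and post-1950 rates to be compared within a single core.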

    Gigahertz optoacoustic imaging for cellular imaging

    Photoacoustic imaging exploits contrast mechanisms that depend on optical and thermomechanical properties of optical absorbers. The photoacoustic signal bandwidth is dictated by the absorber size and the laser pulse width. In this work we demonstrate that photoacoustic signals can be detected from micron and sub-micron particles. We anticipate applications to include cellular imaging with nanometer-sized contrast agents such as gold nanoshells, nanorods, and nanocages. An existing acoustic microscopy system was used (the SASAM 1000, kibero GmbH). This platform is built on an Olympus IX81 optical microscope with a rotating column that has an optical condenser for transmission optical microscopy and an acoustic module for the acoustic microscopy. The adapted optoacoustic module consists of a Q-switched Nd:YAG solid-state laser (Teem Photonics, France) generating sub-nanosecond pulses. Scans were acquired of microparticles (1 Όm black toner particles) and cells. The confocal arrangement allowed high signal-to-noise-ratio photoacoustic signals (>30 dB) to be detected at approximately 400 MHz. The particles of various sizes produced signals of different frequency content. In imaging mode, the full width at half maximum (FWHM) was measured to be 3.6 Όm for the 400 MHz transducer, which is in general agreement with theory for a 0.3 NA objective (4.3 Όm). Moreover, images of single melanoma cells were generated using the endogenous contrast from intracellular melanin.
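The theoretical resolution figure quoted above follows from the acoustic wavelength at the detection frequency and the numerical aperture. A rough sketch of that arithmetic, assuming a speed of sound of 1500 m/s in water and a confocal FWHM prefactor of ~0.37 (the paper's 4.3 Όm figure implies a similar but not identical constant, so both assumptions are for illustration only):

```python
# Hedged sketch of a diffraction-limited lateral-resolution estimate for a
# confocal acoustic objective. The prefactor and sound speed are assumptions.

def acoustic_wavelength_um(freq_hz, c_m_s=1500.0):
    """Acoustic wavelength in micrometres for a given frequency in water."""
    return c_m_s / freq_hz * 1e6

def confocal_fwhm_um(freq_hz, na, prefactor=0.37):
    """Lateral FWHM ~ prefactor * wavelength / NA (confocal approximation)."""
    return prefactor * acoustic_wavelength_um(freq_hz) / na

# At 400 MHz the wavelength in water is 3.75 um; with NA = 0.3 the estimated
# FWHM is ~4.6 um, the same order as the paper's theoretical 4.3 um.
wavelength = acoustic_wavelength_um(400e6)
fwhm = confocal_fwhm_um(400e6, 0.3)
```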

    Testing the ability of the ExoMars 2018 payload to document geological context and potential habitability on Mars

    The future ExoMars rover mission (ESA/Roscosmos), to be launched in 2018, will investigate the habitability of the Martian surface and near subsurface, and search for traces of past life in the form of textural biosignatures and organic molecules. In support of this mission, a selection of relevant Mars analogue materials has been characterised and stored in the International Space Analogue Rockstore (ISAR), hosted in OrlĂ©ans, France. Two ISAR samples were analysed by prototypes of the ExoMars rover instruments used for petrographic study. The objective was to determine whether a full interpretation of the rocks could be achieved on the basis of the data obtained by the ExoMars visible-IR imager and spectrometer (MicrOmega), the close-up imager (CLUPI), the drill infrared spectrometer (Ma_Miss) and the Raman spectrometer (RLS), first separately and then in their entirety. In order not to influence the initial instrumental interpretation, the samples were sent to the different teams without any additional information. This first step was called the "Blind Test" phase. The data obtained by the instruments were then complemented with photography of the relevant outcrops (as would be available during the ExoMars mission) before being presented to two geologists tasked with the interpretation. The context data and photography of the outcrops and of the samples were sufficient for the geologists to identify the rocks. This initial identification was crucial for the subsequent, iterative interpretation of the spectroscopic data. The data from the different spectrometers were thus cross-calibrated against the photographic interpretations and against each other. In this way, important mineralogical details, such as evidence of aqueous alteration of the rocks, provided relevant information concerning potential habitable conditions.
    The final conclusion from this test is that, when processed together, the ExoMars payload instruments produce complementary data allowing reliable interpretation of the geological context and potential for habitable environments. This background information is fundamental for the analysis and interpretation of organics in the processed Martian rocks.

    Towards Modeling and Model Checking Fault-Tolerant Distributed Algorithms

    Fault-tolerant distributed algorithms are central for building reliable, spatially distributed systems. To ensure that these algorithms actually make systems more reliable, we must ensure that they are actually correct. Unfortunately, model checking state-of-the-art fault-tolerant distributed algorithms (such as Paxos) is currently out of reach except for very small systems. To eventually be able to verify such fault-tolerant distributed algorithms automatically in larger systems as well, several problems have to be addressed. In this paper, we consider modeling and verification of fault-tolerant algorithms that essentially contain only threshold guards to control the flow of the algorithm. As threshold guards are widely used in fault-tolerant distributed algorithms (and also in Paxos), efficient methods to handle them bring us closer to the above-mentioned goal. As a case study we use the reliable broadcasting algorithm by Srikanth and Toueg, which tolerates even Byzantine faults. We show how one can model this basic fault-tolerant distributed algorithm in Promela such that safety and liveness properties can be efficiently verified in Spin. We also provide experimental data for other distributed algorithms.
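The threshold-guard pattern the abstract refers to can be sketched as a small round-based simulation. This is an illustrative Python model, not the paper's Promela model: faulty processes are simply silent rather than Byzantine, and the t+1 / n−t guards follow the standard Srikanth–Toueg echo scheme (echo on receiving the initial message or t+1 echoes; accept on n−t echoes).

```python
# Hedged sketch of threshold-guarded reliable broadcast (Srikanth-Toueg
# style). n processes, up to t faulty (modeled as silent here). A correct
# process sends ECHO once it got the init message or >= t+1 ECHOs, and
# accepts once >= n-t ECHOs have been sent.

def simulate(n, t, initial_receivers, faulty):
    """Run synchronous rounds until stable; return the set of accepting
    (correct) processes."""
    sent_echo = {p for p in initial_receivers if p not in faulty}
    accepted = set()
    while True:
        echoes = len(sent_echo)
        new_senders = {p for p in range(n)
                       if p not in faulty and p not in sent_echo
                       and echoes >= t + 1}          # threshold guard 1
        new_accepts = {p for p in range(n)
                       if p not in faulty and p not in accepted
                       and echoes >= n - t}          # threshold guard 2
        if not new_senders and not new_accepts:
            return accepted
        sent_echo |= new_senders
        accepted |= new_accepts

# n=4, t=1, processes 0 and 1 got the init, process 3 is faulty: the two
# echoes reach t+1=2, so process 2 echoes too; 3 >= n-t, so 0,1,2 accept.
result = simulate(4, 1, {0, 1}, {3})
```

Note how correctness hinges only on the counters crossing thresholds, not on which specific processes sent messages; this is exactly the structure that makes such algorithms amenable to the counter-based model checking the paper pursues.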

    Selection in backcross programmes

    Backcrossing is a well-known and long-established breeding scheme in which a characteristic is introgressed from a donor parent into the genomic background of a recurrent parent. The various uses of backcrossing in modern genetics, particularly with the help of molecular markers, are reviewed here. Selection in backcross programmes is used either to improve the genetic value of plant and animal populations or to fine-map quantitative trait loci. Both cases are helpful in our understanding of the genetic bases of quantitative trait variation.
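The expected recovery of the recurrent parent's genome under repeated backcrossing follows a simple halving rule. A minimal sketch of this standard result (it is textbook genetics, not a formula quoted from this review): after the k-th backcross the expected recurrent-parent fraction is 1 − (1/2)^(k+1).

```python
# Hedged sketch: expected recurrent-parent genome fraction after the k-th
# backcross generation (BCk), assuming unlinked loci and no selection.

def recurrent_parent_fraction(k):
    """E[fraction] = 1 - (1/2)^(k+1); the F1 (k=0) carries 50%."""
    return 1 - 0.5 ** (k + 1)

# BC1 -> 0.75, BC2 -> 0.875, BC3 -> 0.9375, ...
for k in range(1, 4):
    print(f"BC{k}: {recurrent_parent_fraction(k):.4f}")
```

Marker-assisted selection, as discussed in the review, accelerates this recovery by choosing backcross progeny whose realized recurrent-parent fraction exceeds this expectation.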