265 research outputs found

    Book Reviews


    Effects of Salinity Changes on the Photodegradation and Ultraviolet-Visible Absorbance of Terrestrial Dissolved Organic Matter

    We performed laboratory studies to determine the effects of salinity on the photodegradation of dissolved organic matter (DOM) from the Great Dismal Swamp, Virginia, an important source of terrestrial DOM to the lower Chesapeake Bay. Samples were created by mixing Great Dismal Swamp water (ionic strength approximately 0 mol L-1) with modified artificial seawater solutions of differing salinities while keeping the final dissolved organic carbon (DOC) concentration constant. These samples were then irradiated for 24 h in a light box providing ultraviolet (UV) light similar to that of natural sunlight. Light absorbance and DOC concentrations decreased after photoexposure, whereas dissolved inorganic carbon (DIC) concentrations increased. Variations in salinity affected both DIC production and UV absorption, with the higher salinity samples showing lower DIC production and less photobleaching. Addition of an iron chelator eliminated the relationship between photochemistry and salinity by reducing both photobleaching and DIC production at low salinities. As terrigenous DOM transits through an estuary, its photochemical reactivity and optical properties may change significantly as a function of salinity, probably as a result of changes in DOM conformation or changes in iron-DOM photochemistry, or both.
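The two-endmember mixing design described above can be sketched in a few lines: a fixed fraction of swamp water (the sole DOC source) is combined with DOC-free artificial seawater of varying salinity, so final DOC stays constant while final salinity varies. All concentrations below are illustrative values, not figures from the paper.

```python
# Hypothetical sketch of the mixing arithmetic: swamp water supplies all the
# DOC, artificial seawater supplies all the salt, so a fixed swamp fraction
# fixes DOC while the seawater endmember sets the salinity.

def mix(swamp_fraction, swamp_doc_mg_per_l, seawater_salinity):
    """Return (final_doc, final_salinity) for a two-endmember mix.

    swamp_fraction      -- volume fraction of swamp water (0..1)
    swamp_doc_mg_per_l  -- DOC concentration of the swamp endmember (made up)
    seawater_salinity   -- salinity of the artificial seawater endmember
    """
    final_doc = swamp_fraction * swamp_doc_mg_per_l            # seawater is DOC-free
    final_salinity = (1 - swamp_fraction) * seawater_salinity  # swamp water is ~fresh
    return final_doc, final_salinity

# Same swamp fraction, different seawater salinities: DOC constant, salinity varies.
for s in (0, 14, 28, 56):
    doc, sal = mix(0.5, 40.0, s)
    print(f"seawater salinity {s:>2} -> final DOC {doc:.1f} mg/L, final salinity {sal:.1f}")
```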

    The Power of Environmental Observatories for Advancing Multidisciplinary Research, Outreach, and Decision Support: The Case of the Minnesota River Basin

    An edited version of this paper was published by AGU. Copyright 2019 American Geophysical Union. Observatory‐scale data collection efforts allow unprecedented opportunities for integrative, multidisciplinary investigations in large, complex watersheds, which can affect management decisions and policy. Through the National Science Foundation‐funded REACH (REsilience under Accelerated CHange) project, in collaboration with the Intensively Managed Landscapes‐Critical Zone Observatory, we have collected a series of multidisciplinary data sets throughout the Minnesota River Basin in south‐central Minnesota, USA, a 43,400‐km2 tributary to the Upper Mississippi River. Postglacial incision within the Minnesota River valley created an erosional landscape highly responsive to hydrologic change, allowing for transdisciplinary research into the complex cascade of environmental changes that occur due to hydrology and land use alterations from intensive agricultural management and climate change. Data sets collected include water chemistry and biogeochemical data, geochemical fingerprinting of major sediment sources, high‐resolution monitoring of river bluff erosion, and repeat channel cross‐sectional and bathymetry data following major floods. The data collection efforts led to the development of a series of integrative reduced complexity models that provide deeper insight into how water, sediment, and nutrients route and transform through a large channel network and respond to change. These models represent the culmination of efforts to integrate interdisciplinary data sets and science to gain new insights into watershed‐scale processes in order to advance management and decision making. The purpose of this paper is to present a synthesis of the data sets and models, disseminate them to the community for further research, and identify mechanisms used to expand the temporal and spatial extent of short‐term observatory‐scale data collection efforts.


    End-to-end resource analysis for quantum interior point methods and portfolio optimization

    We study quantum interior point methods (QIPMs) for second-order cone programming (SOCP), guided by the example use case of portfolio optimization (PO). We provide a complete quantum circuit-level description of the algorithm from problem input to problem output, making several improvements to the implementation of the QIPM. We report the number of logical qubits and the quantity/depth of non-Clifford T-gates needed to run the algorithm, including constant factors. The resource counts we find depend on instance-specific parameters, such as the condition number of certain linear systems within the problem. To determine the size of these parameters, we perform numerical simulations of small PO instances, which lead to concrete resource estimates for the PO use case. Our numerical results do not probe large enough instance sizes to make conclusive statements about the asymptotic scaling of the algorithm. However, already at small instance sizes, our analysis suggests that, due primarily to large constant pre-factors, poorly conditioned linear systems, and a fundamental reliance on costly quantum state tomography, fundamental improvements to the QIPM are required for it to lead to practical quantum advantage. Comment: 38 pages, 15 figures
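The abstract's key instance-specific parameter is the condition number of the linear systems arising inside the interior point method. As a minimal illustration of the quantity being measured (not the paper's method or its Newton systems), this toy computes the 2-norm condition number of a 2x2 matrix in pure Python from the eigenvalues of A^T A.

```python
import math

# Illustrative only: the 2-norm condition number sigma_max/sigma_min of a
# 2x2 matrix, obtained from the eigenvalues of the Gram matrix A^T A.
# Large values mean the linear solve is ill-conditioned, which (per the
# abstract) drives up QIPM resource counts.

def cond2(a):
    (p, q), (r, s) = a
    t = p*p + q*q + r*r + s*s          # trace of A^T A
    d = (p*s - q*r) ** 2               # det of A^T A = det(A)^2
    disc = math.sqrt(max(t*t - 4*d, 0.0))
    lam_max = (t + disc) / 2
    lam_min = (t - disc) / 2
    return math.sqrt(lam_max / lam_min)

print(cond2([[1.0, 0.0], [0.0, 1e-3]]))  # 1000x spread between singular values
```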

    Grouping of UVCB substances with dose-response transcriptomics data from human cell-based assays

    The application of in vitro biological assays as new approach methodologies (NAMs) to support grouping of UVCB (unknown or variable composition, complex reaction products, and biological materials) substances has recently been demonstrated. In addition to cell-based phenotyping as NAMs, in vitro transcriptomic profiling is used to gain deeper mechanistic understanding of biological responses to chemicals and to support grouping and read-across. However, the value of gene expression profiling for characterizing complex substances like UVCBs has not been explored. Using 141 petroleum substance extracts, we performed dose-response transcriptomic profiling in human induced pluripotent stem cell (iPSC)-derived hepatocytes, cardiomyocytes, neurons, and endothelial cells, as well as cell lines MCF7 and A375. The goal was to determine whether transcriptomic data can be used to group these UVCBs and to further characterize the molecular basis for in vitro biological responses. We found distinct transcriptional responses for petroleum substances by manufacturing class. Pathway enrichment informed interpretation of effects of substances and UVCB petroleum-class. Transcriptional activity was strongly correlated with concentration of polycyclic aromatic compounds (PAC), especially in iPSC-derived hepatocytes. Supervised analysis using transcriptomics, alone or in combination with bioactivity data collected on these same substances/cells, suggests that transcriptomics data provide useful mechanistic information, but only modest additional value for grouping. Overall, these results further demonstrate the value of NAMs for grouping of UVCBs, identify informative cell lines, and provide data that could be used for justifying selection of substances for further testing that may be required for registration.
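Grouping substances by the similarity of their transcriptional responses can be sketched with a toy similarity calculation. This is not the study's pipeline; the substances, genes, and log2 fold-change values below are entirely made up to show the idea of correlation-based grouping.

```python
import math

# Toy sketch: substances whose transcriptional profiles correlate strongly
# fall into the same group. Profiles are hypothetical log2 fold changes
# over four genes; real studies use thousands of genes and dose-response fits.

profiles = {
    "substance_A": [2.1, 1.8, -0.2, 0.1],
    "substance_B": [1.9, 2.0, -0.1, 0.0],
    "substance_C": [-0.3, 0.2, 1.7, 2.2],
}

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r_ab = pearson(profiles["substance_A"], profiles["substance_B"])
r_ac = pearson(profiles["substance_A"], profiles["substance_C"])
print(f"A~B r={r_ab:.2f} (same group), A~C r={r_ac:.2f} (different group)")
```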

    Detecting facet joint and lateral mass injuries of the subaxial cervical spine in major trauma patients

    Study Design: Radiologic imaging measurement study. Purpose: To assess the accuracy of detecting lateral mass and facet joint injuries of the subaxial cervical spine on plain radiographs using computed tomography (CT) scan images as a reference standard; and the integrity of morphological landmarks of the lateral mass and facet joints of the subaxial cervical spine. Overview of Literature: Injuries of lateral mass and facet joints potentially lead to an unstable subaxial cervical spine and concomitant neurological sequelae. However, no study has evaluated the accuracy of detecting specific facet joint injuries. Methods: Eight spinal surgeons scored four sets of the same, randomly re-ordered, 30 cases with and without facet joint injuries of the subaxial cervical spine. Two surveys included conventional plain radiographs series (test) and another two surveys included CT scan images (reference). Facet joint injury characteristics were assessed for accuracy and reliability. Raw agreement, Fleiss kappa, Cohen's kappa and intraclass correlation coefficient statistics were used for reliability analysis. Majority rules were used for accuracy analysis. Results: Of the 21 facet joint injuries discerned on CT scan images, 10 were detected in both plain radiograph surveys (sensitivity, 0.48; 95% confidence interval [CI], 0.26-0.70). There were no false positive facet joint injuries in either of the first two X-ray surveys (specificity, 1.0; 95% CI, 0.63-1.0). Five of the 11 cases with missed injuries had an injury below the lowest visible articulating level on radiographs. CT scan images resulted in superior inter- and intra-rater agreement values for assessing morphologic injury characteristics of facet joint injuries. Conclusions: Plain radiographs are not accurate, nor reliable for the assessment of facet joint injuries of the subaxial cervical spine. 
CT scans offer reliable diagnostic information required for the detection and treatment planning of facet joint injuries.
Joost Johannes van Middendorp, Ian Cheung, Kristian Dalzell, Hamish Deverall, Brian J.C. Freeman, Stephen A.C. Morris, Simon J.I. Sandler, Richard Williams, Y.H. Yau, Ben Gos
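The headline accuracy figures above follow directly from the reported counts: 10 of 21 CT-confirmed injuries were detected on radiographs, with no false positives. The sketch below reproduces that arithmetic with a normal-approximation confidence interval; the study's reported intervals (0.26-0.70 and 0.63-1.0) likely come from an exact method, so the bounds here are only approximate.

```python
import math

# Sensitivity from the abstract's counts: 10 injuries detected out of 21
# present on CT. The Wald (normal-approximation) CI below is a sketch and
# differs slightly from an exact binomial interval, especially near 0 or 1.

def proportion_ci(k, n, z=1.96):
    """Point estimate and approximate 95% CI for a binomial proportion k/n."""
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(p - half, 0.0), min(p + half, 1.0)

sens, lo, hi = proportion_ci(10, 21)   # injuries detected / injuries present
print(f"sensitivity {sens:.2f} (approx. 95% CI {lo:.2f}-{hi:.2f})")
```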

    Animal welfare impacts of badger culling operations

    We are writing to express our extreme concern following recent media coverage1, 2 relating to the methodology being used by contractors to kill badgers under licence, as part of the government’s policy to control bovine TB in cattle. The coverage relates to the shooting of badgers that have been captured in live traps. Covert video footage (https://bit.ly/2Eud1iR) from Cumbria shows a trapped badger being shot with a firearm at close range, following which it appears to take close to a minute to stop moving. The contractor clearly observes the animal during this time but makes no attempt to expedite the death of the badger and prevent further suffering, as required by the current Natural England best practice guide which states: ‘Immediately after shooting, the animal should be checked to ensure it is dead, and if there is any doubt, a second shot must be taken as soon as possible.’3 The conversation between the contractor and his companion also suggests they were considering moving the badger to another site before finally bagging the carcase, again breaching the best practice guide. While the footage only relates to the experience of a single badger, and while the degree to which the badger was conscious in the period immediately following the shot is unclear, we can by no means be certain that the badger did not suffer. It also raises serious questions about the training, competence and behaviour of contractors, in relation to both badger welfare, and biosecurity.
This adds to existing concerns relating to the humaneness of ‘controlled shooting’ (targeting free-roaming badgers with rifles), which continues to be a permitted method under culling licences, in spite of the reservations expressed by both the government-commissioned Independent Expert Panel in its 2014 report,4 and the BVA, which concluded in 2015 that it ‘can no longer support the continued use of controlled shooting as part of the badger control policy’.5 (However, it has since continued to support the issuing of licences which permit the method.) The BVA has consistently indicated its support for what it calls the ‘tried and tested’ method of trapping and shooting, but has thus far failed to provide comprehensive and robust evidence for the humaneness of this method. During 2017, almost 20,000 badgers were killed under licence across 19 cull zones, around 60 per cent of which were killed by controlled shooting, the remainder being trapped and shot.6 Natural England reported that its monitors observed 74 (just over 0.6 per cent) of controlled shooting events for accuracy and humaneness. No information has been provided on the extent to which trapping and shooting activities were monitored. This raises serious concerns about the extent of suffering that might be experienced by very large numbers of animals, for which contractors are not being held to account. If contractors reach their maximum culling targets set by Natural England for 2018, as many as 41,000 additional badgers could be killed.7 The extent to which these animals will suffer is once again being left in the hands of contractors, with woefully inadequate oversight, and in the face of anecdotal evidence of breaches of best practice guidance.
This situation is clearly unacceptable from an animal welfare perspective and it is our view that by endorsing the policy, the BVA is contradicting the principles contained within its own animal welfare strategy.8 We therefore urge the BVA to withdraw its support for any further licensed badger culling, and the RCVS to make it clear that any veterinarian who provides support for culling activities that result in unnecessary and avoidable animal suffering could face disciplinary proceedings. The veterinary profession has no business supporting this licensed mass killing with all its inherent negative welfare and biosecurity implications, and for which the disease control benefits are, at best, extremely uncertain. We believe the continued support for the culls by veterinary bodies in the face of poor evidence for their efficacy damages the credibility of the profession, and that same support in the face of potential animal suffering on a large scale undermines its reputation. We stand ready to discuss these issues in more detail.

    Geomagnetically Induced Currents and Harmonic Distortion: High time Resolution Case Studies

    High time resolution (1‐5 s) magnetometer, geomagnetically induced current (GIC), and mains harmonic distortion data from the Halfway Bush substation in Dunedin, New Zealand, are analyzed. A recently developed technique using VLF radio wave data provides high resolution measurements of mains harmonic distortion levels. Three case studies are investigated, each involving high rates of change of local geomagnetic field, but with different timescales of magnetospheric driver mechanisms, and different substation transformer configurations. Two cases of enhanced GIC during substorm events are analyzed, and one case of a storm sudden commencement. Time delays between magnetic field fluctuations and induced transformer currents are found to be ~100 s for substorm events, but only ~20 s for the storm sudden commencement containing higher frequency variations. Boxcar averaging of the magnetic field fluctuations using running windows of ± 2 minutes leads to spectral power profiles similar to those of GIC profiles, with reduced power at frequencies >0.003 Hz (periods <5 minutes). This low frequency component of the magnetic field power spectrum appears necessary for mains harmonic distortion to occur.
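The boxcar averaging step described above is just a running mean: at 1 s sampling, a ±2 minute window spans 241 samples and suppresses variations with periods much shorter than the window. The signal below is synthetic (a slow 10-minute oscillation plus a fast 30-second one), not the paper's magnetometer data.

```python
import math

# Sketch of boxcar (running-mean) smoothing with a +/-2 minute window at
# 1 s cadence. The fast 30 s component averages out over the 241-sample
# window; the slow 10-minute component mostly survives.

def boxcar(series, half_width):
    """Running mean over [i - half_width, i + half_width], clipped at the edges."""
    out = []
    for i in range(len(series)):
        lo = max(0, i - half_width)
        hi = min(len(series), i + half_width + 1)
        window = series[lo:hi]
        out.append(sum(window) / len(window))
    return out

# Synthetic field: 20 minutes at 1 s cadence.
signal = [math.sin(2 * math.pi * s / 600) + 0.5 * math.sin(2 * math.pi * s / 30)
          for s in range(1200)]
smoothed = boxcar(signal, 120)  # +/-2 min window
```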

    Accrediting outputs of noisy intermediate-scale quantum computing devices

    We present an accreditation protocol for the outputs of noisy intermediate-scale quantum devices. By testing entire circuits rather than individual gates, our accreditation protocol can provide an upper bound on the variation distance between the noisy and noiseless probability distributions of the outputs of the target circuit of interest. Our accreditation protocol requires implementation of quantum circuits no larger than the target circuit, so it is practical in the near term and scalable in the long term. Inspired by trap-based protocols for the verification of quantum computations, our accreditation protocol assumes that noise in single-qubit gates is bounded (but potentially gate-dependent) in diamond norm. We allow for arbitrary spatial and temporal correlations in the noise affecting state preparation, measurements and two-qubit gates. We describe how to implement our protocol on real-world devices, and we also present a novel cryptographic protocol (which we call `mesothetic' protocol) inspired by our accreditation protocol. Comment: Accepted version
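The quantity the protocol bounds is the total variation distance between noisy and ideal output distributions. As a minimal illustration of that quantity only (the accreditation procedure itself is not reproduced here), the sketch below compares two hypothetical bitstring distributions.

```python
# Total variation distance between two output distributions over bitstrings:
# TV(p, q) = (1/2) * sum_k |p(k) - q(k)|. The distributions below are
# hypothetical examples, not outputs of the accreditation protocol.

def total_variation(p, q):
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

ideal = {"00": 0.5, "11": 0.5}                                  # noiseless outputs
noisy = {"00": 0.46, "11": 0.44, "01": 0.06, "10": 0.04}        # with readout errors

print(total_variation(ideal, noisy))
```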