
    Modern space/time geostatistics using river distances: theory and applications for water quality mapping

    The Clean Water Act requires that state and local agencies assess all river miles for potential impairments. However, due to the large number of river miles to be assessed, as well as budget and resource limitations, many states cannot feasibly meet this requirement. There is therefore a need for a framework that can accurately assess water quality at unmonitored locations using limited data resources. Many researchers employ geostatistical techniques such as kriging and Bayesian Maximum Entropy (BME) to interpolate values in areas where no data exist. These techniques rely on the spatial and/or temporal autocorrelation between existing data points to estimate values at unmonitored locations. This autocorrelation is traditionally modeled as a function of the Euclidean distance between data points; however, Euclidean distance does not account for the fact that many water quality variables are spatially correlated through the hydrogeography of the system. The focus of this work is the development of a space/time geostatistical framework for estimating and mapping water quality along river networks using river distances instead of the traditional Euclidean distance. The Bayesian Maximum Entropy method of modern space/time geostatistics is modified and extended to incorporate river distances and thereby improve the estimation of basin-wide water quality. This new framework, termed river-BME, uses geostatistical models that combine permissible covariance functions based on river distance with secondary information. Factors such as network complexity are explored to determine the efficacy of river-BME for water quality estimation. Additionally, simulation experiments and three real-world case studies provide a broad application of this framework across a variety of basins and water quality parameters, including dissolved oxygen, Escherichia coli, and fish tissue mercury. Results show that river-BME produces estimates of water quality at unmonitored locations that are more than 30% more accurate than those of traditional Euclidean-based methods. Overall, this work provides a new tool for applying modern space/time geostatistics using river distances. It has the potential not only to aid future researchers but also to provide environmental managers with the information necessary to better allocate resources and protect ecological and human health.
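
    As a rough illustration of the central idea, the sketch below evaluates an exponential covariance model (one of the classical families for which permissibility with stream distances has been studied) on hypothetical river distances versus straight-line distances between three monitoring sites; the paper's actual covariance models, parameters, and distances are not reproduced here.

```python
import numpy as np

def exponential_covariance(d, sill, practical_range):
    """Exponential covariance on a matrix of pairwise distances d.

    Uses the 'practical range' convention: correlation decays to ~5%
    of the sill at d = practical_range.
    """
    return sill * np.exp(-3.0 * d / practical_range)

# Hypothetical pairwise distances (km) between three monitoring sites:
# along the channel (river distance) versus straight-line (Euclidean).
river_dist = np.array([[0.0, 4.2, 9.7],
                       [4.2, 0.0, 5.5],
                       [9.7, 5.5, 0.0]])
euclid_dist = np.array([[0.0, 2.1, 3.0],
                        [2.1, 0.0, 2.4],
                        [3.0, 2.4, 0.0]])

C_river = exponential_covariance(river_dist, sill=1.0, practical_range=10.0)
C_euclid = exponential_covariance(euclid_dist, sill=1.0, practical_range=10.0)

# Euclidean distance overstates the similarity of sites separated by
# meanders or confluences, which is the motivation for river-BME.
print(np.round(C_river, 3))
print(np.round(C_euclid, 3))
```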

    Modern Space/Time Geostatistics Using River Distances: Data Integration of Turbidity and E. coli Measurements to Assess Fecal Contamination Along the Raritan River in New Jersey

    Escherichia coli (E. coli) is a widely used indicator of fecal contamination in water bodies. Contact with, and subsequent ingestion of, bacteria from fecal contamination can lead to harmful health effects. Since E. coli data are sometimes limited, the objective of this study is to use secondary information in the form of turbidity to improve the assessment of E. coli at unmonitored locations. We obtained all E. coli and turbidity monitoring data available from existing monitoring networks for the 2000–2006 period for the Raritan River Basin, New Jersey. Using collocated measurements, we developed a predictive model of E. coli from turbidity data. Using this model, soft data were constructed for E. coli at 739 space/time locations where only turbidity was measured. Finally, the Bayesian Maximum Entropy (BME) method of modern space/time geostatistics was used to integrate monitored and predicted E. coli data and produce maps showing E. coli concentrations estimated daily across the river basin. The addition of soft data, in conjunction with the use of river distances, reduced estimation error by about 30%. Furthermore, based on these maps, up to 35% of river miles in the Raritan Basin had a probability of E. coli impairment greater than 90% on the most polluted day of the study period.
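
    A minimal sketch of the soft-data construction step, assuming a log-log linear regression of E. coli on turbidity (a common choice for turbidity-bacteria relationships; the paper's exact model form is not specified here) with hypothetical collocated measurements:

```python
import numpy as np

# Hypothetical collocated measurements: turbidity (NTU) and E. coli (MPN/100 ml).
turb = np.array([5.0, 12.0, 30.0, 55.0, 110.0])
ecoli = np.array([40.0, 150.0, 400.0, 900.0, 2400.0])

# Fit log10(E. coli) = a + b * log10(turbidity) by least squares.
x, y = np.log10(turb), np.log10(ecoli)
b, a = np.polyfit(x, y, 1)                  # slope, intercept
resid_sd = np.std(y - (a + b * x), ddof=2)  # spread of the residuals

def soft_datum(turbidity_ntu):
    """Gaussian 'soft datum' (mean, sd) for log10 E. coli at a location
    where only turbidity was measured; BME can ingest such probabilistic
    data alongside exact ('hard') measurements."""
    return a + b * np.log10(turbidity_ntu), resid_sd

mu, sd = soft_datum(75.0)
print(f"log10(E. coli) ~ N({mu:.2f}, {sd:.2f}^2)")
```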

    Rhesus monkey rhadinovirus (RRV): construction of a RRV-GFP recombinant virus and development of assays to assess viral replication

    Rhesus monkey rhadinovirus (RRV) is a gamma-2-herpesvirus that is closely related to Kaposi's sarcoma-associated herpesvirus (KSHV/HHV-8). The lack of an efficient culture system for growing high titers of virus, together with the lack of an in vivo animal model system, has hampered the study of KSHV replication and pathogenesis. RRV is capable of replicating to high titers on fibroblasts, thus facilitating the construction of recombinant rhadinoviruses. In addition, the ability to experimentally infect naïve rhesus macaques with RRV makes it an excellent model system for studying gamma-herpesvirus replication. Our study describes, for the first time, the construction of a GFP-expressing RRV recombinant virus using a traditional homologous recombination strategy. We have also developed two new methods for determining viral titers of RRV: a traditional viral plaque assay and a quantitative real-time PCR assay. We have compared the replication of wild-type RRV with that of the RRV-GFP recombinant virus in one-step growth curves, and we have measured the sensitivity of RRV to a small panel of antiviral drugs. The development of both the recombination strategy and the viral quantitation assays for RRV lays the foundation for future studies evaluating the contribution of individual genes to viral replication both in vitro and in vivo.
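
    The abstract does not give the details of the quantitative real-time PCR assay, but qPCR quantitation conventionally interpolates genome copies from a standard curve of cycle threshold (Ct) versus log copy number. A minimal sketch with hypothetical standard-curve data:

```python
import numpy as np

# Hypothetical qPCR standard curve: serial dilutions of a plasmid standard
# carrying the target gene, with observed cycle-threshold (Ct) values.
log_copies = np.array([7, 6, 5, 4, 3, 2], dtype=float)
ct = np.array([14.1, 17.5, 20.9, 24.4, 27.8, 31.2])

# Ct is approximately linear in log10(copies); fit the standard curve.
slope, intercept = np.polyfit(log_copies, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1  # ~100% means perfect doubling per cycle

def genome_copies(sample_ct):
    """Interpolate genome copies in a sample from its Ct value."""
    return 10 ** ((sample_ct - intercept) / slope)

print(f"PCR efficiency: {efficiency:.0%}")
print(f"Sample at Ct 22.0 ≈ {genome_copies(22.0):.2e} copies")
```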

    Space/Time Analysis of Fecal Pollution and Rainfall in an Eastern North Carolina Estuary

    The Newport River Estuary (NPRE) is a high-priority shellfish harvesting area in eastern North Carolina (NC) that is impaired due to fecal contamination, specifically exceedances of recommended levels of fecal coliforms. A hydrologic-driven mean trend model was developed, as a function of antecedent rainfall, for the NPRE to predict levels of E. coli (EC, measured as a proxy for fecal coliforms). This mean trend model was integrated into a Bayesian Maximum Entropy (BME) framework to produce informative space/time (S/T) maps depicting fecal contamination across the NPRE during winter and summer months. These maps showed that during dry winter months, corresponding to the oyster harvesting season in NC (October 1 to March 30), predicted EC concentrations were below the shellfish harvesting standard (14 MPN per 100 ml). However, after substantial rainfall (3.81 cm; 1.5 inches), the NPRE did not appear to meet this standard. During warmer months, predicted EC concentrations exceeded the threshold across the NPRE. Predicted Enterococcus (ENT) concentrations were generally below the recreational water quality threshold (104 MPN per 100 ml), except during warmer months after substantial rainfall. Once established, this combined approach produces near real-time visual information on which to base water quality management decisions.
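
    A minimal sketch of how a rainfall-driven mean trend can be screened against the two thresholds quoted in the abstract; the linear-in-rainfall form and its coefficients are illustrative assumptions, not the fitted NPRE model:

```python
SHELLFISH_STD = 14.0      # fecal coliform standard, MPN/100 ml (from the abstract)
RECREATIONAL_STD = 104.0  # enterococci threshold, MPN/100 ml (from the abstract)

def mean_trend_log_ec(antecedent_rain_cm, b0=0.7, b1=0.35):
    """Hypothetical hydrologic mean trend: log10 E. coli as a linear
    function of antecedent rainfall. The coefficients are illustrative."""
    return b0 + b1 * antecedent_rain_cm

for rain in (0.0, 3.81):  # dry conditions vs. the 1.5-inch event in the abstract
    ec = 10 ** mean_trend_log_ec(rain)
    status = "meets" if ec <= SHELLFISH_STD else "exceeds"
    print(f"rain={rain:4.2f} cm -> EC≈{ec:6.1f} MPN/100 ml ({status} shellfish standard)")
```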

    Too Big to Fail — U.S. Banks’ Regulatory Alchemy: Converting an Obscure Agency Footnote into an “At Will” Nullification of Dodd-Frank’s Regulation of the Multi-Trillion Dollar Financial Swaps Market

    The multi-trillion-dollar market for what were, at that time, wholly unregulated over-the-counter derivatives (“swaps”) is widely viewed as a principal cause of the 2008 worldwide financial meltdown. The Dodd-Frank Act, signed into law on July 21, 2010, was expressly considered by Congress to be a remedy for this troublesome deregulatory problem. The legislation required the swaps market to comply with a host of business conduct and anti-competitive protections, including that the swaps market be fully transparent to U.S. financial regulators, collateralized, and capitalized. The statute also expressly provides that it covers foreign subsidiaries of big U.S. financial institutions if their swaps trading could adversely impact the U.S. economy or if extraterritorial trades are used as an attempt to “evade” Dodd-Frank. In July 2013, the CFTC promulgated an 80-page, triple-columned, and single-spaced “guidance” implementing Dodd-Frank’s extraterritorial reach, i.e., the manner in which Dodd-Frank would apply to swaps transactions executed outside the United States. The key point of that guidance was that swaps trading within the “guaranteed” foreign subsidiaries of U.S. bank holding company swaps dealers was subject to all of Dodd-Frank’s swaps regulations, wherever in the world those subsidiaries’ swaps were executed. At that time, the standardized industry swaps agreement contemplated that, inter alia, U.S. bank holding company swaps dealers’ foreign subsidiaries would be “guaranteed” by their corporate parent, as had been true since 1992. In August 2013, without notifying the CFTC, the principal U.S. bank holding company swaps dealer trade association privately circulated to its members standard contractual language that would, for the first time, “deguarantee” their foreign subsidiaries. Relying only on the obscure footnote 563 of the guidance’s 662 footnotes, the trade association assured its swaps dealer members that the newly deguaranteed foreign subsidiaries could (if they so chose) no longer be subject to Dodd-Frank. As a result, it has been reported (and it is also understood by many experts within the swaps industry) that a substantial portion of the U.S. swaps market has shifted from the large U.S. bank holding company swaps dealers and their U.S. affiliates to their newly deguaranteed “foreign” subsidiaries, with the attendant claim by these big U.S. bank swaps dealers that Dodd-Frank swaps regulation does not apply to these transactions. The CFTC also soon discovered that these large U.S. bank holding company swaps dealers were “arranging, negotiating, and executing” (“ANE”) these swaps in the United States with U.S. bank personnel and that, only after execution in the U.S., were these swaps formally “assigned” to the U.S. banks’ newly “deguaranteed” foreign subsidiaries, with the accompanying claim that these swaps, even though executed in the U.S., were not covered by Dodd-Frank. In October 2016, the CFTC proposed a rule that would have closed the “deguarantee” and “ANE” loopholes completely. However, because it usually takes at least a year to finalize a “proposed” rule, the rule closing these loopholes was not finalized prior to the inauguration of President Trump. All indications are that it will never be finalized during a Trump Administration.
Thus, in the shadow of the recent tenth anniversary of the Lehman failure, there is an understanding among many market regulators and swaps trading experts that large portions of the swaps market have moved from U.S. bank holding company swaps dealers and their U.S. affiliates to their newly deguaranteed foreign affiliates, where Dodd-Frank swaps regulation is not being followed. However, what has not moved abroad is the very real obligation of the lender of last resort, i.e., the U.S. taxpayer, to rescue these U.S. swaps dealer bank holding companies if they fail because of poorly regulated swaps in their deguaranteed foreign subsidiaries. While relief is unlikely to be forthcoming from the Trump Administration or the Republican-controlled Senate, some other means will have to be found to avert another multi-trillion-dollar bank bailout and/or a financial calamity caused by poorly regulated swaps on the books of big U.S. banks. This paper notes that the relevant statutory framework affords state attorneys general and state financial regulators the right to bring so-called “parens patriae” actions in federal district court to enforce, inter alia, Dodd-Frank on behalf of a state’s citizens. That kind of litigation to enforce the statute’s extraterritorial provisions is now badly needed.

    Testing a global standard for quantifying species recovery and assessing conservation impact.

    Recognizing the imperative to evaluate species recovery and conservation impact, in 2012 the International Union for Conservation of Nature (IUCN) called for the development of a "Green List of Species" (now the IUCN Green Status of Species). A draft Green Status framework for assessing species' progress toward recovery, published in 2018, proposed 2 separate but interlinked components: a standardized method (i.e., measurement against benchmarks of species' viability, functionality, and preimpact distribution) to determine current species recovery status (herein, species recovery score) and application of that method to estimate past and potential future impacts of conservation based on 4 metrics (conservation legacy, conservation dependence, conservation gain, and recovery potential). We tested the framework with 181 species representing diverse taxa, life histories, biomes, and IUCN Red List categories (extinction risk). Based on the observed distribution of species' recovery scores, we propose the following species recovery categories: fully recovered, slightly depleted, moderately depleted, largely depleted, critically depleted, extinct in the wild, and indeterminate. Fifty-nine percent of tested species were considered largely or critically depleted. Although there was a negative relationship between extinction risk and species recovery score, variation was considerable. Some species in lower risk categories were assessed as farther from recovery than those at higher risk. This emphasizes that species recovery is conceptually different from extinction risk and reinforces the utility of the IUCN Green Status of Species for more fully understanding species conservation status. Although extinction risk did not predict conservation legacy, conservation dependence, or conservation gain, it was positively correlated with recovery potential. Only 1.7% of tested species were categorized as zero across all 4 of these conservation impact metrics, indicating that conservation has played, or will play, a role in improving or maintaining species status for the vast majority of these species. Based on our results, we devised an updated assessment framework that introduces the option of using a dynamic baseline for assessing future impacts of conservation over the short term, to avoid the misleading results that were generated in a small number of cases, and redefines the short term as 10 years to better align with conservation planning. These changes are reflected in the IUCN Green Status of Species Standard.
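
    A small sketch of how a recovery score of this kind can be computed and binned, assuming the Green Status convention of scoring each spatial unit on an ordinal scale (0 = absent, 1 = present, 2 = viable, 3 = functional) and expressing the total as a share of the maximum possible; the category cut-offs used below are illustrative, not the Standard's published values:

```python
def recovery_score(unit_states):
    """unit_states: per-spatial-unit states in {0, 1, 2, 3}.
    Returns the attained total as a percentage of the maximum (3 per unit)."""
    return 100.0 * sum(unit_states) / (3 * len(unit_states))

def recovery_category(score):
    # Illustrative cut-offs only, chosen to mirror the named categories.
    if score == 100: return "fully recovered"
    if score >= 80:  return "slightly depleted"
    if score >= 50:  return "moderately depleted"
    if score >= 20:  return "largely depleted"
    return "critically depleted"

states = [3, 2, 2, 1, 0]  # hypothetical species occupying 4 of 5 spatial units
s = recovery_score(states)
print(f"recovery score = {s:.0f}% -> {recovery_category(s)}")
```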

    Modeling Approaches for Characterizing and Evaluating Environmental Exposure to Engineered Nanomaterials in Support of Risk-Based Decision Making

    As the use of engineered nanomaterials becomes more prevalent, the likelihood of unintended exposure to these materials also increases. Given the current scarcity of experimental data regarding fate, transport, and bioavailability, determining potential environmental exposure to these materials requires an in-depth analysis of modeling techniques that can be used in both the near and long term. Here, we provide a critical review of traditional and emerging exposure modeling approaches to highlight the challenges that scientists and decision-makers face when developing environmental exposure and risk assessments for nanomaterials. We find that accounting for nanospecific properties, overcoming data gaps, recognizing model limitations, and handling uncertainty are key to developing informative and reliable environmental exposure and risk assessments for engineered nanomaterials. We find methods suited to recognizing and addressing significant uncertainty to be most appropriate for near-term environmental exposure modeling, given the current state of information and the insufficiency of established deterministic models for addressing environmental exposure to engineered nanomaterials.
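
    A minimal sketch of one uncertainty-aware approach of the kind the review favors for near-term use: Monte Carlo propagation of input uncertainty through a toy mass-balance exposure model. All distributions and parameters below are illustrative assumptions standing in for sparse nano-specific data, not values from the review:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Toy mass-balance: predicted environmental concentration (PEC) of a
# nanomaterial in a receiving water body.
emission_kg_d = rng.lognormal(mean=np.log(5.0), sigma=0.8, size=N)  # release rate
removal_frac = rng.uniform(0.1, 0.9, size=N)                        # e.g., WWTP removal
flow_m3_d = rng.lognormal(mean=np.log(2e6), sigma=0.5, size=N)      # river flow

# kg/d * (µg/kg) / (m3/d * L/m3) -> µg/L
pec_ug_L = emission_kg_d * (1 - removal_frac) * 1e9 / (flow_m3_d * 1e3)

lo, med, hi = np.percentile(pec_ug_L, [5, 50, 95])
print(f"PEC: median {med:.3f} µg/L (90% interval {lo:.3f}-{hi:.3f})")
```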

    Association of Models of Care for Kawasaki Disease With Utilization and Cardiac Outcomes

    OBJECTIVES: Describe the prevalence of different care models for children with Kawasaki disease (KD) and evaluate utilization and cardiac outcomes by care model. METHODS: Multicenter, retrospective cohort study of children aged 0 to 18 hospitalized with KD in US children's hospitals from 2017 to 2018. We classified hospital model of care via survey: hospitalist primary service with as-needed consultation (Model 1), hospitalist primary service with automatic consultation (Model 2), or subspecialist primary service (Model 3). Additional data sources included administrative data from the Pediatric Health Information System database supplemented by a 6-site chart review. Utilization outcomes included laboratory, medication, and imaging usage, length of stay, and readmission rates. We measured the frequency of coronary artery aneurysms (CAAs) in the full cohort and new CAAs within 12 weeks in the 6-site chart review subset. RESULTS: We included 2080 children from 44 children's hospitals; 21 hospitals (48%) identified as Model 1, 19 (43%) as Model 2, and 4 (9%) as Model 3. Model 1 institutions obtained more laboratory tests and had lower overall costs (P < .001), whereas echocardiogram (P < .001) and immune modulator use (P < .001) were more frequent in Model 3. Secondary outcomes, including length of stay, readmission rates, emergency department revisits, CAA frequency, receipt of anticoagulation, and postdischarge CAA development, did not differ among models. CONCLUSIONS: Modest cost and utilization differences exist among different models of care for KD without significant differences in outcomes. Further research is needed to investigate primary service and consultation practices for KD to optimize health care value and outcomes.
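
    For illustration only, the sketch below shows the kind of statistical comparison such a study design implies: a rank-based test for a skewed utilization outcome and a chi-square test for a binary cardiac outcome across the three care models. All data are synthetic, and the study's actual analytic methods are not specified in the abstract:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic, right-skewed hospitalization costs under the three care models.
costs = {
    "model_1": rng.lognormal(np.log(20_000), 0.5, 200),
    "model_2": rng.lognormal(np.log(23_000), 0.5, 180),
    "model_3": rng.lognormal(np.log(24_000), 0.5, 40),
}

# Skewed utilization outcomes are often compared with a rank-based test.
h, p = stats.kruskal(*costs.values())
print(f"Kruskal-Wallis across care models: H={h:.1f}, P={p:.3g}")

# A binary outcome (e.g., any coronary artery aneurysm) via chi-square.
caa_table = np.array([[18, 182],   # model 1: CAA yes / no
                      [15, 165],   # model 2
                      [4, 36]])    # model 3
chi2, p2, dof, _ = stats.chi2_contingency(caa_table)
print(f"CAA frequency by model: chi2={chi2:.2f}, P={p2:.3g}")
```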