21 research outputs found

    Data Descriptor: Long-term chloride concentrations in North American and European freshwater lakes

    Anthropogenic sources of chloride in a lake catchment, including road salt, fertilizer, and wastewater, can elevate the chloride concentration in freshwater lakes above background levels. Rising chloride concentrations can impact lake ecology and ecosystem services such as fisheries and the use of lakes as drinking water sources. To analyze the spatial extent and magnitude of increasing chloride concentrations in freshwater lakes, we amassed a database of 529 lakes in Europe and North America that had at least ten years of chloride data. For each lake, we calculated climate statistics of mean annual total precipitation and mean monthly air temperatures from gridded global datasets. We also quantified land cover metrics, including road density and impervious surface, in buffer zones of 100 to 1,500 m surrounding the perimeter of each lake. This database represents the largest global collection of lake chloride data. We hope that long-term water quality measurements in areas outside Europe and North America can be added to the database as they become available in the future.
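
    As a rough illustration of the buffer-zone land cover metrics described above (e.g., road density within 100 to 1,500 m of each lake perimeter), the following Python/geopandas sketch shows one way such a metric might be computed. The file names, column names, and projected CRS are hypothetical assumptions, not the authors' actual workflow.

        # Hypothetical sketch of a buffer-zone road-density metric per lake;
        # file names, columns, and CRS are assumptions (an equal-area projection
        # would be preferable for real length/area calculations).
        import geopandas as gpd
        import pandas as pd

        lakes = gpd.read_file("lakes.gpkg").to_crs(epsg=3857)   # lake polygons
        roads = gpd.read_file("roads.gpkg").to_crs(epsg=3857)   # road centerlines

        records = []
        for buffer_m in (100, 500, 1000, 1500):
            # Ring extending buffer_m outward from the lake perimeter, excluding the lake itself
            rings = lakes.geometry.buffer(buffer_m).difference(lakes.geometry)
            for lake_id, ring in zip(lakes["lake_id"], rings):
                clipped = roads.clip(ring)                       # road segments inside the ring
                road_km = clipped.length.sum() / 1000.0
                area_km2 = ring.area / 1e6
                records.append({"lake_id": lake_id, "buffer_m": buffer_m,
                                "road_density_km_per_km2": road_km / area_km2 if area_km2 else None})

        metrics = pd.DataFrame(records)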

    Distributions and parameter values used in analytic framework and model simulations.

    The latent period is defined as the time between exposure and onset of infectiousness, the incubation period as the time between exposure and both symptoms and peak infectiousness (even in the absence of symptoms), and the infectious period as the total time a case is infectious.

    Number of expected infections generated in a facility from model simulations comparing random and systematic testing strategies across transmission scenarios, test frequencies, and delays in isolating infectious individuals who have tested positive.

    Systematic testing strategies (■, ➕) prevent more infections than random strategies (●, ▲) across all transmission scenarios (indicated by community prevalence across the x axis and by reproduction number across the panels) and test frequencies (indicated by different colored symbols, with blue corresponding to the highest test frequency of 4 tests per week and red to the lowest test frequency of biweekly testing). More infections are expected in transmission scenarios with higher within-facility transmission and higher community prevalence. Preventing delays between testing and isolation of positives (squares compared to crosses and triangles compared to circles) and increasing test frequency (red = lowest frequency, blue = highest frequency) also reduce the number of infections. The horizontal gray line serves as a reference for the testing frequency needed to maintain control (corresponding to one infection every ten days) across different transmission scenarios. Error bars represent the interquartile range of expected infections derived from 100 simulations per scenario, run for 180 days among 700 staff.
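
    As a rough, hedged illustration of why systematic (staggered, evenly spaced) testing bounds the time from onset of infectiousness to detection while randomly timed testing does not, here is a toy Python sketch. The 700 staff, 180-day horizon, test frequency, and isolation delay follow the caption; the uniform infectiousness profile, exposure probability, perfect test sensitivity, and all numeric values are simplifying assumptions, not the authors' model.

        # Toy comparison of systematic vs. random test timing; all values illustrative.
        import numpy as np

        rng = np.random.default_rng(0)
        N_STAFF, DAYS = 700, 180
        R0, T_LATENT, T_INFECTIOUS = 1.5, 4.0, 9.0       # assumed within-facility values

        def infections_before_isolation(days_infectious_before_iso):
            """Fraction of R0 realized before isolation, assuming uniform infectiousness."""
            frac = np.clip(days_infectious_before_iso / T_INFECTIOUS, 0.0, 1.0)
            return R0 * frac

        def schedule(strategy, tests_per_week):
            """Each staff member's test days over the study period (perfect sensitivity assumed)."""
            interval = 7.0 / tests_per_week
            if strategy == "systematic":
                offsets = np.arange(N_STAFF) % interval  # staggered, evenly spaced tests
                return [np.arange(off, DAYS, interval) for off in offsets]
            # random: same number of tests per person, but at uniformly random days
            n_tests = int(DAYS / interval)
            return [np.sort(rng.uniform(0, DAYS, n_tests)) for _ in range(N_STAFF)]

        def expected_infections(strategy, tests_per_week=2, delay=1.0, p_exposed=0.3):
            """Total infections generated before isolation across all staff (one run)."""
            total = 0.0
            for tests in schedule(strategy, tests_per_week):
                if rng.random() > p_exposed:             # this person is never exposed
                    continue
                infectious_start = rng.uniform(0, DAYS) + T_LATENT
                later = tests[tests >= infectious_start] # first test while infectious
                iso_day = later[0] + delay if later.size else np.inf
                total += infections_before_isolation(iso_day - infectious_start)
            return total

        for strategy in ("systematic", "random"):
            print(strategy, round(expected_infections(strategy), 1))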

    Analytic framework exploring effects of variable infectiousness through time, testing frequencies, and delays on SARS-CoV-2 transmission.

    A) Example infectiousness profile with t_latent = 4, t_incubation = 5, t_infectious = 9; the line indicates infectiousness (r_t) through time and the shaded area shows the slice of infectiousness removed if t_iso = 7. B) The resulting total infectiousness as a function of t_iso, with the same parameters as in A and the point indicating the scenario depicted in A. C) Boxplots showing distributions of total infectiousness as a function of testing frequency, f, and delay in obtaining test results, d, incorporating uncertainty in t_latent, t_incubation, and t_infectious by drawing n = 100 parameter sets for each. Boxplots indicate the median, interquartile range, and full range of values. D) Probability that isolation occurs by day τ after exposure (i.e., t_iso ≤ τ) as a function of testing frequency, f, and delay in obtaining test results, d, demonstrating that delays in obtaining test results substantially reduce the probability of prompt isolation, particularly under the most frequent testing scenarios.
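
    A minimal numerical sketch of the kind of calculation described in panels A and B: given an infectiousness profile r(t) that is zero during latency, peaks at the incubation time, and ends when the infectious period is over, compute the share of total infectiousness averted by isolating at t_iso. The parameter names mirror the caption; the triangular shape of r(t) is an illustrative assumption, not the authors' fitted profile.

        # Averted-infectiousness sketch; profile shape and values are illustrative.
        import numpy as np

        t_latent, t_incubation, t_infectious = 4.0, 5.0, 9.0   # days, as in panel A
        t_end = t_latent + t_infectious                        # end of infectiousness

        def r(t):
            """Illustrative infectiousness: zero during latency, peaking at t_incubation."""
            up = np.clip((t - t_latent) / (t_incubation - t_latent), 0, 1)
            down = np.clip((t_end - t) / (t_end - t_incubation), 0, 1)
            return np.where((t >= t_latent) & (t <= t_end), np.minimum(up, down), 0.0)

        def fraction_averted(t_iso, n=10_000):
            """Share of total infectiousness removed by isolating at day t_iso (panel B)."""
            t = np.linspace(0, t_end, n)
            dt = t[1] - t[0]
            total = (r(t) * dt).sum()
            averted = (r(t)[t >= t_iso] * dt).sum()
            return averted / total

        print(fraction_averted(7.0))   # isolation at day 7, as in panel A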

    Outdoor Residential Water Use Restrictions during Recent Drought Suppressed Disease Vector Abundance in Southern California

    The California state government imposed restrictions on outdoor residential water use, including landscape irrigation, during the 2012-2016 drought. The public health implications of these actions are largely unknown, particularly with respect to mosquito-borne disease transmission. While residential irrigation facilitates persistence of mosquitoes by increasing the availability of standing water, few studies have investigated its effects on vector abundance. In two study sub-regions in the Los Angeles Basin, we examined the effect of outdoor residential water use restrictions on the abundance of the most important regional West Nile virus vector, Culex quinquefasciatus. Using spatiotemporal random forest models fit to Cx. abundance during drought and non-drought years, we generated counterfactual estimates of Cx. abundance under a hypothetical drought scenario without water use restrictions. We estimate that Cx. abundance would have been 44% and 39% larger in the West Los Angeles and Orange County sub-regions, respectively, if outdoor water usage had remained unchanged. Our results suggest that drought without mandatory water use restrictions may counterintuitively increase the availability of larval habitats for vectors in naturally dry, highly irrigated settings. Such mandatory restrictions may constrain Cx. abundance, which could reduce the risk of mosquito-borne disease while helping urban utilities maintain adequate water supplies.
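
    A hedged sketch of the counterfactual workflow described above: fit a random forest to mosquito abundance, then predict drought-year abundance as if outdoor water use had remained at unrestricted levels. The CSV file, column names, and feature set are hypothetical, and the authors' spatiotemporal models are more elaborate than this simplification.

        # Hypothetical counterfactual sketch; data file, columns, and features are assumptions.
        import pandas as pd
        from sklearn.ensemble import RandomForestRegressor

        traps = pd.read_csv("cx_trap_counts.csv")        # one row per trap-week
        features = ["week", "temperature", "precip", "outdoor_water_use", "imperviousness"]

        # Train on non-drought years, when outdoor water use was unrestricted
        train = traps[~traps["drought_year"]]
        model = RandomForestRegressor(n_estimators=500, random_state=0)
        model.fit(train[features], train["cx_abundance"])

        # Counterfactual: drought-year conditions, but water use reset to pre-restriction levels
        drought = traps[traps["drought_year"]].copy()
        drought["outdoor_water_use"] = train["outdoor_water_use"].mean()
        counterfactual = model.predict(drought[features])

        observed = traps.loc[traps["drought_year"], "cx_abundance"]
        print("Relative increase without restrictions:",
              counterfactual.sum() / observed.sum() - 1)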