93 research outputs found

    Thresholds of terrestrial nutrient loading for the development of eutrophication episodes in a coastal embayment in the Aegean Sea

    Thresholds of terrestrial nutrient loading (inorganic N and P) for the development of eutrophication episodes were estimated in an enclosed embayment, the Gulf of Kalloni, in the Aegean Sea, Eastern Mediterranean. Terrestrial loading was quantified by a watershed runoff model taking into account land use, geomorphology, sewerage, and industrial and animal-farming by-products. Eutrophication episodes were assessed by an existing chl a-based scale for Aegean coastal waters, whereas the nutrient concentrations (N and P) necessary for the development of such episodes were defined using a probabilistic procedure. Finally, to link the nutrient loading arriving at the gulf with the resulting nutrient enrichment of the marine ecosystem, three loading factors developed by Vollenweider for lake and marine ecosystems were applied. The first assumes no exchange between the embayment and the open sea, whereas the other two take water renewal time into account. Only the threshold for inorganic nitrogen estimated by the first factor was exceeded in the study area, during February, after a strong rainfall event coinciding with a eutrophication episode observed in the interior of the gulf, implying that the waters of the gulf are rather confined and the receiving body operates as a lake. The degree of confinement was further examined by studying the temperature, salinity, and density distributions inside the gulf and across the channel connecting the gulf to the open sea. It was found that the incoming freshwater from the watershed during winter results in the formation of a dilute surface layer of low salinity and density, clearly isolated from the open sea. The nutrients from the river inputs are diluted into this isolated water mass, and the eutrophication threshold for nitrogen is exceeded. Although phosphorus loading was also high during winter, the corresponding limits were never exceeded.
The proposed methodology sets a quantitative relationship between terrestrial nutrient loading and the development of eutrophication episodes in coastal embayments, provided that information on the physical setting of the system is available. These cause-and-effect relationships can be invaluable tools for managers and decision makers in the framework of Integrated Coastal Zone Management.
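The lake-type loading factor used above can be pictured as a simple box model. A minimal sketch, assuming a Vollenweider-type relation between areal loading, mean depth, and water renewal time (the function names, units, and the inversion step are ours, not taken from the paper):

```python
def predicted_concentration(load, depth, tau):
    """Vollenweider-type steady-state nutrient concentration (mg/m^3)
    in a basin treated as a lake (no exchange with the open sea).
    load: areal nutrient loading (g m^-2 yr^-1)
    depth: mean depth (m)
    tau: water renewal (residence) time (yr)"""
    qs = depth / tau                              # hydraulic load (m/yr)
    return 1000.0 * load / (qs * (1.0 + tau ** 0.5))

def loading_threshold(c_crit, depth, tau):
    """Invert the relation: the terrestrial loading at which the basin
    reaches a critical eutrophication concentration c_crit (mg/m^3)."""
    qs = depth / tau
    return c_crit / 1000.0 * qs * (1.0 + tau ** 0.5)
```

A load above `loading_threshold(...)` would then flag a potential eutrophication episode; the paper's actual thresholds come from its probabilistic chl a scale, not from these illustrative formulas.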

    Lumpy species coexistence arises robustly in fluctuating resource environments

    The effect of life-history traits on resource competition outcomes is well understood in the context of a constant resource supply. However, almost all natural systems are subject to fluctuations of resources driven by cyclical processes such as seasonality and tidal hydrology. To understand community composition, it is therefore imperative to study the impact of resource fluctuations on interspecies competition. We adapted a well-established resource-competition model to show that fluctuations in the inflow concentrations of two limiting resources lead to the survival of species in clumps along the trait axis, consistent with observations of “lumpy coexistence” [Scheffer M, van Nes EH (2006) Proc Natl Acad Sci USA 103:6230–6235]. A complex dynamic pattern in the available ambient resources arose very early in the self-organization process and dictated the locations of clumps along the trait axis by creating niches that promoted the growth of species with specific traits. This dynamic pattern emerged as the combined result of fluctuations in the inflow of resources and their consumption by the most competitive species, which accumulated the bulk of biomass early in assemblage organization. Clumps emerged robustly across a range of periodicities, phase differences, and amplitudes. Given the ubiquity of asynchronous fluctuations of limiting resources in the real world, our findings imply that assemblage organization in clumps should be a common feature in nature.
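A minimal numerical sketch of this class of model (our own toy parameterization, not the authors'): several species compete for two resources under Liebig's law of the minimum and Monod uptake, with out-of-phase sinusoidal inflow concentrations and a trait trade-off along the half-saturation axis.

```python
import math

def simulate(n_species=5, steps=2000, dt=0.01):
    """Euler integration of a two-resource chemostat with fluctuating
    inflow; returns the final biomass of each species."""
    D = 0.25                                  # dilution rate
    mu_max, Ktot, yield_ = 1.0, 1.0, 1.0
    # trait axis: half-saturation constant for resource 1
    K1 = [0.2 + 0.6 * i / (n_species - 1) for i in range(n_species)]
    K2 = [Ktot - k for k in K1]               # trade-off: good on R1 => poor on R2
    B = [0.1] * n_species                     # initial biomasses
    R1 = R2 = 1.0                             # ambient resource concentrations
    for t in range(steps):
        time = t * dt
        # asynchronous (90-degree phase-shifted) sinusoidal inflow
        S1 = 1.0 + 0.8 * math.sin(2 * math.pi * time / 5.0)
        S2 = 1.0 + 0.8 * math.sin(2 * math.pi * time / 5.0 + math.pi / 2)
        # Liebig's law of the minimum with Monod kinetics
        mu = [mu_max * min(R1 / (K1[i] + R1), R2 / (K2[i] + R2))
              for i in range(n_species)]
        uptake = sum(mu[i] * B[i] for i in range(n_species)) / yield_
        R1 = max(0.0, R1 + dt * (D * (S1 - R1) - uptake))
        R2 = max(0.0, R2 + dt * (D * (S2 - R2) - uptake))
        B = [max(0.0, B[i] + dt * (mu[i] - D) * B[i]) for i in range(n_species)]
    return B
```

Running this longer, with many species on a finer trait grid, is the kind of setup in which clumps along the trait axis would be looked for; the sketch only shows the model structure.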

    Decisions, Counterfactual Explanations and Strategic Behavior

    As data-driven predictive models are increasingly used to inform decisions, it has been argued that decision makers should provide explanations that help individuals understand what would have to change for these decisions to be beneficial ones. However, there has been little discussion of the possibility that individuals may use such counterfactual explanations to invest effort strategically and maximize their chances of receiving a beneficial decision. In this paper, our goal is to find policies and counterfactual explanations that are optimal in terms of utility in such a strategic setting. We first show that, given a pre-defined policy, the problem of finding the optimal set of counterfactual explanations is NP-hard. Then, we show that the corresponding objective is nondecreasing and submodular, which allows a standard greedy algorithm to enjoy approximation guarantees. In addition, we show that the problem of jointly finding both the optimal policy and the set of counterfactual explanations reduces to maximizing a non-monotone submodular function. As a result, we can use a recent randomized algorithm to solve the problem, which also offers approximation guarantees. Finally, we demonstrate that, by incorporating a matroid constraint into the problem formulation, we can increase the diversity of the optimal set of counterfactual explanations and incentivize individuals across the whole spectrum of the population to self-improve. Experiments on synthetic and real lending and credit card data illustrate our theoretical findings and show that the counterfactual explanations and decision policies found by our algorithms achieve higher utility than several competitive baselines. Comment: New data preprocessing method, experiments on credit card data, and experiments under a matroid constraint.
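The greedy routine the guarantees rest on is the textbook algorithm for maximizing a monotone submodular set function under a cardinality constraint, which carries the classic (1 - 1/e) approximation bound. A sketch with a toy coverage objective standing in for the paper's utility (the candidate explanations and the individuals they "cover" below are invented for illustration):

```python
def greedy_max(ground_set, f, k):
    """Standard greedy for a monotone submodular set function f under a
    cardinality constraint k; achieves a (1 - 1/e)-approximation."""
    S = set()
    for _ in range(k):
        best, gain = None, 0.0
        for e in ground_set - S:
            g = f(S | {e}) - f(S)     # marginal gain of adding e
            if g > gain:
                best, gain = e, g
        if best is None:              # no element adds positive value
            break
        S.add(best)
    return S

# toy stand-in for the utility: each candidate counterfactual explanation
# "covers" the set of individuals it would incentivize (sets invented here)
cover = {'a': {1, 2, 3}, 'b': {3, 4}, 'c': {5}, 'd': {1, 2}}
f = lambda S: len(set().union(*(cover[e] for e in S)))
chosen = greedy_max(set(cover), f, 2)
```

Coverage functions like `f` are monotone and submodular, so the guarantee applies; the paper's actual objective is the strategic utility it defines, not this toy.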

    Counterfactual Explanations in Sequential Decision Making Under Uncertainty

    Methods to find counterfactual explanations have predominantly focused on one-step decision making processes. In this work, we initiate the development of methods to find counterfactual explanations for decision making processes in which multiple, dependent actions are taken sequentially over time. We start by formally characterizing a sequence of actions and states using finite horizon Markov decision processes and the Gumbel-Max structural causal model. Building upon this characterization, we formally state the problem of finding counterfactual explanations for sequential decision making processes. In our problem formulation, the counterfactual explanation specifies an alternative sequence of actions, differing in at most k actions from the observed sequence, that could have led the observed process realization to a better outcome. Then, we introduce a polynomial time algorithm based on dynamic programming to build a counterfactual policy that is guaranteed to always provide the optimal counterfactual explanation on every possible realization of the counterfactual environment dynamics. We validate our algorithm using both synthetic and real data from cognitive behavioral therapy and show that the counterfactual explanations our algorithm finds can provide valuable insights to enhance sequential decision making under uncertainty.
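The Gumbel-Max structural causal model at the core of this setup can be illustrated in a few lines: a categorical transition is generated as an argmax over log-probabilities plus Gumbel noise, and a counterfactual query reuses the same noise under (possibly altered) dynamics. The transition probabilities below are made up; the property demonstrated is counterfactual stability, i.e. identical noise and identical dynamics reproduce the observed transition.

```python
import math, random

def gumbel_max(logp, g):
    """Gumbel-Max trick: argmax_j (log p_j + g_j) is distributed Categorical(p)."""
    return max(range(len(logp)), key=lambda j: logp[j] + g[j])

rng = random.Random(0)
p = [0.2, 0.5, 0.3]                       # observed transition probabilities
# the exogenous noise of the SCM, held fixed across factual/counterfactual worlds
g = [-math.log(-math.log(rng.random())) for _ in p]
observed = gumbel_max([math.log(q) for q in p], g)
# counterfactual with the SAME noise and unchanged dynamics: same outcome
counterfactual = gumbel_max([math.log(q) for q in p], g)
```

Changing the log-probabilities (e.g. under a different action) while keeping `g` fixed is what yields a counterfactual next state rather than a fresh random one; the paper's dynamic program searches over such counterfactual action sequences.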

    Group Testing under Superspreading Dynamics

    Testing is recommended for all close contacts of confirmed COVID-19 patients. However, existing group testing methods are oblivious to the circumstances of contagion provided by contact tracing. Here, we build upon a well-known semi-adaptive pool testing method, Dorfman's method with imperfect tests, and derive a simple group testing method based on dynamic programming that is specifically designed to use the information provided by contact tracing. Experiments using a variety of reproduction numbers and dispersion levels, including those estimated in the context of the COVID-19 pandemic, show that the pools found using our method require significantly fewer tests than those found using the standard Dorfman method, especially when the number of contacts of an infected individual is small. Moreover, our results show that our method can be more beneficial when the secondary infections are highly overdispersed.
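The baseline being extended, Dorfman's two-stage method with an imperfect test, has a closed-form expected cost that can be minimized over pool sizes. A sketch under our own illustrative sensitivity/specificity assumptions (0.95/0.99), ignoring the contact-tracing information that the paper's method additionally exploits:

```python
def expected_tests_per_person(n, p, se=0.95, sp=0.99):
    """Expected tests per individual under two-stage Dorfman pooling:
    pool size n, prevalence p, test sensitivity se, specificity sp."""
    if n == 1:
        return 1.0                        # individual testing
    q = (1.0 - p) ** n                    # pool contains no infected individual
    pool_pos = se * (1.0 - q) + (1.0 - sp) * q
    # one pool test, plus n retests whenever the pool comes back positive
    return (1.0 + n * pool_pos) / n

def best_pool_size(p, n_max=32):
    """Pool size minimizing the expected per-person test count."""
    return min(range(1, n_max + 1),
               key=lambda n: expected_tests_per_person(n, p))
```

For prevalences around 1%, the optimum lands near pool sizes of ten and cuts the expected test count to roughly a fifth of individual testing, which is the kind of baseline the paper's contact-tracing-aware dynamic program then improves upon.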

    Species extinctions strengthen the relationship between biodiversity and resource use efficiency

    Evidence from terrestrial ecosystems indicates that biodiversity relates to ecosystem functions (BEF), but this relationship varies in its strength, in part as a function of habitat connectivity and fragmentation. In primary producers, common proxies of ecosystem function include productivity and resource use efficiency. In aquatic primary producers, macroecological studies have observed BEF variance, where ecosystems with lower richness show stronger BEF relationships. However, aquatic ecosystems are less affected by habitat fragmentation than terrestrial systems, and the mechanism underlying this BEF variance has been largely overlooked. Here, we provide a mechanistic explanation of BEF variance using a trait-based numerical model parameterized for phytoplankton. Resource supply in our model fluctuates recurrently, as in many coastal systems. Our findings show that following an extinction event, the BEF relationship can be driven by the species that are the most efficient resource users. Specifically, in species-rich assemblages, increased redundancy of efficient resource users minimizes the risk of losing function following an extinction event. On the other hand, in species-poor assemblages, low redundancy of efficient resource users increases the risk of losing ecosystem function following extinctions. Furthermore, we corroborate our findings with observations from large-scale field studies on phytoplankton.

    Optimized classification predictions with a new index combining machine learning algorithms

    Voting is a commonly used ensemble method that aims to optimize classification predictions by combining results from individual base classifiers. However, the selection of appropriate classifiers to participate in a voting algorithm is currently an open issue. In this study we developed a novel Dissimilarity-Performance (DP) index which incorporates two important criteria for the selection of base classifiers to participate in voting: their differential response in classification (dissimilarity) when combined in triads and their individual performance. To develop this empirical index we first used a range of different datasets to evaluate the relationship between voting results and measures of dissimilarity among classifiers of different types (rules, trees, lazy classifiers, functions, and Bayes). Second, we computed the combined effect on voting performance of classifiers with different individual performance and/or diverse results. Our DP index was able to rank the classifier combinations according to their voting performance and thus to suggest the optimal combination. The proposed index is recommended for individual machine learning users as a preliminary tool to identify which classifiers to combine in order to achieve more accurate classification predictions while avoiding a computationally intensive and time-consuming search.
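The abstract does not give the DP index formula, but its two ingredients, pairwise dissimilarity within a triad and individual performance, can be sketched. The combination below (product of mean pairwise disagreement and mean accuracy) is our illustrative stand-in, not the published index, and the predictions and accuracies are invented:

```python
from itertools import combinations

def disagreement(pa, pb):
    """Fraction of instances on which two classifiers' predictions differ."""
    return sum(a != b for a, b in zip(pa, pb)) / len(pa)

def dp_score(triad, preds, acc):
    """Illustrative stand-in for the DP index: mean pairwise dissimilarity
    within the triad multiplied by the triad's mean individual accuracy."""
    pairs = list(combinations(triad, 2))
    dis = sum(disagreement(preds[a], preds[b]) for a, b in pairs) / len(pairs)
    return dis * (sum(acc[c] for c in triad) / len(triad))

# toy predictions of four base classifiers on six instances
preds = {'tree':  [1, 1, 0, 0, 1, 0],
         'rule':  [1, 0, 0, 0, 1, 1],
         'bayes': [1, 1, 1, 0, 0, 0],
         'lazy':  [1, 1, 0, 0, 1, 0]}
acc = {'tree': 0.90, 'rule': 0.80, 'bayes': 0.85, 'lazy': 0.88}
best = max(combinations(sorted(preds), 3),
           key=lambda t: dp_score(t, preds, acc))
```

Ranking all triads by such a score and picking the top one mirrors how the index is meant to be used as a preliminary screening tool before running the actual voting ensemble.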

    Quantifying the Effects of Contact Tracing, Testing, and Containment Measures in the Presence of Infection Hotspots

    Multiple lines of evidence strongly suggest that infection hotspots, where a single individual infects many others, play a key role in the transmission dynamics of COVID-19. However, most existing epidemiological models fail to capture this aspect: they neither represent the sites visited by individuals explicitly nor characterize disease transmission as a function of individual mobility patterns. In this work, we introduce a temporal point process modeling framework that explicitly represents visits to the sites where individuals come into contact and infect each other. Under our model, the number of infections caused by an infectious individual naturally emerges as overdispersed. Using an efficient sampling algorithm, we demonstrate how to apply Bayesian optimization with longitudinal case data to estimate the transmission rate of infectious individuals at the sites they visit and in their households. Simulations using fine-grained, publicly available demographic data and site locations from Bern, Switzerland showcase the flexibility of our framework. To facilitate research and analyses of other cities and regions, we release an open-source implementation of our framework.
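The overdispersion such a model reproduces is conventionally summarized by a negative binomial offspring distribution with mean R0 and dispersion k. A quick Monte Carlo check of that property via the Gamma-Poisson mixture; R0 = 2 and k = 0.2 are illustrative values in the range reported for COVID-19, not figures from this paper:

```python
import math, random

def secondary_infections(R0, k, rng):
    """One draw from a negative binomial offspring distribution via its
    Gamma-Poisson mixture: individual infectiousness nu ~ Gamma(k, R0/k),
    then offspring ~ Poisson(nu) (Knuth's sampler, fine for small nu)."""
    nu = rng.gammavariate(k, R0 / k)
    L, n, p = math.exp(-nu), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return n
        n += 1

rng = random.Random(42)
draws = [secondary_infections(2.0, 0.2, rng) for _ in range(20000)]
mean = sum(draws) / len(draws)
var = sum((x - mean) ** 2 for x in draws) / len(draws)
```

Analytically the variance is R0(1 + R0/k), i.e. 22 here against a mean of 2: a small fraction of individuals generates most secondary cases, which is exactly the hotspot behavior the point process framework captures mechanistically.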

    Quantifying the Effects of Contact Tracing, Testing, and Containment

    Contact tracing has the potential to help identify, characterize, and predict disease-spreading human interactions at an unprecedented resolution. However, to realize this potential, we need data-driven epidemic models that operate at a high spatiotemporal resolution and can make use of, and benefit from, the contact tracing data of individuals. Such data-driven models are currently missing, and in this work we initiate their development using the framework of temporal point processes. Using an efficient sampling algorithm, we can use our model to quantify the effects that different testing and tracing strategies, social distancing measures, and business restrictions may have on the course of the disease. Building on this algorithm, we use Bayesian optimization to estimate, from longitudinal case data, the transmission rate due to infectious individuals at the sites they visit and at their households, as well as the mobility reduction due to social distancing. Simulations using real COVID-19 case data and mobility patterns from several cities and regions in Germany and Switzerland, covering a wide range of infection levels to date, demonstrate that our model may allow individuals and policy makers to make more effective decisions. Comment: Extensive results and additional analysis; refined parameter estimation.