    To model micropollutants or not to model ... that is the question

    The presence of Contaminants of Emerging Concern (CECs) in the aquatic environment has become an environmental issue of growing global concern. Monitoring of CECs is discontinuous, costly and time-consuming, and depends strongly on the adopted analytical methods, so it does not provide a comprehensive view of their presence and dynamics across environmental compartments (drinking water, wastewater, natural water, crops). Consequently, the risk to the environment and human health may be significantly underestimated. Modelling tools are therefore fundamental to support the monitoring and management of CECs within an integrated framework aimed at overall risk minimization. Here, the following modelling tools are presented: (i) methods to manage CEC concentration data below the Limit of Quantification; (ii) stochastic methods to support the generalization and interpretation of literature outputs; and (iii) fate models to describe CEC dynamics in interconnected environmental compartments, to be used for forward and backward predictions, thereby supporting CEC prioritization and risk-based corrective actions.
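    The abstract does not detail the methods behind point (i); a minimal sketch of one common convention for concentrations reported below the Limit of Quantification (LOQ) is shown here, assuming simple LOQ/2 substitution. The LOQ value and the data are invented, and this is not necessarily the paper's approach.

        import numpy as np

        # Hypothetical CEC concentrations (ng/L); np.nan marks samples reported
        # by the lab only as "< LOQ".
        loq = 5.0                                   # assumed Limit of Quantification
        measured = np.array([12.3, np.nan, 7.8, np.nan, 20.1, np.nan, 9.4])

        def substitute_below_loq(values, loq, fraction=0.5):
            """Replace left-censored (< LOQ) observations with fraction * LOQ.

            LOQ/2 substitution is a common simple convention; more rigorous
            alternatives treat the data as left-censored (e.g. maximum
            likelihood or Kaplan-Meier estimators).
            """
            filled = values.copy()
            filled[np.isnan(filled)] = fraction * loq
            return filled

        concentrations = substitute_below_loq(measured, loq)
        print("Mean with LOQ/2 substitution:", round(concentrations.mean(), 2))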

    To Model or Not to Model? Competing Modes of Inference for Finite Population Sampling

    Finite population sampling is perhaps the only area of statistics where the primary mode of analysis is based on the randomization distribution rather than on statistical models for the measured variables. This article reviews the debate between design-based and model-based inference. The basic features of the two approaches are illustrated using the case of inference about the mean from stratified random samples. Strengths and weaknesses of design-based and model-based inference for surveys are discussed. It is suggested that models that take into account the sample design and make weak parametric assumptions can produce reliable and efficient inferences in survey settings. These ideas are illustrated using the problem of inference from unequal probability samples. A model-based regression analysis that leads to a combination of design-based and model-based weighting is described.
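    For the stratified-mean example mentioned above, the classical design-based estimator weights stratum sample means by stratum population shares. A minimal sketch follows; the stratum sizes and sample values are invented for illustration.

        import numpy as np

        # Design-based estimate of a population mean from a stratified random
        # sample: stratum sizes N_h and sample data are made up for the sketch.
        strata = {
            "urban": {"N": 8000, "sample": np.array([4.1, 3.8, 5.0, 4.4, 4.7])},
            "rural": {"N": 2000, "sample": np.array([2.9, 3.2, 2.5, 3.0])},
        }

        N_total = sum(s["N"] for s in strata.values())

        mean_st = 0.0
        var_st = 0.0
        for s in strata.values():
            N_h, y_h = s["N"], s["sample"]
            n_h = y_h.size
            W_h = N_h / N_total                   # stratum weight N_h / N
            mean_st += W_h * y_h.mean()           # weighted stratum means
            # variance term with the finite population correction (1 - n_h/N_h)
            var_st += W_h**2 * (1 - n_h / N_h) * y_h.var(ddof=1) / n_h

        print(f"Stratified mean estimate: {mean_st:.3f} (SE {var_st**0.5:.3f})")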

    Lotka-Volterra Impulsive Model Control, or How Not to Fish

    Human intervention in nature often destroys the equilibrium that has been established in it. It is very important to find controls that do not destroy the stability of the natural equilibrium in the ecosystem. Impulsive systems theory allows us to investigate, using a simple example, such effects on a biological system that do not destroy its equilibrium.
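    A minimal numerical sketch of the idea, assuming a classical Lotka-Volterra predator-prey system with periodic impulsive harvesting of the prey; all parameter values and the harvesting rule are invented for illustration, not taken from the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Classical Lotka-Volterra dynamics between impulses (assumed parameters).
        a, b, c, d = 1.0, 0.1, 1.5, 0.075   # growth/interaction coefficients
        harvest_fraction = 0.2              # share of prey removed at each impulse
        period = 5.0                        # time between impulses
        n_periods = 10

        def lotka_volterra(t, y):
            prey, predator = y
            return [a * prey - b * prey * predator,
                    -c * predator + d * prey * predator]

        state = [10.0, 5.0]
        t0 = 0.0
        for _ in range(n_periods):
            sol = solve_ivp(lotka_volterra, (t0, t0 + period), state, max_step=0.01)
            # impulsive effect: instantaneous removal of part of the prey stock
            state = [sol.y[0, -1] * (1 - harvest_fraction), sol.y[1, -1]]
            t0 += period

        print("Prey, predator after impulsive harvesting:", state)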

    #Santiago is not #Chile, or is it? A Model to Normalize Social Media Impact

    Online social networks are known to be demographically biased. Open questions remain about how representative they are of the physical population and how population biases affect user-generated content. In this paper we focus on centralism, a problem affecting Chile. Assuming that local differences in vocabulary exist within a country, we built a methodology based on the vector space model to find distinctive content from different locations, and used it to create classifiers that predict whether the content of a micro-post is related to a particular location, keeping in mind a geographically diverse selection of micro-posts. We evaluate the classifiers in a case study of the virtual population of Chile that participated in the Twitter social network during an event of national relevance: the municipal (local government) elections held in 2012. We observe that the participating virtual population is spatially representative of the physical population, implying that there is centralism in Twitter. Our classifiers outperform a non-geographically-diverse baseline at the regional level and have the same accuracy at the provincial level. However, our approach makes assumptions that need to be tested on multi-thematic and more general datasets. We leave this for future work. Comment: Accepted at ChileCHI 2013, I Chilean Conference on Human-Computer Interaction.
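    A toy sketch of the vector-space-model step described above, assuming TF-IDF term weighting and a standard linear classifier; the posts, labels, and library choice are illustrative and not the paper's actual pipeline.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Represent micro-posts as TF-IDF vectors and train a classifier that
        # predicts whether a post's vocabulary is tied to a given location.
        # Texts and labels below are invented.
        posts = [
            "vote centro santiago alcalde",
            "eleccion municipal valparaiso puerto",
            "transantiago metro providencia",
            "region del biobio concepcion votacion",
        ]
        labels = ["santiago", "other", "santiago", "other"]

        model = make_pipeline(TfidfVectorizer(), LogisticRegression())
        model.fit(posts, labels)
        print(model.predict(["alcalde de providencia anuncia resultados"]))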

    To Peg or Not To Peg? A Simple Model of Exchange Rate Regime Choice In Small Economies

    The choice of an exchange rate peg often involves a trade-off between gaining credibility and losing flexibility. We show that the flexibility loss may be reduced if domestic and foreign shocks are correlated and more volatile. Allowing for a plausible structural change after a peg, a flexibility gain may result. Keywords: exchange rate regime choice; credibility versus flexibility; international spill-overs; imported stabilization.
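    A stylized numerical illustration of the flexibility argument, not the paper's model: assume that under a float domestic policy fully offsets domestic shocks, while under a peg domestic rates follow foreign policy, which offsets only the foreign shock; the residual variance then shrinks as the shock correlation rises. All numbers are assumptions.

        import numpy as np

        # Residual output-gap variance under a peg, as a function of the
        # correlation between domestic and foreign shocks (illustrative only).
        rng = np.random.default_rng(0)
        sigma = 1.0
        for rho in (0.0, 0.5, 0.9):
            cov = [[sigma**2, rho * sigma**2], [rho * sigma**2, sigma**2]]
            domestic, foreign = rng.multivariate_normal([0, 0], cov, size=100_000).T
            peg_gap = domestic - foreign   # part of the domestic shock left unoffset
            print(f"rho={rho:.1f}  Var(output gap | peg) = {peg_gap.var():.2f}")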

    To Score or Not to Score? Estimates of a Sponsored Search Auction Model

    We estimate a structural model of a sponsored search auction. To accommodate the "position paradox", we relax the assumption of decreasing click volumes with position ranks, which is often made in the literature. Using data from "Website X", one of the largest online marketplaces in China, we find that merchants of different qualities adopt different bidding strategies: high-quality merchants bid more aggressively for informative keywords, while low-quality merchants are more likely to be sorted to the top positions for value keywords. Counterfactual evaluations show that the price trend becomes steeper after moving to a score-weighted generalized second price auction, with much higher prices for the top position but lower prices for the other positions. Overall, there is only a very modest change in total revenue from introducing popularity scoring, despite the intent of bid scoring to reward popular merchants with price discounts.
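    A minimal sketch of the score-weighted generalized second price (GSP) pricing rule referred to in the counterfactual; the bids and quality scores are invented, not the paper's data or estimates.

        # Rank bidders by score-weighted bid; each winner's price per click is the
        # minimum bid that would keep its position: the next rank-score divided by
        # its own quality score.
        bidders = [("A", 3.0, 0.8), ("B", 2.0, 0.9), ("C", 2.5, 0.6)]  # name, bid, score

        ranked = sorted(bidders, key=lambda b: b[1] * b[2], reverse=True)
        for pos, (name, bid, score) in enumerate(ranked, start=1):
            if pos < len(ranked):
                _, next_bid, next_score = ranked[pos]
                price = next_bid * next_score / score
            else:
                price = 0.0  # last slot: reserve price, omitted in this sketch
            print(f"position {pos}: {name} pays {price:.2f} per click")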