
    LoCoH: nonparametric kernel methods for constructing home ranges and utilization distributions.

    Parametric kernel methods currently dominate the literature on constructing animal home ranges (HRs) and utilization distributions (UDs). These methods frequently fail to capture the kinds of hard boundaries common to many natural systems. Recently a local convex hull (LoCoH) nonparametric kernel method, which generalizes the minimum convex polygon (MCP) method, was shown to be more appropriate than parametric kernel methods for constructing HRs and UDs, because of its ability to identify hard boundaries (e.g., rivers, cliff edges) and its convergence to the true distribution as sample size increases. Here we extend LoCoH in two ways: a "fixed sphere-of-influence," or r-LoCoH (kernels constructed from all points within a fixed radius r of each reference point), and an "adaptive sphere-of-influence," or a-LoCoH (kernels constructed from all points within a radius a such that the distances of all points within the radius to the reference point sum to a value less than or equal to a), and compare them to the original "fixed-number-of-points," or k-LoCoH (all kernels constructed from the k-1 nearest neighbors of root points). We also compare these nonparametric LoCoH methods to parametric kernel methods using manufactured data and data collected from GPS collars on African buffalo in the Kruger National Park, South Africa. Our results demonstrate that LoCoH methods are superior to parametric kernel methods in estimating areas used by animals and excluding unused areas (holes) and, generally, in constructing UDs and HRs arising from the movement of animals influenced by hard boundaries and irregular structures (e.g., rocky outcrops). We also demonstrate that a-LoCoH is generally superior to k- and r-LoCoH (software for all three methods is available at http://locoh.cnr.berkeley.edu).
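
    As a rough illustration of the k-LoCoH construction described above, the sketch below (with invented helper names, not the authors' published software) builds a local convex hull around each fix from the fix and its k-1 nearest neighbours and unions the smallest hulls first; where the published method accumulates hulls until they enclose a target fraction of fixes, this simplification keeps a fixed fraction of the hulls.

        import numpy as np
        from scipy.spatial import cKDTree, ConvexHull
        from shapely.geometry import Polygon
        from shapely.ops import unary_union

        def k_locoh_isopleth(points, k=10, isopleth=0.95):
            """Union of the smallest local hulls: a crude k-LoCoH home-range estimate."""
            tree = cKDTree(points)
            hulls = []
            for p in points:
                _, idx = tree.query(p, k=k)               # the root point plus its k-1 nearest neighbours
                local = points[idx]
                hull = ConvexHull(local)                  # local convex hull around the root point
                hulls.append(Polygon(local[hull.vertices]))
            hulls.sort(key=lambda h: h.area)              # densest (smallest) hulls first
            n_keep = int(np.ceil(isopleth * len(hulls)))  # simplification: keep a fraction of hulls
            return unary_union(hulls[:n_keep])

        rng = np.random.default_rng(0)
        fixes = rng.normal(size=(500, 2))                 # simulated GPS fixes
        print(k_locoh_isopleth(fixes, k=15).area)         # area of the ~95% isopleth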

    Contingent Kernel Density Estimation

    Kernel density estimation is a widely used method for estimating a distribution based on a sample of points drawn from that distribution. Generally, in practice some form of error contaminates the sample of observed points. Such error can be the result of imprecise measurements or observation bias. Often this error is negligible and may be disregarded in analysis. In cases where the error is non-negligible, estimation methods should be adjusted to reduce the resulting bias. Several modifications of kernel density estimation have been developed to address specific forms of error. One form of error that has not yet been addressed is the case where observations are nominally placed at the centers of areas from which the points are assumed to have been drawn, where these areas are of varying sizes. In this scenario, bias arises because the size of the error can vary among points: some subsets of points can be known to have smaller error than others, or the form of the error may change among points. This paper proposes a "contingent kernel density estimation" technique to address this form of error. The new technique adjusts the standard kernel on a point-by-point basis in an adaptive response to the changing structure and magnitude of error. In this paper, equations for our contingent kernel technique are derived, the technique is validated using numerical simulations, and an example using the geographic locations of social networking users is worked through to demonstrate the utility of the method.
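
    The sketch below is a much-simplified illustration of the idea: a variable-bandwidth Gaussian kernel that is widened, point by point, according to the size of the region each observation represents. It is not the contingent kernel derived in the paper, and the function and parameter names are invented for the example.

        import numpy as np

        def contingent_style_kde(x, centres, radii, base_bw=0.5):
            """Density at `x` from observations `centres`, each the centre of a region of radius `radii`."""
            x = np.atleast_1d(x)[:, None]
            bw = np.sqrt(base_bw**2 + (np.asarray(radii) / 2.0)**2)  # widen the kernel per point
            kernels = np.exp(-0.5 * ((x - centres) / bw)**2) / (bw * np.sqrt(2 * np.pi))
            return kernels.mean(axis=1)

        centres = np.array([0.0, 1.0, 5.0])
        radii = np.array([0.1, 0.1, 3.0])   # the third observation stands for a much larger region
        print(contingent_style_kde([0.0, 5.0], centres, radii))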

    A web-based Toolbox to support the systemic eco-efficiency assessment in water use systems

    The eco-efficiency assessment of a water use system at the meso level, as well as the estimation of the anticipated eco-efficiency improvements resulting from innovative practices/technologies, is a conceptually and methodologically challenging issue. A systemic approach is required to capture the complexity of all interrelated aspects and the interactions among the heterogeneous actors involved in the system. This involves mapping the behaviour of the system into representative models, structuring the analysis in easy-to-understand procedures and developing versatile software tools to support the analysis. This paper presents a web-integrated suite of tools and resources (the EcoWater Toolbox) for assessing eco-efficiency improvements from innovative technologies in water use systems. Equipped with a continuously updated inventory of currently available technological innovations as well as a repository of eco-efficiency indicators and their evaluation rules, the EcoWater Toolbox supports a comprehensive four-step eco-efficiency assessment of a water use system: (1) it allows users to frame the case study by defining system boundaries, describing the water supply and value chains and including all the actors; (2) it helps users establish a baseline eco-efficiency assessment, using the integrated modelling tools; (3) it supports users in identifying both sector-specific and system-wide technologies and practices to suit their situation, through the integrated technology inventory; and (4) it enables users to assess innovative technology solutions by developing predictive technology scenarios and comparing these with the baseline results. At the core of the Toolbox are two modelling tools, which combine the economic and environmental viewpoints in a single modelling framework. The "Systemic Environmental Analysis Tool" (SEAT) assists in building a representation of the physical system, its processes and interactions, and forms the basis for evaluating the environmental performance of the system. The "Economic Value chain Analysis Tool" (EVAT) addresses the value chain and focuses on the economic component of eco-efficiency. Both tools provide a graphical model construction interface that is implemented in client-side code and incorporate advanced features such as model scripting. The methodology adopted and the operational aspects of the EcoWater Toolbox are presented and demonstrated through the assessment of the eco-efficiency performance of the water value chain of a milk production unit in the dairy industry.
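
    As a toy illustration of the kind of indicator evaluated in steps (2) and (4), the snippet below (with invented figures and names, not EcoWater's implementation) compares an eco-efficiency ratio of economic value added to freshwater use between a baseline and a technology scenario.

        def eco_efficiency(value_added, impact):
            """Eco-efficiency as economic value added per unit of environmental impact."""
            return value_added / impact

        baseline = {"value_added": 1.20e6, "freshwater_use_m3": 45_000}   # current system
        scenario = {"value_added": 1.25e6, "freshwater_use_m3": 32_000}   # with a water-reuse technology

        ee_base = eco_efficiency(baseline["value_added"], baseline["freshwater_use_m3"])
        ee_scen = eco_efficiency(scenario["value_added"], scenario["freshwater_use_m3"])
        print(f"baseline: {ee_base:.1f} EUR/m3, scenario: {ee_scen:.1f} EUR/m3, "
              f"improvement: {100 * (ee_scen / ee_base - 1):.0f}%")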

    Examining sources of land tenure (in)security. A focus on authority relations, state politics, social dynamics and belonging

    This article reviews the current state of the literature on the notion of security of tenure of land. This examination is topical, as tenure security has become a key objective of land policies and development interventions. While tenure security is widely defined by people's perceptions, land policies tend to address it through the registration and administration of land rights. The article argues that these practices ignore the complexity of the sources of tenure (in)security. Building on critical development literature from political ecology, social anthropology and political science, these sources are identified as stemming from the politics of land and linked to authority relations, state politics, social dynamics and belonging. The article concludes that considering them enables us to contextualise perceptions of tenure security and to conceive practices for securing tenure.

    Multiple dimensions of biodiversity drive human interest in tide pool communities

    Activities involving the observation of wild organisms (e.g. wildlife watching, tidepooling) can provide recreational and learning opportunities, and biologically diverse animal assemblages are expected to be more stimulating to humans. In turn, more diverse communities may enhance human interest and facilitate the provisioning of cultural services. However, no experimental tests of this biodiversity-interest hypothesis exist to date. We therefore investigated the effects of different dimensions of animal biodiversity (species richness, phyletic richness and functional diversity) on self-reported interest, using tide pools as a model system. We performed two experiments by manipulating: (1) the richness of lower (species) and higher (phyla) taxonomic levels in an image-based online survey, and (2) the richness of the higher taxonomic level (phyla) in live public exhibits. In both experiments we further quantified functional diversity, which varied freely, and within the online experiment we also included the hue diversity and colourfulness arising from the combination of organisms and the background scenes. Interest was increased by phyletic richness (both studies), animal species richness (online study) and functional diversity (online study). A structural equation model revealed that functional diversity and colourfulness (of the whole scene) also partially mediated the effects of phyletic richness on interest in the online study. In both studies, the presence of three of four phyla additively increased interest, supporting the importance of multiple, diverse phyla rather than a single particularly interesting phylum. These results provide novel experimental evidence that multiple dimensions of biodiversity enhance human interest and suggest that conservation initiatives that maintain or restore biodiversity will help stimulate interest in ecosystems, facilitating educational and recreational benefits.

    Methods for Comparative Model Selection and Parameter Estimation in Diverse Modeling Applications

    Predictive accuracy of a model is of key importance both in research and to a lay audience. Diverse modeling methods and parameter estimation methods exist, so a wide range of techniques is available when approaching a modeling task. Two questions therefore arise naturally: model selection and model parameter estimation. This dissertation is intended to advance the theory and practice of model selection and parameter estimation for the topics discussed here.
    * In Chapter 2, I develop A3, a novel method for assessing predictive accuracy and enabling direct comparisons between competing models in an accessible framework. The method uses resampling techniques to "wrap" predictive modeling methods and estimate a standard set of error metrics, both for the model as a whole and for each explanatory variable utilized by the model (a minimal sketch of the resampling idea appears after this list). Two case studies in the chapter illustrate the applied utility of the method and show how improved models may not only increase predictive accuracy but also alter inferences and conclusions about the effects of parameters in the model. An R package implementing the method is made available on CRAN.
    * In Chapter 3, I develop ICE, a novel, competitive method of home range estimation. Effectively an estimator of estimators, ICE pits existing home range estimators against each other, each of which may be best suited to a particular type of data. By selecting between different approaches, ICE can theoretically improve on the performance of any individual estimator across heterogeneous data sets.
    * In Chapter 4, I develop Contingent Kernel Density Estimation, an extension of kernel density estimation designed to account for the case where observations are measured with a specific form of error. The chapter develops the method and derives contingent kernels for commonly used kernels and sampling regimes. The method is applied to data collected from the social networking site Twitter to estimate the national distribution of a sample of Twitter users.
    * The study in Chapter 5 analyzes a large data set collected from Twitter. Based on data from over four million Twitter users, it estimates parameters of this population with a primary focus on the color preference choices made by these users. This "big data" approach yields novel results that might not have been identifiable with earlier, traditional approaches to sampling and surveying the behavior of individuals.
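
    A rough sketch of the resampling idea behind A3 in Chapter 2 follows; it is an illustration only, not the interface of the CRAN package, and the drop-one-variable attribution shown here is a simplification of the package's approach.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 3))                       # three explanatory variables
        y = 2 * X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=200)

        # Cross-validated accuracy of the model as a whole
        full_r2 = cross_val_score(LinearRegression(), X, y, cv=10, scoring="r2").mean()
        print(f"full model cross-validated R^2: {full_r2:.2f}")

        # Attribute accuracy to each variable by refitting without it
        for j in range(X.shape[1]):
            reduced = np.delete(X, j, axis=1)
            r2 = cross_val_score(LinearRegression(), reduced, y, cv=10, scoring="r2").mean()
            print(f"without x{j}: R^2 = {r2:.2f} (added accuracy of x{j}: {full_r2 - r2:+.2f})")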

    Insight Maker: A general-purpose tool for web-based modeling & simulation

    Get PDF
    A web-based, general-purpose simulation and modeling tool is presented in this paper. The tool, Insight Maker, has been designed to make modeling and simulation accessible to a wider audience of users. Insight Maker integrates three general modeling approaches – System Dynamics, Agent-Based Modeling, and imperative programming – in a unified modeling framework. The environment provides a graphical model construction interface that is implemented purely in client-side code running on users' machines. Advanced features, such as model scripting and an optimization tool, are also described. Insight Maker, under development for several years, has gained significant adoption, with currently more than 20,000 registered users. In addition to detailing the tool and its guiding philosophy, this first paper on Insight Maker describes lessons learned from the development of a complex web-based simulation and modeling tool.
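
    The following minimal stock-and-flow simulation illustrates the System Dynamics style of modeling that Insight Maker supports; it is an independent sketch with invented names and does not reflect Insight Maker's own engine or API.

        def simulate_population(initial=100.0, birth_rate=0.04, death_rate=0.02,
                                dt=0.25, years=50):
            """Integrate a single population stock with Euler steps."""
            population = initial                      # the stock
            history = [population]
            for _ in range(int(years / dt)):
                births = birth_rate * population      # inflow
                deaths = death_rate * population      # outflow
                population += (births - deaths) * dt  # Euler integration step
                history.append(population)
            return history

        trajectory = simulate_population()
        print(f"population after 50 years: {trajectory[-1]:.1f}")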