
    Fast Locality-Sensitive Hashing Frameworks for Approximate Near Neighbor Search

    The Indyk-Motwani Locality-Sensitive Hashing (LSH) framework (STOC 1998) is a general technique for constructing a data structure to answer approximate near neighbor queries by using a distribution H over locality-sensitive hash functions that partition space. For a collection of n points, after preprocessing, the query time is dominated by O(n^ρ log n) evaluations of hash functions from H and O(n^ρ) hash table lookups and distance computations, where ρ ∈ (0,1) is determined by the locality-sensitivity properties of H. It follows from a recent result by Dahlgaard et al. (FOCS 2017) that the number of locality-sensitive hash functions can be reduced to O(log² n), leaving the query time to be dominated by O(n^ρ) distance computations and O(n^ρ log n) additional word-RAM operations. We state this result as a general framework and provide a simpler analysis showing that the number of lookups and distance computations closely matches the Indyk-Motwani framework, making it a viable replacement in practice. Using ideas from another locality-sensitive hashing framework by Andoni and Indyk (SODA 2006) we are able to reduce the number of additional word-RAM operations to O(n^ρ).
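    The bucketing-and-verification pattern described above can be illustrated with a minimal sketch. This uses a generic sign-of-random-hyperplane (SimHash-style) family — an assumption for illustration only, not the Indyk-Motwani or Dahlgaard et al. constructions from the abstract; all function names and parameters (k bits per table, L tables) are hypothetical:

```python
import random
import math

def make_hash(k, dim, rng):
    """Draw k random hyperplanes; the sign pattern of a point against
    them forms one k-bit bucket key (a concatenated hash function)."""
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(k)]
    def h(p):
        return tuple(1 if sum(a * b for a, b in zip(plane, p)) >= 0 else 0
                     for plane in planes)
    return h

def build_index(points, k, L, dim, seed=0):
    """Preprocess: hash every point into L independent hash tables."""
    rng = random.Random(seed)
    hashes = [make_hash(k, dim, rng) for _ in range(L)]
    tables = [dict() for _ in range(L)]
    for i, p in enumerate(points):
        for h, table in zip(hashes, tables):
            table.setdefault(h(p), []).append(i)
    return hashes, tables

def query(q, points, hashes, tables, r):
    """Collect candidates from the L buckets that q falls into, then
    verify each with an exact distance computation."""
    candidates = set()
    for h, table in zip(hashes, tables):
        candidates.update(table.get(h(q), []))
    return [i for i in candidates if math.dist(points[i], q) <= r]
```

    Each query here costs k·L hash-function evaluations plus one distance computation per candidate; the frameworks in the abstract are precisely about driving those two counts down.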

    Computing tolerance parameters for fixturing and feeding

    Fixtures and feeders are important components of automated assembly systems: fixtures accurately hold parts and feeders move parts into alignment. These components can fail when part shape varies. Parametric tolerance classes specify how much variation is allowable. In this paper we consider fixturing convex polygonal parts using right-angle brackets and feeding polygonal parts on conveyor belts using sequences of vertical fences. For both cases, we define new tolerance classes and give algorithms for computing the parameter specifications such that the fixture or feeder will work for all parts in the tolerance class. For fixturing we give an O(1) algorithm to compute the dimensions of rectangular tolerance zones. For feeding we give an O(n²) algorithm to compute the radius of the largest allowable tolerance zone around each vertex. For each, we give an O(n) time algorithm for testing if an n-sided part is in the tolerance class

    Bringing Order to Special Cases of Klee's Measure Problem

    Klee's Measure Problem (KMP) asks for the volume of the union of n axis-aligned boxes in d-space. Omitting logarithmic factors, the best algorithm has runtime O*(n^{d/2}) [Overmars, Yap'91]. There are faster algorithms known for several special cases: Cube-KMP (where all boxes are cubes), Unitcube-KMP (where all boxes are cubes of equal side length), Hypervolume (where all boxes share a vertex), and k-Grounded (where the projection onto the first k dimensions is a Hypervolume instance). In this paper we bring some order to these special cases by providing reductions among them. In addition to the trivial inclusions, we establish Hypervolume as the easiest of these special cases, and show that the runtimes of Unitcube-KMP and Cube-KMP are polynomially related. More importantly, we show that any algorithm for one of the special cases with runtime T(n,d) implies an algorithm for the general case with runtime T(n,2d), yielding the first non-trivial relation between KMP and its special cases. This allows us to transfer W[1]-hardness of KMP to all special cases, proving that no n^{o(d)} algorithm exists for any of the special cases under reasonable complexity theoretic assumptions. Furthermore, assuming that there is no improved algorithm for the general case of KMP (no algorithm with runtime O(n^{d/2 - ε})), this reduction shows that there is no algorithm with runtime O(n^{⌊d/2⌋/2 - ε}) for any of the special cases. Under the same assumption we show a tight lower bound for a recent algorithm for 2-Grounded [Yildiz, Suri'12].

    Computing the Fréchet Distance with a Retractable Leash

    All known algorithms for the Fréchet distance between curves proceed in two steps: first, they construct an efficient oracle for the decision version; second, they use this oracle to find the optimum from a finite set of critical values. We present a novel approach that avoids the detour through the decision version. This gives the first quadratic time algorithm for the Fréchet distance between polygonal curves in R^d under polyhedral distance functions (e.g., L_1 and L_∞). We also get a (1+ε)-approximation of the Fréchet distance under the Euclidean metric, in quadratic time for any fixed ε > 0. For the exact Euclidean case, our framework currently yields an algorithm with running time (Formula presented.). However, we conjecture that it may eventually lead to a faster exact algorithm
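    For intuition about the quantity being computed, here is the classic Eiter-Mannila dynamic program for the *discrete* Fréchet distance — a standard textbook recurrence included only as background, explicitly not the retractable-leash algorithm of this paper (which handles continuous curves and avoids the decision oracle):

```python
import math
from functools import lru_cache

def discrete_frechet(P, Q):
    """Classic O(n*m) dynamic program for the discrete Fréchet distance.
    c(i, j) is the cheapest maximum 'leash length' needed to move two
    frogs to P[i] and Q[j], each hopping forward one vertex at a time."""
    @lru_cache(maxsize=None)
    def c(i, j):
        d = math.dist(P[i], Q[j])
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(c(0, j - 1), d)
        if j == 0:
            return max(c(i - 1, 0), d)
        # advance one frog, the other, or both; keep the best option
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
    return c(len(P) - 1, len(Q) - 1)
```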

    Testing for Network and Spatial Autocorrelation

    Testing for dependence has been a well-established component of spatial statistical analyses for decades. In particular, several popular test statistics have desirable properties for testing for the presence of spatial autocorrelation in continuous variables. In this paper we propose two contributions to the literature on tests for autocorrelation. First, we propose a new test for autocorrelation in categorical variables. While some methods currently exist for assessing spatial autocorrelation in categorical variables, the most popular method is unwieldy, somewhat ad hoc, and fails to provide grounds for a single omnibus test. Second, we discuss the importance of testing for autocorrelation in data sampled from the nodes of a network, motivated by social network applications. We demonstrate that our proposed statistic for categorical variables can be used in both the spatial and network settings
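    The classical approach the abstract alludes to is join counts: tally, for each pair of categories, the edges of a neighbourhood graph whose endpoints carry those categories. The sketch below shows only that classical tally (the paper proposes a different, omnibus statistic); the names are illustrative:

```python
def join_counts(labels, edges):
    """Join-count tallies on a neighbourhood (or social-network) graph:
    for each unordered pair of categories, count the edges whose two
    endpoints carry those categories. Under positive autocorrelation,
    same-category joins are inflated relative to chance."""
    counts = {}
    for i, j in edges:
        key = tuple(sorted((labels[i], labels[j])))
        counts[key] = counts.get(key, 0) + 1
    return counts
```

    Because the input is just a labelled graph, the same tally applies unchanged whether the edges come from spatial contiguity or from network ties — the point the abstract makes about its own statistic.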

    Developing serious games for cultural heritage: a state-of-the-art review

    Although the widespread use of gaming for leisure purposes has been well documented, the use of games to support cultural heritage purposes, such as historical teaching and learning, or for enhancing museum visits, has been less well considered. The state of the art in serious games technology is identical to the state of the art in entertainment games technology. As a result, the field of serious heritage games concerns itself with recent advances in computer games, real-time computer graphics, virtual and augmented reality, and artificial intelligence. On the other hand, the main strengths of serious gaming applications may be generalised as being in the areas of communication, visual expression of information, collaboration mechanisms, interactivity and entertainment. In this report, we focus on the state of the art with respect to the theories, methods and technologies used in serious heritage games. We provide an overview of existing literature of relevance to the domain, discuss the strengths and weaknesses of the described methods and point out unsolved problems and challenges. In addition, several case studies illustrating the application of methods and technologies used in cultural heritage are presented

    Exploring subtle land use and land cover changes: a framework for future landscape studies

    Land cover and land use changes can have a wide variety of ecological effects, including significant impacts on soils and water quality. In rural areas, even subtle changes in farming practices can affect landscape features and functions, and consequently the environment. Fine-scale analyses have to be performed to better understand the land cover change processes. At the same time, models of land cover change have to be developed in order to anticipate where changes are more likely to occur next. Such predictive information is essential to propose and implement sustainable and efficient environmental policies. Future landscape studies can provide a framework to forecast how land use and land cover are likely to react to subtle changes. This paper proposes a four-step framework to forecast landscape futures at fine scales by coupling scenarios and landscape modelling approaches. This methodology has been tested on two contrasting agricultural landscapes located in the United States and France, to identify possible landscape changes based on forecasting and backcasting agriculture intensification scenarios. Both examples demonstrate that relatively subtle land cover and land use changes can have a large impact on future landscapes. Results highlight how such subtle changes have to be considered in terms of quantity, location, and frequency of land use and land cover to appropriately assess environmental impacts on water pollution (France) and soil erosion (US). The results highlight opportunities for improvements in landscape modelling

    "The only vaccine that we really question is the new vaccine": A qualitative exploration of the social and behavioural drivers of human papillomavirus (HPV) vaccination in Tonga.

    INTRODUCTION: Human papillomavirus (HPV) vaccination is crucial for cervical cancer elimination, particularly in the Pacific where screening and treatment are limited. The HPV vaccine was introduced through schools in Tonga in November 2022 for adolescent girls. Despite high routine childhood vaccine coverage in Tonga, uptake of the HPV vaccine has been slow. This study explored the social and behavioural drivers of HPV and routine childhood vaccination in Tonga to inform tailored strategies to increase vaccine uptake. METHODS: We conducted qualitative interviews and focus groups in Nuku'alofa between June and October 2023 with parents (n = 32), adolescent girls (n = 24), teachers (n = 15), nurses (n = 7), and immunization staff (n = 5). Data were analysed thematically and mapped to the World Health Organization's Behavioural and Social Drivers of vaccination framework. RESULTS: Parents, teachers, and girls had limited knowledge of the HPV vaccine. Some feared it would encourage promiscuity or impact fertility. While trust in routine childhood vaccines was high, participants felt the COVID-19 pandemic had reduced confidence in new vaccines. Some vaccinated girls felt the HPV vaccine offered protection whereas others were afraid of side effects. Practical barriers included non-standardised consent forms that had to be returned to schools, the vaccine rollout timing, and school participation. CONCLUSION: Providing youth, parents and teachers with accurate, culturally appropriate information and supporting teachers to discuss vaccination and facilitate consent may improve HPV vaccine uptake in Tonga

    Parallel Write-Efficient Algorithms and Data Structures for Computational Geometry

    In this paper, we design parallel write-efficient geometric algorithms that perform asymptotically fewer writes than standard algorithms for the same problem. This is motivated by emerging non-volatile memory technologies with read performance being close to that of random access memory but writes being significantly more expensive in terms of energy and latency. We design algorithms for planar Delaunay triangulation, k-d trees, and static and dynamic augmented trees. Our algorithms are designed in the recently introduced Asymmetric Nested-Parallel Model, which captures the parallel setting in which there is a small symmetric memory where reads and writes are unit cost as well as a large asymmetric memory where writes are ω times more expensive than reads. In designing these algorithms, we introduce several techniques for obtaining write-efficiency, including DAG tracing, prefix doubling, reconstruction-based rebalancing and α-labeling, which we believe will be useful for designing other parallel write-efficient algorithms

    SeqAn: An efficient, generic C++ library for sequence analysis

    Background: The use of novel algorithmic techniques is pivotal to many important problems in life science. For example, the sequencing of the human genome [1] would not have been possible without advanced assembly algorithms. However, owing to the high speed of technological progress and the urgent need for bioinformatics tools, there is a widening gap between state-of-the-art algorithmic techniques and the actual algorithmic components of tools that are in widespread use. Results: To remedy this trend we propose the use of SeqAn, a library of efficient data types and algorithms for sequence analysis in computational biology. SeqAn comprises implementations of existing, practical state-of-the-art algorithmic components to provide a sound basis for algorithm testing and development. In this paper we describe the design and content of SeqAn and demonstrate its use with two examples. In the first example we show an application of SeqAn as an experimental platform by comparing different exact string matching algorithms. The second example is a simple version of the well-known MUMmer tool rewritten in SeqAn. Results indicate that our implementation is very efficient and versatile to use. Conclusion: We anticipate that SeqAn will greatly simplify the rapid development of new bioinformatics tools by providing a collection of readily usable, well-designed algorithmic components which are fundamental to the field of sequence analysis. This not only facilitates the implementation of new algorithms, but also enables sound analysis and comparison of existing ones.
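    The kind of exact string matching comparison described in the first example can be sketched without SeqAn's C++ API (which this sketch deliberately does not attempt to reproduce): a naive O(n·m) scan against Knuth-Morris-Pratt's O(n+m) search, in Python for illustration.

```python
def naive_find(text, pat):
    """Naive O(n*m) scan: try every alignment of pat against text."""
    n, m = len(text), len(pat)
    return [i for i in range(n - m + 1) if text[i:i + m] == pat]

def kmp_find(text, pat):
    """Knuth-Morris-Pratt O(n+m) search using the failure function."""
    # fail[i] = length of the longest proper border of pat[:i+1]
    fail = [0] * len(pat)
    k = 0
    for i in range(1, len(pat)):
        while k and pat[i] != pat[k]:
            k = fail[k - 1]
        if pat[i] == pat[k]:
            k += 1
        fail[i] = k
    hits, k = [], 0
    for i, ch in enumerate(text):
        while k and ch != pat[k]:
            k = fail[k - 1]
        if ch == pat[k]:
            k += 1
        if k == len(pat):           # full match ending at position i
            hits.append(i - k + 1)
            k = fail[k - 1]         # allow overlapping matches
    return hits
```

    Benchmarking several such interchangeable components against the same inputs, behind one generic interface, is exactly the experimental-platform role the abstract claims for SeqAn.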