
    Locked and Unlocked Polygonal Chains in 3D

    In this paper, we study movements of simple polygonal chains in 3D. We say that an open, simple polygonal chain can be straightened if it can be continuously reconfigured to a straight sequence of segments in such a manner that both the length of each link and the simplicity of the chain are maintained throughout the movement. The analogous concept for closed chains is convexification: reconfiguration to a planar convex polygon. Chains that cannot be straightened or convexified are called locked. While there are open chains in 3D that are locked, we show that if an open chain has a simple orthogonal projection onto some plane, it can be straightened. For closed chains, we show that there are unknotted but locked closed chains, and we provide an algorithm for convexifying a planar simple polygon in 3D with a polynomial number of moves. Comment: To appear in Proc. 10th ACM-SIAM Sympos. Discrete Algorithms, Jan. 199

    Fast Locality-Sensitive Hashing Frameworks for Approximate Near Neighbor Search

    The Indyk-Motwani Locality-Sensitive Hashing (LSH) framework (STOC 1998) is a general technique for constructing a data structure to answer approximate near neighbor queries by using a distribution $\mathcal{H}$ over locality-sensitive hash functions that partition space. For a collection of $n$ points, after preprocessing, the query time is dominated by $O(n^{\rho} \log n)$ evaluations of hash functions from $\mathcal{H}$ and $O(n^{\rho})$ hash table lookups and distance computations, where $\rho \in (0,1)$ is determined by the locality-sensitivity properties of $\mathcal{H}$. It follows from a recent result by Dahlgaard et al. (FOCS 2017) that the number of locality-sensitive hash functions can be reduced to $O(\log^2 n)$, leaving the query time to be dominated by $O(n^{\rho})$ distance computations and $O(n^{\rho} \log n)$ additional word-RAM operations. We state this result as a general framework and provide a simpler analysis showing that the number of lookups and distance computations closely matches the Indyk-Motwani framework, making it a viable replacement in practice. Using ideas from another locality-sensitive hashing framework by Andoni and Indyk (SODA 2006), we are able to reduce the number of additional word-RAM operations to $O(n^{\rho})$. Comment: 15 pages, 3 figure
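    The bucketing pipeline such frameworks build on can be sketched in a few lines. The toy index below uses random-hyperplane (SimHash) hash functions for angular distance; it illustrates the general hash-table idea only, not the Indyk-Motwani or Dahlgaard et al. constructions, and the table and bit counts are arbitrary illustrative choices.

```python
import math
import random

def random_hyperplane_hash(dim, num_bits, rng):
    """Return an LSH function h: R^dim -> {0,1}^num_bits built from random
    hyperplanes (SimHash for angular distance): nearby vectors tend to fall
    on the same side of each hyperplane and thus share hash bits."""
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(num_bits)]

    def h(v):
        return tuple(int(sum(p * x for p, x in zip(plane, v)) >= 0)
                     for plane in planes)
    return h

def build_tables(points, dim, num_tables, num_bits, seed=0):
    """Preprocess: hash every point into one bucket per table."""
    rng = random.Random(seed)
    tables = []
    for _ in range(num_tables):
        h = random_hyperplane_hash(dim, num_bits, rng)
        buckets = {}
        for i, p in enumerate(points):
            buckets.setdefault(h(p), []).append(i)
        tables.append((h, buckets))
    return tables

def query(tables, points, q):
    """Collect candidates colliding with q in any table, then return the
    index of the closest candidate by true distance (None if no collision)."""
    candidates = set()
    for h, buckets in tables:
        candidates.update(buckets.get(h(q), []))
    if not candidates:
        return None
    return min(candidates, key=lambda i: math.dist(points[i], q))
```

Increasing `num_tables` raises the collision probability for true near neighbors at the cost of memory and extra hash evaluations; reducing the number of distinct hash functions is exactly what the frameworks above optimize.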

    Locked and Unlocked Polygonal Chains in Three Dimensions

    This paper studies movements of polygonal chains in three dimensions whose links are not allowed to cross or change length. Our main result is an algorithmic proof that any simple closed chain that initially takes the form of a planar polygon can be made convex in three dimensions. Other results include an algorithm for straightening open chains having a simple orthogonal projection onto some plane, and an algorithm for making convex any open chain initially configured on the surface of a polytope. All our algorithms require only O(n) basic moves.

    Algorithms for Stable Matching and Clustering in a Grid

    We study a discrete version of a geometric stable marriage problem originally proposed in a continuous setting by Hoffman, Holroyd, and Peres, in which points in the plane are stably matched to cluster centers, as prioritized by their distances, so that each cluster center is apportioned a set of points of equal area. We show that, for a discretization of the problem to an $n \times n$ grid of pixels with $k$ centers, the problem can be solved in time $O(n^2 \log^5 n)$, and we experiment with two slower but more practical algorithms and a hybrid method that switches from one of these algorithms to the other to gain greater efficiency than either algorithm alone. We also show how to combine geometric stable matchings with a $k$-means clustering algorithm, so as to provide a geometric political-districting algorithm that views distance in economic terms, and we experiment with weighted versions of stable $k$-means in order to improve the connectivity of the resulting clusters. Comment: 23 pages, 12 figures. To appear (without the appendices) at the 18th International Workshop on Combinatorial Image Analysis, June 19-21, 2017, Plovdiv, Bulgari
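    The distance-priority matching described here can be illustrated with a small sketch: because pixels and centers rank each other by the same symmetric distance, processing all (pixel, center) pairs in increasing order of distance and assigning greedily yields a stable matching. This quadratic-time illustration is not the paper's $O(n^2 \log^5 n)$ algorithm, and the capacity parameter stands in for the "equal area" apportionment.

```python
import math

def stable_grid_matching(n, centers, capacity):
    """Match each pixel of an n x n grid to a center so that the result is
    stable under distance priorities: no pixel and center prefer each other
    to their assigned partners. With a common symmetric distance, greedily
    assigning (pixel, center) pairs in increasing distance order, while the
    center still has free capacity, produces such a matching."""
    pairs = sorted(
        (math.dist((x, y), c), (x, y), ci)
        for x in range(n) for y in range(n)
        for ci, c in enumerate(centers)
    )
    assignment = {}
    load = [0] * len(centers)
    for _, pixel, ci in pairs:
        if pixel not in assignment and load[ci] < capacity:
            assignment[pixel] = ci
            load[ci] += 1
    return assignment
```

With total capacity equal to the number of pixels, every pixel ends up assigned, and the resulting regions are the discrete analogue of the equal-area cells in the continuous problem.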

    Low-cost fluorescence microscope with microfluidic device fabrication for optofluidic applications

    Optofluidic devices have revolutionized the manipulation and transportation of fluid at length scales ranging from micrometers to millimeters. We describe a dedicated optical setup for studying laser-induced cavitation inside a microchannel. In a typical experiment, we use a tightly focused laser beam to locally evaporate a solution laced with a dye, resulting in the formation of a microbubble. The evolving bubble interface is tracked using high-speed microscopy and digital image analysis. Furthermore, we extend this system to analyze fluid flow with the fluorescence particle image velocimetry (PIV) technique with minimal adaptations. In addition, we demonstrate protocols for the in-house fabrication of a microchannel tailored to function as a sample holder in this optical setup. In essence, we present a complete guide for constructing a fluorescence microscope from scratch using standard optical components, with flexibility in the design and at a lower cost compared to its commercial analogues. Comment: N. Nagalingam and A. Raghunathan contributed equally to this wor

    Testing for Network and Spatial Autocorrelation

    Testing for dependence has been a well-established component of spatial statistical analyses for decades. In particular, several popular test statistics have desirable properties for testing for the presence of spatial autocorrelation in continuous variables. In this paper we propose two contributions to the literature on tests for autocorrelation. First, we propose a new test for autocorrelation in categorical variables. While some methods currently exist for assessing spatial autocorrelation in categorical variables, the most popular method is unwieldy, somewhat ad hoc, and fails to provide grounds for a single omnibus test. Second, we discuss the importance of testing for autocorrelation in data sampled from the nodes of a network, motivated by social network applications. We demonstrate that our proposed statistic for categorical variables can be used in both the spatial and network settings.
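    For concreteness, one classical statistic for categorical spatial autocorrelation is the join count: the number of adjacent pairs sharing a category, assessed against random relabelings. The sketch below is this standard permutation test, not the paper's proposed statistic; note that it works unchanged whether the edge list comes from grid adjacency or from a social network.

```python
import random

def same_category_joins(labels, edges):
    """Count edges whose two endpoints share a category
    (the classical 'join count' for categorical spatial data)."""
    return sum(1 for i, j in edges if labels[i] == labels[j])

def join_count_pvalue(labels, edges, n_perm=999, seed=0):
    """Permutation test: the fraction of random relabelings that produce at
    least as many same-category joins as the observed configuration.
    A small p-value indicates positive (clustered) autocorrelation."""
    rng = random.Random(seed)
    observed = same_category_joins(labels, edges)
    perm = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(perm)
        if same_category_joins(perm, edges) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)
```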

    Parallel Write-Efficient Algorithms and Data Structures for Computational Geometry

    In this paper, we design parallel write-efficient geometric algorithms that perform asymptotically fewer writes than standard algorithms for the same problem. This is motivated by emerging non-volatile memory technologies with read performance being close to that of random access memory but writes being significantly more expensive in terms of energy and latency. We design algorithms for planar Delaunay triangulation, $k$-d trees, and static and dynamic augmented trees. Our algorithms are designed in the recently introduced Asymmetric Nested-Parallel Model, which captures the parallel setting in which there is a small symmetric memory where reads and writes are unit cost, as well as a large asymmetric memory where writes are $\omega$ times more expensive than reads. In designing these algorithms, we introduce several techniques for obtaining write-efficiency, including DAG tracing, prefix doubling, reconstruction-based rebalancing, and $\alpha$-labeling, which we believe will be useful for designing other parallel write-efficient algorithms.

    Developing serious games for cultural heritage: a state-of-the-art review

    Although the widespread use of gaming for leisure purposes has been well documented, the use of games to support cultural heritage purposes, such as historical teaching and learning or enhancing museum visits, has been less well considered. The state of the art in serious game technology is identical to that in entertainment games technology. As a result, the field of serious heritage games concerns itself with recent advances in computer games, real-time computer graphics, virtual and augmented reality, and artificial intelligence. On the other hand, the main strengths of serious gaming applications may be generalised as lying in the areas of communication, visual expression of information, collaboration mechanisms, interactivity, and entertainment. In this report, we focus on the state of the art with respect to the theories, methods, and technologies used in serious heritage games. We provide an overview of existing literature of relevance to the domain, discuss the strengths and weaknesses of the described methods, and point out unsolved problems and challenges. In addition, several case studies illustrating the application of methods and technologies used in cultural heritage are presented.

    Biophysical suitability, economic pressure and land-cover change: a global probabilistic approach and insights for REDD+

    There has been a concerted effort by the international scientific community to understand the multiple causes and patterns of land-cover change to support sustainable land management. Here, we examined biophysical suitability and a novel integrated index of “Economic Pressure on Land” (EPL) to explain land cover in the year 2000, and estimated the likelihood of future land-cover change through 2050, including protected area effectiveness. Biophysical suitability and EPL explained almost half of the global pattern of land cover (R² = 0.45), increasing to almost two-thirds in areas where a long-term equilibrium is likely to have been reached (e.g. R² = 0.64 in Europe). We identify a high likelihood of future land-cover change in vast areas with relatively low current and past deforestation (e.g. the Congo Basin). Further, we simulated emissions arising from a “business as usual” scenario and two reducing emissions from deforestation and forest degradation (REDD) scenarios by incorporating data on biomass carbon. As our model incorporates all biome types, it highlights a crucial aspect of the ongoing REDD+ debate: if restricted to forests, “cross-biome leakage” would severely reduce REDD+ effectiveness for climate change mitigation. If forests were protected from deforestation yet without measures to tackle the drivers of land-cover change, REDD+ would reduce only 30% of total emissions from land-cover change: fifty-five percent of the emissions reductions from forests would be compensated by increased emissions in other biomes. These results suggest that, although REDD+ remains a very promising mitigation tool, implementation of complementary measures to reduce land demand is necessary to prevent this leakage.

    Compressed Indexes for String Searching in Labeled Graphs

    Storing and searching large labeled graphs is becoming a key issue in the design of space/time-efficient online platforms that index modern social networks or knowledge graphs. As far as we know, however, existing results are limited to compressed graph indexes that support basic access operations on the link structure of the input graph, such as: given a node u, return the adjacency list of u. This paper takes inspiration from Facebook's Unicorn platform and proposes compressed-indexing schemes for large graphs whose nodes are labeled with strings of variable length - i.e., node attributes such as a user's (nick-)name - that support sophisticated search operations involving both the linked structure of the graph and the string content of its nodes. An extensive experimental evaluation over real social networks shows the time and space efficiency of the proposed indexing schemes and their query processing algorithms.
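    As an illustration of the kind of query such an index answers, the toy sketch below combines a sorted label array (a stand-in for a compressed string dictionary) with plain adjacency lists to answer a Unicorn-style "neighbors of u whose name starts with p" query via binary search. The class name and layout are hypothetical and deliberately uncompressed; they are not the paper's scheme.

```python
import bisect

class LabeledGraphIndex:
    """Toy index over a node-labeled graph: labels live in one sorted array,
    so all labels sharing a prefix form a contiguous rank range found by
    binary search; each node stores its rank in that order."""

    def __init__(self, labels, adjacency):
        self.adjacency = adjacency  # node id -> list of neighbor node ids
        self.order = sorted(range(len(labels)), key=labels.__getitem__)
        self.sorted_labels = [labels[i] for i in self.order]
        self.rank = {node: r for r, node in enumerate(self.order)}

    def prefix_range(self, prefix):
        """Half-open rank interval [lo, hi) of labels starting with prefix."""
        lo = bisect.bisect_left(self.sorted_labels, prefix)
        hi = bisect.bisect_left(self.sorted_labels, prefix + "\uffff")
        return lo, hi

    def neighbors_with_prefix(self, u, prefix):
        """Neighbors of u whose label starts with prefix, by rank filtering."""
        lo, hi = self.prefix_range(prefix)
        return sorted(v for v in self.adjacency[u] if lo <= self.rank[v] < hi)
```

A compressed realization would replace the sorted array with a succinct string dictionary and the rank filter with rank/select operations, but the query decomposition (prefix range, then intersect with the adjacency list) is the same.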