
    Static Analysis of Partial Referential Integrity for Better Quality SQL Data

    Referential integrity ensures the consistency of data between database relations. The SQL standard proposes different semantics to deal with partial information under referential integrity. Simple semantics neglects tuples with nulls and enjoys built-in support in commercial database systems. Partial semantics does check tuples with nulls, but does not enjoy built-in support. We investigate this mismatch between the SQL standard and real database systems, gaining insight into the trade-off between the cleaner data obtained under partial semantics and the efficiency of checking simple semantics. The cost of referential integrity checking is evaluated for various dataset sizes, indexing structures, and degrees of cleanliness. While the cost of partial semantics exceeds that of simple semantics, their performance trends follow similar patterns under growing database sizes. Applying multiple index structures and exploiting appropriate validation mechanisms increase the efficiency of checking partial semantics.
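
    To make the distinction concrete, the following Python sketch contrasts how a referencing tuple containing nulls is judged under simple and partial semantics. It is a minimal illustration of the two semantics as described above, not the paper's checking algorithm; the sample data are made up.

        def satisfies_simple(fk_tuple, referenced):
            # Simple semantics: a referencing tuple with any null is vacuously
            # consistent; otherwise all its values must match one referenced tuple.
            if any(v is None for v in fk_tuple):
                return True
            return fk_tuple in referenced

        def satisfies_partial(fk_tuple, referenced):
            # Partial semantics: the non-null components must all agree with a
            # single referenced tuple, so tuples with nulls are still checked.
            if all(v is None for v in fk_tuple):
                return True
            return any(all(f is None or f == r for f, r in zip(fk_tuple, ref))
                       for ref in referenced)

        # (1, None) passes simple semantics regardless of the data, but partial
        # semantics accepts it only if some referenced tuple has 1 in column one.
        referenced = {(1, 'a'), (2, 'b')}
        assert satisfies_simple((1, None), referenced)
        assert satisfies_partial((1, None), referenced)
        assert not satisfies_partial((3, None), referenced)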

    UrbanFly: Uncertainty-Aware Planning for Navigation Amongst High-Rises with Monocular Visual-Inertial SLAM Maps

    We present UrbanFly: an uncertainty-aware real-time planning framework for quadrotor navigation in urban high-rise environments. A core aspect of UrbanFly is its ability to robustly plan directly on the sparse point clouds generated by a Monocular Visual-Inertial SLAM (VINS) backend. It achieves this by using the sparse point clouds to build an uncertainty-integrated cuboid representation of the environment through a data-driven monocular plane segmentation network. Our chosen world model provides faster distance queries than the more common voxel-grid representation, and UrbanFly leverages this capability in two different ways, yielding two trajectory optimizers. The first optimizer uses a gradient-free cross-entropy method to compute trajectories that minimize collision probability and smoothness cost. Our second optimizer is a simplified version of the first and uses a sequential convex programming optimizer initialized from probabilistic safety estimates on a set of randomly drawn trajectories. Both trajectory optimizers are made computationally tractable and independent of the nature of the underlying uncertainty by embedding the distribution of collision violations in a Reproducing Kernel Hilbert Space. Empowered by these algorithmic innovations, UrbanFly outperforms competing baselines in metrics such as collision rate and trajectory length on a high-fidelity AirSim simulator augmented with synthetic and real-world dataset scenes.
    Comment: Submitted to IROS 2022. Code available at https://github.com/sudarshan-s-harithas/UrbanFl
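
    As a rough illustration of the gradient-free optimizer described above, the Python sketch below applies a generic cross-entropy method to a flattened waypoint vector. The cost function combining collision probability and smoothness is left abstract, and all names and parameter values are hypothetical rather than taken from UrbanFly.

        import numpy as np

        def cem_optimize(cost_fn, dim, iters=30, pop=200, elite_frac=0.1, seed=0):
            # Cross-entropy method: sample candidate trajectories, keep the
            # lowest-cost elites, and refit the sampling distribution to them.
            rng = np.random.default_rng(seed)
            mean, std = np.zeros(dim), np.ones(dim)
            n_elite = max(1, int(pop * elite_frac))
            for _ in range(iters):
                samples = rng.normal(mean, std, size=(pop, dim))
                costs = np.apply_along_axis(cost_fn, 1, samples)
                elites = samples[np.argsort(costs)[:n_elite]]
                mean, std = elites.mean(axis=0), elites.std(axis=0) + 1e-6
            return mean

        # Toy cost: deviation of each 2-D waypoint from a straight line plus a
        # smoothness term penalizing large steps between consecutive waypoints.
        def toy_cost(traj):
            pts = traj.reshape(-1, 2)
            return np.sum(pts[:, 1] ** 2) + np.sum(np.diff(pts, axis=0) ** 2)

        best_waypoints = cem_optimize(toy_cost, dim=10)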

    Query Rewriting and Optimization for Ontological Databases

    Ontological queries are evaluated against a knowledge base consisting of an extensional database and an ontology (i.e., a set of logical assertions and constraints which derive new intensional knowledge from the extensional database), rather than directly on the extensional database. The evaluation and optimization of such queries is an intriguing new problem for database research. In this paper, we discuss two important aspects of this problem: query rewriting and query optimization. Query rewriting consists of the compilation of an ontological query into an equivalent first-order query against the underlying extensional database. We present a novel query rewriting algorithm for rather general types of ontological constraints which is well suited for practical implementations. In particular, we show how a conjunctive query against a knowledge base expressed using linear and sticky existential rules, that is, members of the recently introduced Datalog+/- family of ontology languages, can be compiled into a union of conjunctive queries (UCQ) against the underlying database. Ontological query optimization, in this context, attempts to improve this rewriting process so as to produce small and cost-effective UCQ rewritings for an input query.
    Comment: arXiv admin note: text overlap with arXiv:1312.5914 by other authors
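
    The Python sketch below illustrates the flavor of such a rewriting for a deliberately tiny fragment: rules with a single body atom, one variable, and no existential variables, which is far more restricted than the linear and sticky Datalog+/- rules handled in the paper. The ontology and predicate names are invented for the example.

        # Hypothetical ontology: person(X) is implied by employee(X) or student(X).
        RULES = {"person": ["employee", "student"]}

        def rewrite_ucq(query):
            # Expand a conjunctive query (a list of (predicate, variable) atoms)
            # into a union of conjunctive queries by repeatedly resolving atoms
            # against rule heads until no new conjunctive query is produced.
            ucq, frontier = [list(query)], [list(query)]
            while frontier:
                cq = frontier.pop()
                for i, (pred, var) in enumerate(cq):
                    for body_pred in RULES.get(pred, []):
                        new_cq = cq[:i] + [(body_pred, var)] + cq[i + 1:]
                        if new_cq not in ucq:
                            ucq.append(new_cq)
                            frontier.append(new_cq)
            return ucq

        # "person(X), worksAt(X)" rewrites into three conjunctive queries that can
        # be evaluated directly over the extensional database.
        print(rewrite_ucq([("person", "X"), ("worksAt", "X")]))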

    VRCC-3D+: Qualitative spatial and temporal reasoning in 3 dimensions

    Qualitative Spatial Reasoning (QSR) has varying applications in Geographic Information Systems (GIS), visual programming language semantics, and digital image analysis. Systems for spatial reasoning over a set of objects have evolved in both expressive power and complexity, but implementations or usages of these systems are not common. This is partially due to the computational complexity of the operations required by the reasoner to make informed decisions about its surroundings. These theoretical systems are designed to focus on certain criteria, including efficiency of computation, ease of human comprehension, and expressive power. Sadly, the implementation of these systems is frequently left as an exercise for the reader. Herein, a new QSR system, VRCC-3D+, is proposed that strives to maximize expressive power while minimizing the complexity of reasoning and the computational cost of using the system. This system is an evolution of RCC-3D; the system and implementation are continually refined to handle the complexities of the reasoning being performed, and these refinements contribute to the accuracy, correctness, and speed of the implementation. To improve accuracy and correctness, a way to dynamically change the error tolerance in the system is designed so that it more accurately reflects what the user sees. A method that improves the speed of determining spatial relationships between objects by using composition tables and decision trees is introduced, and improvements to the system itself are recommended: streamlining the relation set and enforcing strict rules for the precision of the predicates that determine the relationships between objects. A potential use case and a prototype implementation are introduced to further motivate the need for implementations of QSR systems and to show that their use is not precluded by computational complexity.
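
    To suggest how a composition table can prune expensive geometric checks, here is a minimal Python sketch over the RCC-8 relation names. Only a handful of well-known table entries are filled in (missing entries default to "no pruning"), and the structure is illustrative rather than the VRCC-3D+ implementation.

        RCC8 = {"DC", "EC", "PO", "TPP", "NTPP", "TPPi", "NTPPi", "EQ"}

        # Partial composition table: given R1(A, B) and R2(B, C), which relations
        # can hold between A and C? Composing with EQ changes nothing, and nesting
        # two non-tangential proper parts stays a non-tangential proper part.
        COMPOSE = {("NTPP", "NTPP"): {"NTPP"}}
        COMPOSE.update({("EQ", r): {r} for r in RCC8})
        COMPOSE.update({(r, "EQ"): {r} for r in RCC8})

        def candidate_relations(r_ab, r_bc):
            # Unknown table entries impose no constraint, so every relation stays
            # a candidate and must still be tested with geometric predicates.
            return COMPOSE.get((r_ab, r_bc), set(RCC8))

        # If A is NTPP of B and B is NTPP of C, no geometric predicate needs to be
        # evaluated: the table already pins down the relation between A and C.
        assert candidate_relations("NTPP", "NTPP") == {"NTPP"}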

    Geo-indistinguishability: A Principled Approach to Location Privacy

    In this paper we report on our ongoing project aimed at protecting the privacy of the user when dealing with location-based services. The starting point of our approach is the principle of geo-indistinguishability, a formal notion of privacy that protects the user's exact location, while allowing approximate information – typically needed to obtain a certain desired service – to be released. We then present two mechanisms for achieving geo-indistinguishability: one generic, which sanitizes locations in any setting with reasonable utility, and one custom-built for a limited set of locations but providing optimal utility. Finally, we extend our mechanisms to the case of location traces, where the user releases their location repeatedly throughout the day, and we provide a method to limit the degradation of the privacy guarantees due to the correlation between the points. All the mechanisms were tested on real datasets and compared both among themselves and with respect to the state of the art in the field.
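
    A standard way to realize a generic mechanism of this kind is to perturb the true location with planar Laplace noise. The Python sketch below follows the usual polar sampling recipe (uniform angle, radius via the Lambert W inverse CDF); it is a minimal illustration assuming planar coordinates (e.g., meters in a local projection), not the authors' released code.

        import numpy as np
        from scipy.special import lambertw

        def planar_laplace(location, epsilon, rng=None):
            # Sample a noisy location whose density decays exponentially (at rate
            # epsilon) with the distance from the true location.
            rng = np.random.default_rng() if rng is None else rng
            theta = rng.uniform(0.0, 2.0 * np.pi)          # direction: uniform angle
            p = rng.uniform(0.0, 1.0)                      # radius via inverse CDF
            r = -(1.0 / epsilon) * (lambertw((p - 1.0) / np.e, k=-1).real + 1.0)
            x, y = location
            return x + r * np.cos(theta), y + r * np.sin(theta)

        # Report a sanitized point instead of the exact coordinates; a smaller
        # epsilon means more noise and stronger geo-indistinguishability.
        noisy = planar_laplace((1200.0, 340.0), epsilon=0.01)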