
    Distributed key management in dynamic outsourced databases: A trie-based approach

    Abstract: The decision to outsource databases is strategic in many organizations due to the increasing cost of managing large volumes of information internally. The sensitive nature of this information raises the need for powerful mechanisms to protect it against unauthorized disclosure. Centralized encryption for access control at the data owner level has been proposed as one way of handling this issue; however, its prohibitive cost renders it impractical and inflexible. A distributed cryptographic approach has been suggested as a promising alternative, in which keys are distributed to users on the basis of their assigned privileges. In this case, however, key management becomes problematic in the face of frequent database updates and remains an open issue. In this paper, we present a novel approach based on Binary Tries. By exploiting the intrinsic properties of these data structures, the complexity of key management, and thus its cost, is significantly reduced, and changes to the Binary Trie structure remain limited in the face of frequent updates. Preliminary experimental analysis demonstrates the validity and effectiveness of our approach.
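
    The abstract does not spell out the key-derivation scheme, so the following is only a minimal sketch of the general binary-trie idea it alludes to: child keys are derived from a parent key with a one-way hash, so a user who holds the key of a prefix node can locally derive every key below it and only the prefix keys matching that user's privileges need to be distributed. The hash choice, path labels and root handling below are assumptions for illustration, not the paper's construction.

        import hashlib

        def derive_key(parent_key: bytes, bit: str) -> bytes:
            """Derive a child key from a parent key with a one-way hash.
            The bit ('0' or '1') selects the left or right child in the trie."""
            return hashlib.sha256(parent_key + bit.encode()).digest()

        def key_for(root_key: bytes, path: str) -> bytes:
            """Walk a binary-trie path (e.g. '0110') from the root and return
            the key of the node reached."""
            key = root_key
            for bit in path:
                key = derive_key(key, bit)
            return key

        # Example: the data owner keeps only the root key; a user granted the
        # subtree under '01' receives key_for(root, '01') and can locally derive
        # the keys '010', '011', ... without further interaction with the owner.
        root = b"\x00" * 32                                     # hypothetical root secret
        user_key = key_for(root, "01")
        record_key = derive_key(derive_key(user_key, "0"), "1")  # key for path '0101'
        assert record_key == key_for(root, "0101")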

    LinkEHR-Ed: A multi-reference model archetype editor based on formal semantics

    Purpose: To develop a powerful archetype editing framework capable of handling multiple reference models and oriented towards the semantic description and standardization of legacy data. Methods: The main prerequisite for implementing tools providing enhanced support for archetypes is a clear specification of archetype semantics. We propose a formalization of the definition section of archetypes based on types over tree-structured data. It covers the specialization of archetypes, the relationship between reference models and archetypes, and the conformance of data instances to archetypes. Results: LinkEHR-Ed, a visual archetype editor based on this formalization, is developed. It offers advanced processing capabilities and supports multiple reference models, the editing and semantic validation of archetypes, the specification of mappings to data sources, and the automatic generation of data transformation scripts. Conclusions: LinkEHR-Ed is a useful tool for building, processing and validating archetypes based on any reference model.
    This work was supported in part by the Spanish Ministry of Education and Science under grant TSI2007-66575-C02, by the Generalitat Valenciana under grant APOSTD/2007/055, and by the PAID-06-07 program of the Universidad Politecnica de Valencia. Maldonado Segura, JA.; Moner Cano, D.; Boscá Tomás, D.; Fernandez Breis, JT.; Angulo Fernández, C.; Robles Viejo, M. (2009). LinkEHR-Ed: A multi-reference model archetype editor based on formal semantics. International Journal of Medical Informatics, 78(8):559-570. https://doi.org/10.1016/j.ijmedinf.2009.03.006
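
    As an illustration of what "conformance of data instances to archetypes" over tree-structured data can mean, the toy sketch below checks a nested instance against a constraint tree. The Constraint class, the conforms function and the blood-pressure fragment are invented for this example and are far simpler than LinkEHR-Ed's actual formalization or ADL archetypes.

        # Toy conformance check over tree-structured data: names must match,
        # leaf values must fall inside the declared range, and every child
        # must conform to some child constraint.  Not LinkEHR-Ed's formalism.
        from dataclasses import dataclass, field
        from typing import Optional

        @dataclass
        class Constraint:
            name: str
            children: list["Constraint"] = field(default_factory=list)
            value_range: Optional[tuple[float, float]] = None   # inclusive bounds

        def conforms(instance: dict, c: Constraint) -> bool:
            if instance.get("name") != c.name:
                return False
            if c.value_range is not None:
                lo, hi = c.value_range
                v = instance.get("value")
                if v is None or not (lo <= v <= hi):
                    return False
            return all(
                any(conforms(child, cc) for cc in c.children)
                for child in instance.get("children", [])
            )

        # Hypothetical blood-pressure archetype fragment and a matching instance.
        bp = Constraint("blood_pressure", [
            Constraint("systolic", value_range=(0, 1000)),
            Constraint("diastolic", value_range=(0, 1000)),
        ])
        obs = {"name": "blood_pressure", "children": [
            {"name": "systolic", "value": 120},
            {"name": "diastolic", "value": 80},
        ]}
        print(conforms(obs, bp))   # True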

    P2P, ad hoc and sensor networks – All the different or all the same?

    Currently, data management technologies are finding their way into evolving networks, i.e. P2P, ad hoc and wireless sensor networks. We examine the properties, differences and commonalities of these types of evolving networks in order to enable the development of technologies suited to their characteristics. We start by presenting definitions of the different network types, before arranging them in a network hierarchy to gain a clear view of the area. We then analyze and compare example applications for each type along different design dimensions. Based on this work, we finally present a comparison of P2P, ad hoc and wireless sensor networks.

    Multidimensional access methods


    Reducing Redundant Data Transmissions in Wireless Ad Hoc Networks: Comparing Aggregation and Filtering

    Abstract—Efficient bandwidth usage is vital for real-time ad hoc networking applications such as vehicular safety, yet such applications can produce large amounts of identical data. Pruning redundant data transmissions makes it possible to deliver richer data to more users at shorter intervals. Reducing redundancy has been studied extensively for stable network topologies, but those solutions do not extend directly to dynamic topologies, where information about the network state becomes obsolete quickly. We compare two novel combinations of the adaptive controlled flooding routing protocol SBSD with implementations of response aggregation and query filtering for mobile environments, and we test these combinations in simulated vehicular networks. We show that, even in cases where response aggregation only slightly improves network performance, query filtering can improve delivery by up to 30% and response time by 75%.
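
    The abstract does not detail how query filtering decides to suppress a transmission, so the sketch below only illustrates the general suppression idea with an invented QueryFilter class: a node remembers responses it has overheard and stays silent when its own response would merely duplicate fresh information. The TTL, identifiers and API are assumptions, not SBSD or the paper's actual scheme.

        import time

        class QueryFilter:
            """Generic suppression filter for flooded queries: a node remembers
            which (query_id, answer) pairs it has already overheard and skips
            retransmitting a response that would only duplicate them."""

            def __init__(self, ttl_s: float = 5.0):
                self.ttl_s = ttl_s                              # how long overheard data stays fresh
                self.seen: dict[tuple[str, str], float] = {}    # (query_id, answer) -> timestamp

            def overheard(self, query_id: str, answer: str) -> None:
                """Record a response overheard on the shared wireless channel."""
                self.seen[(query_id, answer)] = time.monotonic()

            def should_send(self, query_id: str, answer: str) -> bool:
                """Send only if no fresh, identical response has been overheard."""
                ts = self.seen.get((query_id, answer))
                return ts is None or (time.monotonic() - ts) > self.ttl_s

        # Example: two vehicles hold the same "road icy at km 42" observation;
        # the second one overhears the first transmission and stays silent.
        f = QueryFilter()
        f.overheard("q17", "icy@km42")
        print(f.should_send("q17", "icy@km42"))    # False -> redundant transmission avoided
        print(f.should_send("q17", "clear@km43"))  # True  -> new information, send it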