100 research outputs found

    Punishing apostasy: the case of Islam and Shari'a law re-considered

    EThOS - Electronic Theses Online Service, United Kingdom

    Hash-ssessing the freshness of SPARQL pipelines

    The recent increase in RDF usage has been accompanied by a rising need for "verification" of data obtained from SPARQL endpoints. It is now possible to deploy Semantic Web pipelines and to adapt them to a wide range of needs and use cases. In practice, these complex ETL pipelines, which rely on SPARQL endpoints to extract relevant information, often have to be relaunched from scratch periodically in order to refresh their data. This practice adds load to the network and is resource-intensive, yet it is sometimes unnecessary if the data remains untouched. In this article, we present a method to help data consumers (and pipeline designers) identify when data has been updated in a way that impacts the pipeline's result set. The method is based on standard SPARQL 1.1 features and relies on digitally signing parts of query result sets to inform data consumers of any subsequent changes.
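
    A minimal sketch of the general idea, not the authors' implementation: SPARQL 1.1 already provides the building blocks (SHA256, GROUP_CONCAT, subqueries) to let an endpoint fingerprint its own result set, so a consumer only has to compare digests between runs. The endpoint URL and the inner SELECT below are placeholders.

    # A minimal sketch, assuming a SPARQLWrapper-accessible endpoint; the URL
    # and the inner SELECT are placeholders, not the paper's actual pipeline.
    from SPARQLWrapper import SPARQLWrapper, JSON  # pip install sparqlwrapper

    ENDPOINT = "https://example.org/sparql"  # hypothetical endpoint

    # Ask the endpoint to hash its own answers using SPARQL 1.1 built-ins.
    # ORDER BY in the subquery aims at a deterministic concatenation; the spec
    # does not strictly guarantee GROUP_CONCAT ordering, so a robust variant
    # would sort rows client-side before hashing.
    FINGERPRINT_QUERY = """
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT (SHA256(GROUP_CONCAT(?row; separator="|")) AS ?digest)
    WHERE {
      { SELECT (CONCAT(STR(?s), " ", STR(?label)) AS ?row)
        WHERE { ?s rdfs:label ?label }   # stand-in for the pipeline query
        ORDER BY ?s ?label }
    }
    """

    def result_digest(endpoint: str, query: str) -> str:
        """Fetch the endpoint-computed digest of the wrapped query's results."""
        client = SPARQLWrapper(endpoint)
        client.setQuery(query)
        client.setReturnFormat(JSON)
        rows = client.query().convert()["results"]["bindings"]
        return rows[0]["digest"]["value"]

    # Relaunch the ETL pipeline only when the digest differs from the stored one.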

    De-icing federated SPARQL pipelines: a method for assessing the "freshness" of result sets

    In recent years, the ever-increasing number of available linked-data endpoints has allowed the creation of complex data pipelines that leverage these massive amounts of information. One crucial challenge for federated pipeline designers is knowing when to query the various sources they use in order to obtain fresher final results. In other words, they want to know when a data update on a specific source impacts their own final results. Unfortunately, the SPARQL standard does not provide a method for becoming aware of such updates, and pipelines are therefore regularly relaunched from scratch, often needlessly. To help designers decide when to fetch fresher results, we propose a constructive method. In practice, it relies on digitally signing result sets from federated endpoints in order to create a specific query able to warn when, and explain why, the pipeline result set is outdated. In addition, as our solution is based exclusively on SPARQL 1.1 built-in functions, it is fully compatible with all SPARQL 1.1-compliant endpoints.
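
    The federated case can be sketched the same way: keep one digest per source, so the check can both warn that the pipeline is stale and explain which endpoint changed. This is a hedged illustration, not the paper's exact construction; all endpoint URLs, query strings and names below are invented.

    # A hedged sketch of the federated case: one digest per source, so a
    # stale result can be traced back to the source that moved.
    from SPARQLWrapper import SPARQLWrapper, JSON

    def source_digest(endpoint: str, fingerprint_query: str) -> str:
        """Ask one federated source to hash its own contribution (SPARQL 1.1 SHA256)."""
        client = SPARQLWrapper(endpoint)
        client.setQuery(fingerprint_query)
        client.setReturnFormat(JSON)
        rows = client.query().convert()["results"]["bindings"]
        return rows[0]["digest"]["value"]

    def stale_sources(sources: dict[str, str], stored: dict[str, str]) -> list[str]:
        """Return the endpoints whose signed result sets changed since the last run."""
        return [ep for ep, query in sources.items()
                if source_digest(ep, query) != stored.get(ep)]

    # changed = stale_sources({"https://a.example/sparql": QUERY_A,
    #                          "https://b.example/sparql": QUERY_B}, last_digests)
    # An empty list means every source is untouched and the relaunch can be skipped.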

    Can non-viral technologies knockdown the barriers to siRNA delivery and achieve the next generation of cancer therapeutics?

    Cancer is one of the most widespread diseases of modern times, with the number of patients diagnosed worldwide estimated to rise from 11.3 million in 2007 to 15.5 million in 2030 (www.who.int). In many cases, due to delayed diagnosis and high relapse rates, survival rates are low. Current therapies, including surgery, radiation and chemotherapy, have made significant progress, but they have many limitations and are far from ideal. Although immunotherapy has recently offered great promise as a new approach in cancer treatment, it is still very much in its infancy, and more information on this approach is required before it can be widely applied. For these reasons, effective, safe and patient-acceptable cancer therapy remains largely an unmet clinical need. Recent knowledge of the genetic basis of the disease opens up the potential for cancer gene therapeutics based on siRNA. However, the future of such gene-based therapeutics depends on achieving successful delivery. Extensive research is ongoing into the design and assessment of non-viral delivery technologies for siRNA to treat a wide range of cancers. Preliminary results from the first human Phase I trial for solid tumours, using a targeted non-viral vector, illustrate the enormous therapeutic benefits possible once the issue of delivery is resolved. In this review, the genes regulating cancer will be discussed and potential therapeutic targets will be identified. The physiological and biochemical changes caused by tumours, and the potential to exploit this knowledge to produce bio-responsive 'smart' delivery systems, will be evaluated. This review will also provide a critical and comprehensive overview of the different non-viral formulation strategies under investigation for siRNA delivery, with particular emphasis on those designed to exploit the physiological environment of the disease site. In addition, a section of the review is dedicated to pre-clinical animal models used to evaluate the stability, safety and efficacy of the delivery systems.

    The MOUSE approach: Mapping Ontologies using UML for System Engineers

    To address the problem of semantic heterogeneity, a large body of research has been directed toward the study of semantic mapping technologies. Although various semantic mapping technologies have been investigated, it is still not easy for domain experts to perform a semantic data integration task, because doing so requires not only domain expertise but also a good understanding of knowledge engineering. This paper proposes an approach that automatically transforms an abstract semantic mapping syntax into a concrete executable mapping syntax; we call this approach MOUSE (Mapping Ontologies using UML for System Engineers). In order to evaluate MOUSE, an implementation of the approach for a semantic data integration use case, called SDI (Semantic Data Integration), has been developed. The aim is to enable domain experts, particularly system engineers, to undertake mappings using a technology they are familiar with (UML), while ensuring that the created mappings are accurate and the approach is easy to use. The proposed UML-based abstract mapping syntax was evaluated through usability experiments conducted in a lab environment using the SDI tool, with participants whose skills are equivalent to those of real-life system engineers. Results from the evaluations show that participants could correctly undertake the semantic data integration task using the MOUSE approach while maintaining accuracy and usability (in terms of ease of use).
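
    The paper's own UML syntax is not reproduced here, but the lowering step it describes can be illustrated with a toy sketch: an abstract class-to-class, property-to-property mapping compiled into an executable SPARQL CONSTRUCT query. The Mapping dataclass and all URIs below are invented for illustration.

    # A toy, hypothetical lowering step in the spirit of MOUSE: compile an
    # abstract class/property mapping into an executable SPARQL CONSTRUCT.
    from dataclasses import dataclass

    @dataclass
    class Mapping:
        source_class: str
        source_prop: str
        target_class: str
        target_prop: str

    def to_construct(m: Mapping) -> str:
        """Lower one abstract mapping into a concrete, executable query."""
        return (
            f"CONSTRUCT {{ ?x a <{m.target_class}> ; <{m.target_prop}> ?v . }}\n"
            f"WHERE     {{ ?x a <{m.source_class}> ; <{m.source_prop}> ?v . }}"
        )

    print(to_construct(Mapping(
        "http://src.example/Person", "http://src.example/name",
        "http://tgt.example/Agent", "http://tgt.example/label",
    )))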

    Ontology Based Policy Mobility for Pervasive Computing

    The array of devices, networks and resources available in pervasive computing environments, or smart spaces, will require effective self-management systems controlled via user-level policies. The local nature of smart spaces, however, means that they present a potentially huge increase in the number and nature of management domains, e.g. individual homes, shops, businesses, schools and hospitals. Moreover, differences in local domain models and local resource models mean that policies relevant to one smart space will often use different semantics for subject and target objects than other pervasive computing domains. To allow users to capture personal preferences as policies that can be applied consistently as they roam between smart spaces, the semantic interoperability problem arising from these differing models for policy subjects and targets must be overcome. In this paper we present a framework in which the use of ontology-based semantics for policy elements allows dynamic ontology mapping capabilities to support policy mobility. We demonstrate its operation with a case study showing policy mobility in a policy-driven smart space management system.
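
    As a toy illustration of what ontology mapping buys policy mobility (this is not the paper's framework): once an ontology-matching step yields a concept map between the user's home vocabulary and the visited smart space's vocabulary, porting a policy reduces to term substitution. All names below are hypothetical.

    # A toy sketch of the core idea: rewrite a user policy's ontology terms
    # for the local domain model, given a map from ontology matching.
    HOME_TO_OFFICE = {
        "home:Occupant": "office:Employee",
        "home:Lamp": "office:DeskLight",
    }

    def port_policy(policy: dict, concept_map: dict) -> dict:
        """Substitute mapped ontology terms; leave unmapped values untouched."""
        return {field: concept_map.get(term, term) for field, term in policy.items()}

    policy = {"subject": "home:Occupant", "target": "home:Lamp", "action": "dim"}
    print(port_policy(policy, HOME_TO_OFFICE))
    # -> {'subject': 'office:Employee', 'target': 'office:DeskLight', 'action': 'dim'}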