
    NorthStar, a support tool for the design and evaluation of quality improvement interventions in healthcare

    Background: The Research-Based Education and Quality Improvement (ReBEQI) European partnership aims to establish a framework and provide practical tools for the selection, implementation, and evaluation of quality improvement (QI) interventions. We describe the development and preliminary evaluation of the software tool NorthStar, a major product of the ReBEQI project. Methods: We focused the content of NorthStar on the design and evaluation of QI interventions. A lead individual from the ReBEQI group drafted each section, and at least two other group members reviewed it. The content is based on published literature, as well as material developed by the ReBEQI group. We developed the software in both a Microsoft Windows HTML help system version and a web-based version. In a preliminary evaluation, we surveyed 33 potential users about the acceptability and perceived utility of NorthStar. Results: NorthStar consists of 18 sections covering the design and evaluation of QI interventions. The major focus of the intervention design sections is on how to identify determinants of practice (factors affecting practice patterns), while the major focus of the intervention evaluation sections is on how to design a cluster randomised trial. The two versions of the software can be transferred by email or CD, and are available for download from the internet. The software offers easy navigation and various functions to access the content. Potential users (55% response rate) reported above-moderate levels of confidence in carrying out QI research-related tasks if using NorthStar, particularly when developing a protocol for a cluster randomised trial. Conclusion: NorthStar is an integrated, accessible, practical, and acceptable tool to assist developers and evaluators of QI interventions.
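
    Since a major focus of the evaluation sections is the design of cluster randomised trials, it is worth recalling the standard sample-size adjustment such designs require: the design effect 1 + (m - 1) x ICC inflates an individually randomised sample size to account for within-cluster correlation. The sketch below is a minimal Python illustration of that formula; the numbers are hypothetical and the code is not taken from NorthStar itself.

    ```python
    import math

    def design_effect(cluster_size: float, icc: float) -> float:
        """Design effect (variance inflation factor) for a cluster randomised trial."""
        return 1 + (cluster_size - 1) * icc

    def clustered_sample_size(n_individual: int, cluster_size: float, icc: float) -> int:
        """Inflate an individually randomised sample size to account for clustering."""
        return math.ceil(n_individual * design_effect(cluster_size, icc))

    # Hypothetical example: 400 participants suffice under individual randomisation;
    # with 20 participants per cluster and an intracluster correlation (ICC) of 0.05,
    # the design effect is 1 + 19 * 0.05 = 1.95.
    n = clustered_sample_size(400, 20, 0.05)
    print(n)  # 780 participants, i.e. 39 clusters of 20
    ```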

    On the Evaluation of RDF Distribution Algorithms Implemented over Apache Spark

    Querying very large RDF data sets in an efficient manner requires a sophisticated distribution strategy. Several innovative solutions have recently been proposed for optimizing data distribution with predefined query workloads. This paper presents an in-depth analysis and experimental comparison of five representative and complementary distribution approaches. To achieve fair experimental results, we use Apache Spark as a common parallel computing framework, rewriting the concerned algorithms using the Spark API. Spark provides guarantees in terms of fault tolerance, high availability and scalability which are essential in such systems. Our different implementations aim to highlight the fundamental implementation-independent characteristics of each approach in terms of data preparation, load balancing, data replication, and, to some extent, query answering cost and performance. The presented measures are obtained by testing each system on one synthetic and one real-world data set over query workloads with differing characteristics and different partitioning constraints.
    Comment: 16 pages, 3 figures
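
    As a concrete illustration of what rewriting a distribution strategy against the Spark API involves, the sketch below hash-partitions RDF triples by subject so that all triples sharing a subject are co-located. This is a hedged sketch only: the input path, partition count, and naive triple parsing are assumptions for illustration, not details taken from the paper.

    ```python
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("rdf-partitioning").getOrCreate()

    # Parse a whitespace-separated, N-Triples-like file into (s, p, o) tuples.
    # "triples.nt" is a hypothetical input path.
    triples = (
        spark.sparkContext.textFile("triples.nt")
        .map(lambda line: tuple(line.split(" ", 2)))
    )

    # Co-locate all triples sharing a subject on the same partition, so that
    # subject-centred (star-shaped) query patterns avoid cross-node shuffles.
    by_subject = triples.keyBy(lambda t: t[0]).partitionBy(64)

    print(by_subject.getNumPartitions())  # 64
    spark.stop()
    ```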

    Solutions and Tools for Secure Communication in Wireless Sensor Networks

    Secure communication is considered a vital requirement in Wireless Sensor Network (WSN) applications. Such a requirement embraces different aspects, including confidentiality, integrity and authenticity of exchanged information, proper management of security material, and effective prevention of and reaction to security threats and attacks. However, WSNs are mainly composed of resource-constrained devices. That is, network nodes feature reduced capabilities, especially in terms of memory storage, computing power, transmission rate, and energy availability. As a consequence, assuring secure communication in WSNs is more difficult than in other kinds of networks. In fact, trading off the effectiveness of adopted solutions against their efficiency becomes far more important. In addition, specific device classes or technologies may require ad hoc security solutions to be designed. Also, it is necessary to efficiently manage security material and to cope dynamically with changes in security requirements. Finally, security threats and countermeasures have to be carefully considered from the network design phase onwards. This Ph.D. dissertation considers secure communication in WSNs and provides the following contributions. First, we provide a performance evaluation of IEEE 802.15.4 security services. Then, we focus on the ZigBee technology and its security services, and propose possible solutions to some deficiencies and inefficiencies. Second, we present HISS, a highly scalable and efficient key management scheme, able to counter collusion attacks while displaying a graceful degradation of performance. Third, we present STaR, a software component for WSNs that secures multiple traffic flows at the same time. It is transparent to the application and provides runtime reconfigurability, thus coping with dynamic changes in security requirements. Finally, we describe ASF, our attack simulation framework for WSNs. Such a tool helps network designers to quantitatively evaluate the effects of security attacks, produce an attack ranking based on their severity, and thus select the most appropriate countermeasures.
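
    For context on the primitives behind the IEEE 802.15.4 security services evaluated in the first contribution: 802.15.4 builds its security suites on AES in CCM* mode, i.e. authenticated encryption with associated data. The sketch below shows plain AES-CCM using Python's third-party cryptography package; the key, nonce, and payload are hypothetical, and the code illustrates the primitive rather than anything from the dissertation.

    ```python
    import os

    from cryptography.hazmat.primitives.ciphers.aead import AESCCM

    key = AESCCM.generate_key(bit_length=128)  # 802.15.4 uses 128-bit AES keys
    aesccm = AESCCM(key, tag_length=8)         # 8-byte MIC, one of the 802.15.4 options

    nonce = os.urandom(13)       # 13-byte nonce, as in 802.15.4's CCM* construction
    header = b"frame-header"     # associated data: authenticated but not encrypted
    payload = b"sensor reading: 21.5 C"

    ciphertext = aesccm.encrypt(nonce, payload, header)
    assert aesccm.decrypt(nonce, ciphertext, header) == payload
    ```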

    Value innovation modelling: Design thinking as a tool for business analysis and strategy

    This paper explores the use of multiple perspective problem framing (English 2008) as a tool to reveal hidden value and commercial opportunity for business. Creative thinking involves the interrelationship of parameters held open and fluid within the cognitive span of the creative mind. The recognition of new associations can create new value that can lead to innovation in designed products, intellectual property and business strategy. The ‘Ideas-lab’ process is based on the proposition that a company’s capacity for innovation is dependent on the way the business is able to see its problems and opportunities. In this process the attributes of a company and the experience of the researchers are considered as the parameters of a design problem. It is therefore important to acknowledge the commercial experience of the project researchers, all of whom have a proven track record in helping businesses develop, exploit and protect their know-how. Semi-structured interviews were carried out with key individuals in 34 companies. The resulting data was assessed on a company-by-company basis through a process of multiple perspective problem framing, enabling key nodes, patterns and relationships to be identified and explored. A ‘Cornerstones of Innovation’ report was prepared to inform each company of the observations made by the researchers. The paper describes the methods adopted and summarises the feedback from participating companies. Case studies are highlighted to demonstrate ways in which the process influenced the actions of particular businesses, and the commercial outcomes that resulted. Finally, the researchers reflect on the structure of the Ideas-lab process.

    Representing Dataset Quality Metadata using Multi-Dimensional Views

    Data quality is commonly defined as fitness for use. The problem of identifying the quality of data is faced by many data consumers. Data publishers often do not have the means to identify quality problems in their data. To make the task easier for both stakeholders, we have developed the Dataset Quality Ontology (daQ). daQ is a core vocabulary for representing the results of quality benchmarking of a linked dataset. It represents quality metadata as multi-dimensional and statistical observations using the Data Cube vocabulary. Quality metadata are organised as a self-contained graph, which can, e.g., be embedded into linked open datasets. We discuss the design considerations, give examples of extending daQ with custom quality metrics, and present use cases such as analysing data versions, browsing datasets by quality, and link identification. We finally discuss how data cube visualisation tools enable data publishers and consumers to better analyse the quality of their data.
    Comment: Preprint of a paper submitted to the forthcoming SEMANTiCS 2014, 4-5 September 2014, Leipzig, Germany
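
    To make the multi-dimensional representation concrete, the sketch below records a single quality-metric result as a Data Cube observation using Python's rdflib. The namespaces and property names follow daQ and the Data Cube vocabulary as described above, but the metric, dataset URIs, and score are hypothetical illustrations rather than examples from the paper.

    ```python
    from rdflib import Graph, Literal, Namespace, RDF, XSD

    DAQ = Namespace("http://purl.org/eis/vocab/daq#")
    QB = Namespace("http://purl.org/linked-data/cube#")
    EX = Namespace("http://example.org/")  # hypothetical dataset namespace

    g = Graph()
    g.bind("daq", DAQ)
    g.bind("qb", QB)

    # One observation: a (hypothetical) dereferenceability score of 0.87,
    # computed on a particular version of the dataset.
    obs = EX["quality/obs1"]
    g.add((obs, RDF.type, QB["Observation"]))
    g.add((obs, DAQ["computedOn"], EX["dataset/v1"]))
    g.add((obs, DAQ["value"], Literal(0.87, datatype=XSD.double)))

    print(g.serialize(format="turtle"))
    ```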