20,393 research outputs found

    What Am I Testing and Where? Comparing Testing Procedures based on Lightweight Requirements Annotations

    [Context] The testing of software-intensive systems is performed in different test stages, each comprising a large number of test cases. These test cases are commonly derived from requirements. Each test stage exhibits specific demands and constraints with respect to the degree of detail and what can be tested. Therefore, specific test suites are defined for each test stage. In this paper, the focus is on the domain of embedded systems, where, among others, typical test stages are Software- and Hardware-in-the-loop. [Objective] Monitoring and controlling which requirements are verified in which detail and in which test stage is a challenge for engineers. However, this information is necessary to assure a certain test coverage, to minimize redundant testing procedures, and to avoid inconsistencies between test stages. In addition, engineers are reluctant to state their requirements in terms of structured languages or models that would facilitate relating requirements to test executions. [Method] With our approach, we close the gap between requirements specifications and test executions. Previously, we proposed a lightweight markup language for requirements which provides a set of annotations that can be applied to natural language requirements. The annotations are mapped to events and signals in test executions. As a result, meaningful insights from a set of test executions can be directly related to artifacts in the requirements specification. In this paper, we use the markup language to compare different test stages with one another. [Results] We annotate 443 natural language requirements of a driver assistance system using our lightweight markup language. The annotations are then linked to 1300 test executions from a simulation environment and 53 test executions from test drives with human drivers. Based on the annotations, we are able to analyze how similar the test stages are and how well test stages and test cases are aligned with the requirements. Further, we highlight the general applicability of our approach through this extensive experimental evaluation. [Conclusion] With our approach, the results of several test levels are linked to the requirements, enabling the evaluation of complex test executions. By this means, practitioners can easily evaluate how well a system performs with regard to its specification and, additionally, can reason about the expressiveness of the applied test stage.
    TU Berlin, Open-Access-Mittel - 202
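
    To make the linking idea concrete, the following is a minimal, hypothetical Python sketch of how annotated requirements could be matched against signals observed in test executions from different stages. The annotation keys, signal names, and stage labels are illustrative assumptions, not the paper's actual markup language.

```python
# Hypothetical sketch: linking annotated requirements to test executions.
# Annotation syntax, signal names, and stage labels are assumptions for
# illustration only.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    rid: str
    text: str
    annotations: set[str] = field(default_factory=set)  # e.g. {"signal:ego_speed"}

@dataclass
class TestExecution:
    stage: str                  # e.g. "SiL", "HiL", "test drive"
    observed_signals: set[str]  # signals/events recorded during the run

def coverage_by_stage(requirements, executions):
    """For each test stage, collect the requirements for which at least one
    annotated signal was observed in an execution of that stage."""
    report = {}
    for ex in executions:
        covered = report.setdefault(ex.stage, set())
        for req in requirements:
            if req.annotations & ex.observed_signals:
                covered.add(req.rid)
    return report

reqs = [Requirement("R1", "The vehicle shall warn the driver ...", {"signal:warning", "signal:ego_speed"}),
        Requirement("R2", "The system shall brake autonomously ...", {"signal:brake_request"})]
runs = [TestExecution("SiL", {"signal:ego_speed", "signal:warning"}),
        TestExecution("test drive", {"signal:ego_speed"})]

# R1 is touched in both stages, R2 in neither:
print(coverage_by_stage(reqs, runs))  # {'SiL': {'R1'}, 'test drive': {'R1'}}
```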

    Medical data processing and analysis for remote health and activities monitoring

    Recent developments in sensor technology, wearable computing, the Internet of Things (IoT), and wireless communication have given rise to research in ubiquitous healthcare and remote monitoring of human health and activities. Health monitoring systems involve processing and analysis of data retrieved from smartphones, smart watches, smart bracelets, as well as various sensors and wearable devices. Such systems enable continuous monitoring of patients' physiological and health conditions by sensing and transmitting measurements such as heart rate, electrocardiogram, body temperature, respiratory rate, chest sounds, or blood pressure. Pervasive healthcare, as a relevant application domain in this context, aims at revolutionizing the delivery of medical services through a medical assistive environment and facilitates the independent living of patients. In this chapter, we discuss (1) data collection, fusion, ownership and privacy issues; (2) models, technologies and solutions for medical data processing and analysis; (3) big medical data analytics for remote health monitoring; (4) research challenges and opportunities in medical data analytics; (5) examples of case studies and practical solutions.

    Working Waterfronts in RI


    PriCL: Creating a Precedent A Framework for Reasoning about Privacy Case Law

    We introduce PriCL: the first framework for expressing and automatically reasoning about privacy case law by means of precedent. PriCL is parametric in an underlying logic for expressing world properties, and provides support for court decisions, their justification, the circumstances in which the justification applies, as well as court hierarchies. Moreover, the framework offers a tight connection between privacy case law and the notion of norms that underlies existing rule-based privacy research. In terms of automation, we identify the major reasoning tasks for privacy cases, such as deducing legal permissions or extracting norms. For solving these tasks, we provide generic algorithms that have particularly efficient realizations within an expressive underlying logic. Finally, we derive a definition of deducibility based on legal concepts and subsequently propose an equivalent characterization in terms of logic satisfiability.
    Comment: Extended versio
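
    As a rough illustration of the reasoning task mentioned above (deducing legal permissions from precedent under a court hierarchy), the toy Python sketch below stores decisions as norms and checks whether a binding precedent settles a new case. The hierarchy, the fact representation, and the conflict handling are assumptions for illustration and deliberately simplify away PriCL's underlying logic.

```python
# Toy precedent store: each decision contributes a norm (circumstances ->
# permitted / forbidden); a new case is checked against norms of courts that
# bind it. This is NOT PriCL's actual formalism or API.
from dataclasses import dataclass

@dataclass(frozen=True)
class Norm:
    court: str
    circumstances: frozenset  # facts that must hold for the norm to apply
    permitted: bool           # outcome the precedent supports

BINDS = {  # assumed hierarchy: key court is bound by the listed courts
    "district": {"district", "appellate", "supreme"},
    "appellate": {"appellate", "supreme"},
    "supreme": {"supreme"},
}

def applicable_norms(norms, case_facts, court):
    return [n for n in norms
            if n.court in BINDS[court] and n.circumstances <= case_facts]

def deduce_permission(norms, case_facts, court):
    """Return True/False if some binding precedent settles the case, else None."""
    hits = applicable_norms(norms, case_facts, court)
    if not hits:
        return None
    # crude conflict handling: the highest court's norm wins
    rank = {"district": 0, "appellate": 1, "supreme": 2}
    return max(hits, key=lambda n: rank[n.court]).permitted

norms = [Norm("appellate", frozenset({"data_shared", "no_consent"}), False),
         Norm("supreme", frozenset({"data_shared", "anonymized"}), True)]
case = frozenset({"data_shared", "anonymized", "no_consent"})
print(deduce_permission(norms, case, "district"))  # True: the supreme norm prevails
```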

    National Mesothelioma Virtual Bank: A standard based biospecimen and clinical data resource to enhance translational research

    Background: Advances in translational research have led to the need for well-characterized biospecimens for research. The National Mesothelioma Virtual Bank is an initiative which collects annotated datasets relevant to human mesothelioma to develop an enterprising biospecimen resource to fulfill researchers' needs. Methods: The National Mesothelioma Virtual Bank architecture is based on three major components: (a) common data elements (based on the College of American Pathologists protocol and North American Association of Central Cancer Registries standards), (b) clinical and epidemiologic data annotation, and (c) data query tools. These tools work interoperably to standardize the entire process of annotation. The National Mesothelioma Virtual Bank tool is based upon the caTISSUE Clinical Annotation Engine, developed by the University of Pittsburgh in cooperation with the Cancer Biomedical Informatics Grid™ (caBIG™, see http://cabig.nci.nih.gov). This application provides a web-based system for annotating, importing and searching mesothelioma cases. The underlying information model is constructed using Unified Modeling Language class diagrams, hierarchical relationships, and Enterprise Architect software. Results: The database provides researchers real-time access to richly annotated specimens and integral information related to mesothelioma. Disclosure of data is tightly regulated according to the user's authorization and the participating institution's local Institutional Review Board and regulatory committee reviews. Conclusion: The National Mesothelioma Virtual Bank currently has over 600 annotated cases available to researchers, including paraffin-embedded tissues, tissue microarrays, serum and genomic DNA. The National Mesothelioma Virtual Bank is a virtual biospecimen registry with robust translational biomedical informatics support to facilitate basic science, clinical, and translational research. Furthermore, it protects patient privacy by disclosing only de-identified datasets to assure that biospecimens can be made accessible to researchers. © 2008 Amin et al; licensee BioMed Central Ltd
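
    The disclosure principle described above (only de-identified data leave the registry) can be illustrated with a small, hypothetical Python sketch. The field names, scrubbing rules, and pseudonymization step are assumptions for illustration, not the actual caTISSUE/NMVB schema or policy.

```python
# Hypothetical de-identification step before query results are released.
# Field names and rules are illustrative assumptions only.
import hashlib

IDENTIFIERS = {"patient_name", "mrn", "date_of_birth"}   # never released
QUASI_IDENTIFIERS = {"zip_code"}                          # coarsened before release

def deidentify(case: dict) -> dict:
    out = {}
    for key, value in case.items():
        if key in IDENTIFIERS:
            continue                                      # drop direct identifiers
        if key in QUASI_IDENTIFIERS:
            out[key] = str(value)[:3] + "XX"              # keep 3-digit ZIP prefix
        else:
            out[key] = value
    # stable pseudonym so specimens from the same case can still be linked
    out["case_pseudonym"] = hashlib.sha256(str(case.get("mrn", "")).encode()).hexdigest()[:12]
    return out

case = {"patient_name": "Jane Doe", "mrn": "123456", "date_of_birth": "1950-02-01",
        "zip_code": "15213", "diagnosis": "epithelioid mesothelioma",
        "specimens": ["paraffin block", "serum", "genomic DNA"]}
print(deidentify(case))
```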

    Auditing business process compliance

    Compliance issues impose significant management and reporting requirements upon organizations. We present an approach to enhance business process modeling notations with the capability to detect and resolve many broad compliance-related issues. We provide a semantic characterization of a minimal revision strategy that helps us obtain compliant process models from models that might initially be non-compliant, in a manner that accommodates the structural and semantic dimensions of parsimoniously annotated process models. We also provide a heuristic approach to compliance resolution using a notion of compliance patterns. This allows us to partially automate compliance resolution, leading to reduced levels of analyst involvement and improved decision support.
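
    Below is a minimal sketch of the detect-and-resolve idea under assumed names and a deliberately simplified process representation (an ordered list of tasks rather than an annotated process model): a rule requires that one task be preceded by another, violations are detected, and a naive "minimal revision" inserts the missing task.

```python
# Illustrative sketch only; not the paper's formalism or revision strategy.
def violations(process, required_before, target):
    """Indices of `target` tasks that are not preceded by `required_before`."""
    seen = False
    bad = []
    for i, task in enumerate(process):
        if task == required_before:
            seen = True
        elif task == target and not seen:
            bad.append(i)
    return bad

def minimally_revise(process, required_before, target):
    """Insert the required task immediately before each violating occurrence."""
    revised = list(process)
    for i in reversed(violations(process, required_before, target)):
        revised.insert(i, required_before)
    return revised

proc = ["receive_invoice", "disburse_payment", "archive"]
print(violations(proc, "approve_payment", "disburse_payment"))      # [1]
print(minimally_revise(proc, "approve_payment", "disburse_payment"))
# ['receive_invoice', 'approve_payment', 'disburse_payment', 'archive']
```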

    Halogen Occultation Experiment integrated test plan

    The test program plan is presented for the Halogen Occultation Experiment (HALOE) instrument, which is being developed in-house at the Langley Research Center for the Upper Atmosphere Research Satellite (UARS). This comprehensive test program was developed to demonstrate that the HALOE instrument meets its performance requirements and maintains integrity through UARS flight environments. Each component-, subsystem-, and system-level test is described in sufficient detail to allow development of the necessary test setups and test procedures. Additionally, the management system for implementing this test program is given. The HALOE instrument is a gas correlation radiometer that measures the vertical distribution of eight upper atmospheric constituents: O3, HCl, HF, NO, CH4, H2O, NO2, and CO2.

    Model-driven design, simulation and implementation of service compositions in COSMO

    The success of software development projects to a large extent depends on the quality of the models that are produced in the development process, which in turn depends on the conceptual and practical support that is available for modelling, design and analysis. This paper focuses on model-driven support for service-oriented software development. In particular, it addresses how services and compositions of services can be designed, simulated and implemented. The support presented is part of a larger framework, called COSMO (COnceptual Service MOdelling). Whereas in previous work we reported on the conceptual support provided by COSMO, in this paper we proceed with a discussion of the practical support that has been developed. We show how reference models (model types) and guidelines (design steps) can be iteratively applied to design service compositions at a platform-independent level, and discuss what tool support is available for design and analysis during this phase. Next, we present some techniques to transform a platform-independent service composition model into an implementation in terms of BPEL and WSDL. We use the mediation scenario of the SWS challenge (concerning the establishment of a purchase order between two companies) to illustrate our application of the COSMO framework.
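
    As a toy illustration of the final transformation step (a platform-independent composition model rendered as BPEL/WSDL), the Python sketch below turns an ordered list of service interactions into a skeletal BPEL process. The input model and the generated elements are assumptions for illustration and are far simpler than COSMO's actual transformations.

```python
# Toy model-to-implementation step: emit a skeletal BPEL <process> containing
# a <sequence> of <invoke> activities. Illustrative only.
from xml.etree.ElementTree import Element, SubElement, tostring

BPEL_NS = "http://docs.oasis-open.org/wsbpel/2.0/process/executable"

def to_bpel(name, interactions):
    """interactions: list of (partnerLink, operation) pairs, invoked in order."""
    process = Element("process", {"name": name, "xmlns": BPEL_NS})
    seq = SubElement(process, "sequence")
    for partner, operation in interactions:
        SubElement(seq, "invoke", {"partnerLink": partner, "operation": operation})
    return tostring(process, encoding="unicode")

# Assumed interactions loosely inspired by the purchase-order mediation scenario:
composition = [("Buyer", "receivePurchaseOrder"),
               ("Supplier", "checkAvailability"),
               ("Buyer", "sendOrderConfirmation")]
print(to_bpel("MediationProcess", composition))
```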