
    Terminology Services: Standard Terminologies to Control Medical Vocabulary. “Words are Not What they Say but What they Mean”

    Data entry is an obstacle to the usability of electronic health record (EHR) applications and to their acceptance by physicians, who prefer to document using free text. Natural language is vast and rich in detail, but it is also ambiguous: it depends heavily on context and makes use of jargon and acronyms. Healthcare information systems should capture clinical data in a structured and, preferably, coded format. This is crucial for data exchange between health information systems, epidemiological analysis, quality assurance and research, clinical decision support systems, administrative functions, and so on. To address this need, numerous terminological systems for the systematic recording of clinical data have been developed. These systems interrelate the concepts of a particular domain and provide references to related terms, possible definitions, and codes. The purpose of terminology services is to represent facts that occur in the real world through database management. This underpins semantic interoperability: different systems understand the information they are processing through the use of codes from clinical terminologies. Standard terminologies make it possible to control medical vocabulary. But how do we do this? What do we need? Terminology services are a fundamental building block for health data management in healthcare environments.
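    To make the idea concrete, the sketch below shows, in Python, the kind of lookup a terminology service performs: free-text terms, acronyms, and synonyms are resolved to a single coded concept so that different systems exchange the same meaning. This is a minimal illustration only; the `TerminologyService` class and the concept codes are invented for the example and do not correspond to SNOMED CT, ICD, or any real terminology API.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Concept:
    code: str             # identifier within the terminology (illustrative value only)
    system: str           # which terminology the code belongs to
    preferred_term: str


class TerminologyService:
    """Toy terminology service: resolves free-text entries (including jargon
    and acronyms) to one coded concept so different systems can exchange
    the same meaning."""

    def __init__(self) -> None:
        # Map every known synonym/acronym to its coded concept.
        self._index: dict[str, Concept] = {}

    def register(self, concept: Concept, *synonyms: str) -> None:
        for term in (concept.preferred_term, *synonyms):
            self._index[term.casefold()] = concept

    def resolve(self, free_text: str):
        """Return the coded concept for a free-text term, or None if unknown."""
        return self._index.get(free_text.strip().casefold())


# Example: "MI", "heart attack", and "myocardial infarction" all resolve to
# the same (made-up) code, which is what makes coded exchange possible.
service = TerminologyService()
service.register(
    Concept(code="C-0001", system="demo-terminology", preferred_term="myocardial infarction"),
    "heart attack", "MI",
)
print(service.resolve("Heart attack"))   # Concept(code='C-0001', ...)
print(service.resolve("MI"))             # same concept
print(service.resolve("sore elbow"))     # None -> no coded mapping yet
```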

    Model checking: Algorithmic verification and debugging

    Turing Lecture from the winners of the 2007 ACM A.M. Turing Award. In 1981, Edmund M. Clarke and E. Allen Emerson, working in the USA, and Joseph Sifakis, working independently in France, authored seminal papers that founded what has become the highly successful field of model checking. This verification technology provides an algorithmic means of determining whether an abstract model (representing, for example, a hardware or software design) satisfies a formal specification expressed as a temporal logic (TL) formula. Moreover, if the property does not hold, the method identifies a counterexample execution that shows the source of the problem. The progression of model checking to the point where it can be successfully used for complex systems has required the development of sophisticated means of coping with what is known as the state explosion problem. Great strides have been made on this problem over the past 28 years by what is now a very large international research community. As a result, many major hardware and software companies are beginning to use model checking in practice. Examples of its use include the verification of VLSI circuits, communication protocols, software device drivers, real-time embedded systems, and security algorithms. The work of Clarke, Emerson, and Sifakis continues to be central to the success of this research area. Their work over the years has led to the creation of new logics for specification, new verification algorithms, and surprising theoretical results. Model checking tools, created by both academic and industrial teams, have resulted in an entirely novel approach to verification and test case generation. This approach, for example, often enables engineers in the electronics industry to design complex systems with considerable assurance regarding the correctness of their initial designs. Model checking promises to have an even greater impact on the hardware and software industries in the future. (Moshe Y. Vardi, Editor-in-Chief)
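    The abstract's core idea, checking whether a model satisfies a property and returning a counterexample when it does not, can be sketched as an explicit-state reachability search. The Python sketch below is a minimal illustration under that simplification (a hand-written transition relation and a safety property given as a bad-state predicate); real model checkers rely on far more sophisticated techniques to cope with state explosion, and the toy two-counter model is invented for the example.

```python
from collections import deque


def check_safety(initial, successors, is_bad):
    """Breadth-first reachability check of a safety property.

    Returns None if no reachable state violates the property; otherwise
    returns a counterexample: the trace from the initial state to a bad state.
    """
    parent = {initial: None}        # also serves as the visited set
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if is_bad(state):
            trace = []              # reconstruct the violating execution
            while state is not None:
                trace.append(state)
                state = parent[state]
            return list(reversed(trace))
        for nxt in successors(state):
            if nxt not in parent:   # skip already-explored states (the state space can explode)
                parent[nxt] = state
                queue.append(nxt)
    return None


# Toy model: two counters; each step increments one of them modulo 4.
# Safety property, phrased as a bad-state predicate: both counters never equal 3.
def successors(state):
    x, y = state
    return [((x + 1) % 4, y), (x, (y + 1) % 4)]


print(check_safety((0, 0), successors, lambda s: s == (3, 3)))
# -> [(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (3, 2), (3, 3)]
```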

    Shared neural correlates for building phrases in signed and spoken language

    Research on the mental representation of human language has convincingly shown that sign languages are structured similarly to spoken languages. However, whether the same neurobiology underlies the online construction of complex linguistic structures in sign and speech remains unknown. To investigate this question with maximally controlled stimuli, we studied the production of minimal two-word phrases in sign and speech. Signers and speakers viewed the same pictures during magnetoencephalography recording and named them with semantically identical expressions. For both signers and speakers, phrase building engaged left anterior temporal and ventromedial cortices with similar timing, despite different linguistic articulators. Thus, the neurobiological similarity of sign and speech goes beyond gross measures such as lateralization: the same fronto-temporal network achieves the planning of structured linguistic expressions.

    Separation of Concerns in Feature Modeling: Support and Applications

    Feature models (FMs) are a popular formalism for describing the commonality and variability of software product lines (SPLs) in terms of features. SPL development increasingly involves manipulating many large FMs, and thus scalable modular techniques that support compositional development of complex SPLs are required. In this paper, we describe how a set of complementary operators (aggregate, merge, slice) provides practical support for separation of concerns in feature modeling. We show how the combination of these operators can assist in tedious and error-prone tasks such as automated correction of FM anomalies, update and extraction of FM views, reconciliation of FMs, and reasoning about properties of FMs. For each task, we report on practical applications in different domains. We also present a technique that can efficiently decompose FMs with thousands of features and report our experimental results.
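    As a rough illustration of what such operators compute, the sketch below takes a deliberately simplified view, which is an assumption of this example rather than the paper's actual algorithms: a feature model is treated as a predicate over sets of selected features, merging in intersection mode is the conjunction of two such predicates, and slicing projects the valid configurations onto a subset of features. The feature names and constraints are invented, and the exhaustive enumeration would not scale to the large FMs discussed in the paper.

```python
from itertools import combinations


def configurations(features, is_valid):
    """Enumerate all valid configurations (sets of selected features) of a
    feature model given as a predicate over feature sets. Exhaustive
    enumeration: fine for a toy example, not for FMs with thousands of features."""
    for r in range(len(features) + 1):
        for combo in combinations(sorted(features), r):
            if is_valid(set(combo)):
                yield frozenset(combo)


def merge_intersection(fm1, fm2):
    """Merge in intersection mode: a configuration is valid iff it is valid
    in both input models (conjunction of the models' constraints)."""
    return lambda cfg: fm1(cfg) and fm2(cfg)


def slice_fm(features, is_valid, kept):
    """Slice: project the valid configurations onto the kept features."""
    return {cfg & frozenset(kept) for cfg in configurations(features, is_valid)}


# Toy product line: 'editor' is mandatory, 'spellcheck' requires 'dictionary',
# and 'cloud' and 'offline' are mutually exclusive.
FEATURES = {"editor", "spellcheck", "dictionary", "cloud", "offline"}


def fm_a(cfg):
    return ("editor" in cfg
            and ("spellcheck" not in cfg or "dictionary" in cfg)
            and not ("cloud" in cfg and "offline" in cfg))


# A second model from another team forbids 'cloud' altogether.
def fm_b(cfg):
    return "editor" in cfg and "cloud" not in cfg


merged = merge_intersection(fm_a, fm_b)
print(len(list(configurations(FEATURES, fm_a))))       # configurations of fm_a
print(len(list(configurations(FEATURES, merged))))     # fewer after the merge
print(slice_fm(FEATURES, fm_a, {"cloud", "offline"}))  # view restricted to two features
```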

    Formal development of control software in the medical systems domain

    In this thesis we describe the effectiveness of applying a number of formal techniques to the development of industrial control software at Philips Healthcare. We demonstrate how these techniques were tightly integrated into the industrial workflow and discuss the issues encountered during their application. The work was carried out in an industrial context, dealing with real industrial projects and a real product: the development of interventional X-ray systems. The results are conclusive in the sense that the formal techniques used delivered code of substantially better quality than code developed with conventional development methods. The results also show that the productivity of the formally developed code exceeded that of other projects at Philips Healthcare and of projects reported worldwide. The thesis further includes a number of design and specification guidelines that assist in constructing components that can be verified using model checking. These guidelines were applied successfully in designing and verifying a controller component developed at Philips Healthcare; hence, they can provide an effective framework for designing verifiable control components in industrial settings.

    Abstractions and Static Analysis for Verifying Reactive Systems

    Fokkink, W.J. [Promotor]; Sidorova, N. [Copromotor]

    Proceedings of the 21st Conference on Formal Methods in Computer-Aided Design – FMCAD 2021

    The Conference on Formal Methods in Computer-Aided Design (FMCAD) is an annual conference on the theory and applications of formal methods in hardware and system verification. FMCAD provides a leading forum for researchers in academia and industry to present and discuss groundbreaking methods, technologies, theoretical results, and tools for reasoning formally about computing systems. FMCAD covers formal aspects of computer-aided system design, including verification, specification, synthesis, and testing.