    Simultaneous multislice acquisition with multi-contrast segmented EPI for separation of signal contributions in dynamic contrast-enhanced imaging

    We present a method to efficiently separate the signal in magnetic resonance imaging (MRI) into a base signal S0, representing the mainly T1-weighted component without T2*-relaxation, and its T2*-weighted counterpart, by the rapid acquisition of multiple contrasts for advanced pharmacokinetic modelling. This is achieved by incorporating simultaneous multislice (SMS) imaging into a multi-contrast, segmented echo planar imaging (EPI) sequence, extending spatial coverage to larger body regions without a time penalty. Simultaneous acquisition of four slices was combined with segmented EPI for fast imaging with three gradient echo times in a preclinical perfusion study. Six female domestic pigs, German-landrace or hybrid-form, were each scanned for 11 minutes during administration of a gadolinium-based contrast agent. Influences of reconstruction methods and training data were investigated. The separation into T1- and T2*-dependent signal contributions was achieved by fitting a standard analytical model to the acquired multi-echo data. The application of SMS yielded sufficient temporal resolution for the detection of the arterial input function in major vessels, while the anatomical coverage allowed perfusion analysis of muscle tissue. The separation of the MR signal into T1- and T2*-dependent components allowed the correction of susceptibility-related changes. We demonstrate a novel sequence for dynamic contrast-enhanced MRI that meets the requirements of temporal resolution (Δt < 1.5 s) and image quality. The incorporation of SMS into multi-contrast, segmented EPI can overcome existing limitations of dynamic contrast enhancement and dynamic susceptibility contrast methods when applied separately. The new approach allows both techniques to be combined in a single acquisition with large spatial coverage.
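The separation step described in the abstract can be illustrated with a minimal sketch: with multiple gradient echoes, a mono-exponential signal model S(TE) = S0 · exp(−R2*·TE) becomes linear after taking logarithms, so S0 and R2* (= 1/T2*) can be recovered voxel-wise by least squares. The echo times, signal values, and function name below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def separate_s0_r2star(echo_times, signals):
    """Log-linear fit of the mono-exponential model S(TE) = S0 * exp(-R2star * TE)."""
    log_s = np.log(signals)
    # Fit a line to (TE, log S): slope = -R2star, intercept = log S0.
    slope, intercept = np.polyfit(echo_times, log_s, 1)
    return np.exp(intercept), -slope

# Synthetic demo with three echo times (seconds), as in the three-echo acquisition.
te = np.array([0.005, 0.015, 0.025])
true_s0, true_r2s = 1000.0, 50.0
sig = true_s0 * np.exp(-true_r2s * te)

s0_fit, r2s_fit = separate_s0_r2star(te, sig)
print(s0_fit, r2s_fit)  # recovers ~1000.0 and ~50.0
```

In practice this fit would be applied per voxel and per time frame, yielding the S0 (T1-weighted) and T2*-dependent components used for pharmacokinetic modelling.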

    An open extensible tool environment for Event-B

    Abstract. We consider modelling indispensable for the development of complex systems. Modelling must be carried out in a formal notation in order to reason and make meaningful conjectures about a model. But formal modelling of complex systems is a difficult task. Even as theorem provers improve and become more powerful, modelling will remain difficult. The reason for this is that modelling is an exploratory activity that requires ingenuity in order to arrive at a meaningful model. We are aware that automated theorem provers can discharge most of the onerous trivial proof obligations that appear when modelling systems. In this article we present a modelling tool that seamlessly integrates modelling and proving, similar to what is offered today in modern integrated development environments for programming. The tool is extensible and configurable so that it can be adapted more easily to different application domains and development methods.

    Deductive Verification of Unmodified Linux Kernel Library Functions

    This paper presents results from the development and evaluation of a deductive verification benchmark consisting of 26 unmodified Linux kernel library functions implementing conventional memory and string operations. The formal contract of each function was extracted from its source code and represented in the form of preconditions and postconditions. The correctness of 23 functions was completely proved using the AstraVer toolset, although success for 11 of them required 2 new specification language constructs. Another 2 functions were proved after a minor modification of their source code, while the final one could not be completely proved using the existing memory model. The benchmark can be used for the testing and evaluation of deductive verification tools and as a starting point for verifying other parts of the Linux kernel.
    Comment: 18 pages, 2 tables, 6 listings. Accepted to the ISoLA 2018 conference, Evaluating Tools for Software Verification track.
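The precondition/postcondition contracts the paper extracts are written in a dedicated specification language for C; as a toy illustration only (not the AstraVer/ACSL formalism itself), the same idea can be sketched in Python with runtime assertions around a memset-like operation. The function name and checks below are invented for demonstration.

```python
def memset_like(buf, value, n):
    """Set the first n bytes of buf to value, with an explicit contract."""
    # Precondition: n is within bounds and value fits in one byte.
    assert 0 <= n <= len(buf)
    assert 0 <= value <= 255
    old_tail = bytearray(buf[n:])  # snapshot for the postcondition
    for i in range(n):
        buf[i] = value
    # Postcondition: first n bytes equal value, the remainder is untouched.
    assert all(b == value for b in buf[:n])
    assert buf[n:] == old_tail
    return buf

data = bytearray(b"hello world")
memset_like(data, 0, 5)
print(data)  # bytearray(b'\x00\x00\x00\x00\x00 world')
```

Deductive verification differs from this runtime checking in that the tool proves, statically and for all inputs satisfying the precondition, that the postcondition holds.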

    Single channel nonstationary signal separation using linear time-varying filters

    Flexible regression models over river networks

    Many statistical models are available for spatial data, but the vast majority of these assume that spatial separation can be measured by Euclidean distance. Data which are collected over river networks constitute a notable and commonly occurring exception, where distance must be measured along complex paths and, in addition, account must be taken of the relative flows of water into and out of confluences. Suitable models for this type of data have been constructed based on covariance functions. The aim of the paper is to place the focus on underlying spatial trends by adopting a regression formulation and using methods which allow smooth but flexible patterns. Specifically, kernel methods and penalized splines are investigated, with the latter proving more suitable from both computational and modelling perspectives. In addition to their use in a purely spatial setting, penalized splines also offer a convenient route to the construction of spatiotemporal models, where data are available over time as well as over space. Models which include main effects and spatiotemporal interactions, as well as seasonal terms and interactions, are constructed for data on nitrate pollution in the River Tweed. The results give valuable insight into the changes in water quality in both space and time.
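To illustrate the penalized-spline idea in its simplest form (one dimension, not the river-network setting with stream distance and flow weighting), the following sketch fits a truncated-line basis by penalized least squares, with a ridge penalty lambda applied only to the knot coefficients. The data, knot placement, and penalty value are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 100)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)  # noisy synthetic trend

knots = np.linspace(0.05, 0.95, 15)
# Design matrix: intercept, linear term, and truncated lines (x - k)_+ per knot.
B = np.column_stack([np.ones_like(x), x] +
                    [np.clip(x - k, 0, None) for k in knots])

lam = 1.0
# Penalize only the knot coefficients, leaving the linear part unpenalized.
D = np.diag([0.0, 0.0] + [1.0] * knots.size)
coef = np.linalg.solve(B.T @ B + lam * D, B.T @ y)
fit = B @ coef

print(float(np.sqrt(np.mean((fit - y) ** 2))))  # residual RMSE of the smooth fit
```

Increasing lambda yields smoother fits; the paper's contribution lies in adapting such penalized constructions to network distance and to spatiotemporal interaction terms.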