
    Scenario-based system architecting : a systematic approach to developing future-proof system architectures

    This thesis summarizes the research results of Mugurel T. Ionita, based on work conducted in the context of the STW - AIMES project. The work was carried out at Philips Research and coordinated by Eindhoven University of Technology. It resulted in six externally available publications and ten internal reports that are company confidential. The research concerned the methodology of developing system architectures, focusing in particular on two aspects of the early architecting phases: first, the generation of multiple architectural options that account for the most likely changes in the business environment, and second, the quantitative assessment of these options with respect to how well they contribute to the overall quality attributes of the future system, including cost and risk analysis. The main reason for looking at these two aspects of the architecting process is that architectures usually have to live for long periods of time, up to 5 years, which requires that they deal successfully with the uncertainty associated with the future business environment. A second reason is that the quality attributes, costs, and risks of a future system are usually dictated by its architecture, so an early quantitative estimate of these attributes can prevent system redesign. The research produced two methods: a method for designing architecture options that are more future-proof, meaning more resilient to future changes (the SODA method), and, within SODA, a method for the quantitative assessment of the proposed architectural options (the SQUASH method). Both methods were validated in the area of professional systems, where they were applied in a concrete case study from the medical domain. The SODA method is an innovative solution to the problem of developing system architectures designed to survive the most likely changes foreseen in the system's future business environment. On the one hand, the method enables the business stakeholders of a system to provide the architects with their knowledge and insight about the future when new systems are created; on the other hand, it enables the architects to take a long view and think strategically in terms of different plausible futures and unexpected surprises when designing the high-level structure of their systems. The SQUASH method is a systematic way of quantitatively assessing the proposed architectural options with respect to how well they address quality aspects, costs, and risks, before the architecture is actually implemented. It enables the architects to reason about the most relevant attributes of the future system and to make more informed design decisions based on quantitative data. Both methods, SODA and SQUASH, are descriptive in nature, rooted in best industrial practices, and hence propose better ways of developing system architectures.
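    The abstract describes SQUASH only at a high level, so the snippet below is not the SQUASH method itself: it is a generic weighted-sum illustration of how architectural options can be compared quantitatively against quality, cost, and risk attributes, with hypothetical attribute names, weights, and scores.

```python
# Generic illustration of scoring architecture options against weighted quality
# attributes. The attributes, weights, and scores are hypothetical and are not
# taken from the SQUASH method described in the thesis.
weights = {"performance": 0.3, "modifiability": 0.3, "cost": 0.2, "risk": 0.2}

# Per-option scores on a 0-10 scale (higher is better; cost and risk are
# already inverted so that a high score means low cost / low risk).
options = {
    "option_A": {"performance": 8, "modifiability": 5, "cost": 6, "risk": 7},
    "option_B": {"performance": 6, "modifiability": 9, "cost": 5, "risk": 6},
}

def weighted_score(scores: dict[str, float]) -> float:
    """Weighted sum of attribute scores for one architectural option."""
    return sum(weights[attr] * value for attr, value in scores.items())

# Rank the options by their aggregate score, best first.
for name, scores in sorted(options.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```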

    Nonfluoroscopic electromechanical mapping of the left ventricle


    Quantitative computational evaluation of cardiac and coronary physiology


    Nonfluoroscopic electromechanical mapping of the left ventricle: Evaluation of the technique as diagnostic tool and as guidance for novel therapeutic strategies

    With his landmark paper in Nature Medicine in 1996, Shlomo Ben-Haim and coworkers introduced a novel technique into the clinical arena.

    A physics-based machine learning technique rapidly reconstructs the wall-shear stress and pressure fields in coronary arteries

    With the global rise of cardiovascular disease, including atherosclerosis, there is a high demand for accurate diagnostic tools that can be used during a short consultation. From a pathological point of view, abnormal blood flow patterns have been demonstrated to be strong predictors of atherosclerotic lesion incidence, location, progression, and rupture. Prediction of patient-specific blood flow patterns can hence enable fast clinical diagnosis. However, the current state of the art relies on 3D-imaging-based Computational Fluid Dynamics (CFD), whose high computational cost renders it impractical. In this work, we present a novel method to expedite the reconstruction of 3D pressure and wall shear stress fields by combining a reduced-order CFD modelling technique with non-linear regression tools from the Machine Learning (ML) paradigm. Specifically, we develop a proof-of-concept automated pipeline that uses randomised perturbations of an atherosclerotic pig coronary artery to produce a large dataset of unique mesh geometries with variable blood flow. A total of 1,407 geometries were generated from seven reference arteries and were used to simulate blood flow with the CFD solver Abaqus. This CFD dataset was then post-processed using the mesh-domain common-base Proper Orthogonal Decomposition (cPOD) method to obtain eigenfunctions and principal coefficients, the latter being the product of the individual mesh flow solutions with the POD eigenvectors. Being a data-reduction method, POD allows the data to be represented using only the ten most significant modes, which cumulatively capture more than 95% of the variance of flow features due to mesh variations. Next, the node coordinates of the meshes were embedded in a two-dimensional coordinate system using the t-distributed Stochastic Neighbor Embedding (t-SNE) algorithm. The reduced dataset of t-SNE coordinates and the corresponding vectors of POD coefficients were then used to train a Random Forest Regressor (RFR) model. The same methodology was applied to both the volumetric pressure solution and the wall shear stress. The predicted patterns of blood pressure and shear stress in unseen arterial geometries were compared with the ground-truth CFD solutions on "unseen" meshes. The new method was able to reliably reproduce the 3D coronary artery haemodynamics in less than 10 s.
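    The pipeline combines three standard building blocks: POD for data reduction, t-SNE for embedding the mesh geometry in two dimensions, and a Random Forest Regressor mapping the embedding to POD coefficients. The sketch below illustrates that combination on synthetic stand-in data using NumPy and scikit-learn; the array sizes, the SVD-based POD, and the joint train/test t-SNE embedding are simplifying assumptions, not the authors' exact implementation.

```python
# Minimal sketch of the POD + t-SNE + Random Forest pipeline described above.
# All data are synthetic stand-ins; sizes and names are illustrative only.
import numpy as np
from sklearn.manifold import TSNE
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_geom, n_nodes = 200, 500                    # geometries and mesh nodes (toy sizes)

# Stand-in for the CFD snapshot matrix: one flattened pressure field per geometry.
snapshots = rng.normal(size=(n_geom, n_nodes))

# --- POD via SVD: keep the 10 most energetic modes ---------------------------
mean_field = snapshots.mean(axis=0)
U, S, Vt = np.linalg.svd(snapshots - mean_field, full_matrices=False)
modes = Vt[:10]                               # POD eigenfunctions (spatial modes)
coeffs = (snapshots - mean_field) @ modes.T   # principal coefficients per geometry

# --- t-SNE embedding of the mesh geometry ------------------------------------
# Each geometry is summarised by its flattened node coordinates.
geometry = rng.normal(size=(n_geom, n_nodes * 3))
# t-SNE has no out-of-sample transform, so train and test geometries are
# embedded together here; the published pipeline handles unseen meshes itself.
embedding = TSNE(n_components=2, random_state=0).fit_transform(geometry)

# --- Random Forest: 2-D embedding -> POD coefficients -------------------------
train, test = slice(0, 150), slice(150, None)
rfr = RandomForestRegressor(n_estimators=200, random_state=0)
rfr.fit(embedding[train], coeffs[train])

# Reconstruct full fields for held-out geometries from predicted coefficients.
pred_fields = rfr.predict(embedding[test]) @ modes + mean_field
print(pred_fields.shape)                      # (50, 500): one reconstructed field each
```

    The prediction step is only a matrix product plus a forest evaluation, which is why, once trained, such a surrogate can return a full 3D field in seconds rather than the hours a new CFD run would take.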

    Methods to assess the coronary circulation by guidewire-mounted sensors


    Quantitative 3·D Echocardiography of The Heart and The Coronary Vessels

    The recognition of the existence of ultrasound is credited to L. Spallanzani (1729-1799). In more recent years, ultrasound has been used as an imaging modality in medicine. I. Edler and C.H. Hertz produced the first ultrasound images of the heart in 1953. In the 1960s, great progress was made in the clinical application of ultrasound when real-time two-dimensional ultrasound scanners were developed. In 1968, J. Somer constructed the first electronic phased-array scanner, and this technology is still the most widely used in ultrasound equipment. In 1974, F.E. Barber and colleagues produced a duplex scanner which integrated imaging with pulsed-wave Doppler measurements. In 1982, C. Kasai and colleagues constructed the color-coded Doppler flow imaging system based on autocorrelation detection, providing a noninvasive "angiogram" simulation of normal and abnormal blood flow on a beat-to-beat basis. Transesophageal echocardiography became available to clinicians in 1985 thanks to the developments of J. Souquet, who invented the mono- and biplane electronic phased-array probe. Echocardiography has become one of the most commonly used diagnostic imaging techniques in cardiology. The development of commercial 3-D echocardiographic equipment began in the early 1990s. In 1993, a technique allowing acquisition of a tomographic, parallel-sliced data set of echocardiographic images of the heart with a lobster-tail TEE probe was developed by the German company TomTec GmbH. The TEE probe had an imaging element that could be controlled by a computer via a stepping motor. They also developed a patient interface to record respiration and R-R intervals, which allowed ECG-triggered and gated acquisition of the ultrasound images and reduced motion artifacts caused by beat-to-beat and respiratory variations in cardiac dimensions and position. After acquisition of a tomographic data set, the images were post-processed, and software interpolation algorithms could fill gaps in the data set. The post-processed data set could then be used to reconstruct 3-D volume-rendered images of the heart. 3-D ultrasound provides cardiac images which more closely mimic actual anatomy than 2-D cross-sectional images, and may thus be easier to interpret.
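    The abstract notes that software interpolation was used to fill gaps in the acquired slice stack before 3-D reconstruction, but does not describe the algorithm. The sketch below is only a minimal illustration of the idea, filling skipped slices by linear interpolation between the nearest acquired neighbours; the function name and the data are invented for the example and are not the TomTec implementation.

```python
# Minimal illustration of gap filling in a tomographic slice stack by linear
# interpolation between the nearest acquired slices (illustrative only).
import numpy as np

def fill_missing_slices(stack: np.ndarray, acquired: np.ndarray) -> np.ndarray:
    """stack: (n_slices, h, w) volume; acquired: boolean mask of valid slices."""
    filled = stack.copy()
    idx = np.flatnonzero(acquired)                 # indices of acquired slices
    for k in np.flatnonzero(~acquired):            # each missing slice
        lo = idx[idx < k].max() if (idx < k).any() else idx.min()
        hi = idx[idx > k].min() if (idx > k).any() else idx.max()
        if lo == hi:                               # no bracketing pair: copy nearest
            filled[k] = stack[lo]
        else:                                      # linear blend of the neighbours
            t = (k - lo) / (hi - lo)
            filled[k] = (1 - t) * stack[lo] + t * stack[hi]
    return filled

# Toy example: a 10-slice stack where slices 3 and 7 were not acquired.
stack = np.random.rand(10, 64, 64)
stack[[3, 7]] = 0.0                                # simulate the missing slices
acquired = np.ones(10, dtype=bool)
acquired[[3, 7]] = False
volume = fill_missing_slices(stack, acquired)
```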

    Angiographic Applications for Modern Percutaneous Coronary Intervention

    This thesis sought to explore contemporary applications of invasive coronary angiography in the era of advanced percutaneous coronary intervention. Firstly, it describes the development and validation of dedicated bifurcation quantitative coronary angiography algorithms, in order to facilitate the analysis of bifurcation lesions in a harmonized, reliable, and reproducible manner. It then presents the use of these bifurcation quantitative coronary angiography algorithms in clinical studies, in the context of large registries and randomized trials, and discusses the clinical relevance of angiographic measures. Finally, it explores the prognostic value of angiographic scoring systems.