
    Introduction to hemodynamic forces by echocardiography

    Hemodynamic force (HDF) analysis is a novel approach to quantifying the intraventricular pressure gradients that drive blood flow. A new mathematical model allows HDF parameters to be derived from routine transthoracic echocardiography, making the tool more accessible for clinical use. HDF analysis is considered the fluid-dynamic correlate of deformation imaging and may be even more sensitive in detecting mechanical abnormalities. This has the potential to add incremental clinical value by allowing earlier detection of pathology or immediate evaluation of the response to treatment. In this article, the theoretical background and physiological patterns of HDF in the left ventricle are provided. In pathological situations the HDF pattern may be altered, which is illustrated with a case of ST-segment elevation myocardial infarction and a case of non-ischemic cardiomyopathy with typical left bundle branch block.
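
    For reference, the HDF literature commonly defines the hemodynamic force as the volume integral of the intraventricular pressure gradient, usually normalized by blood density and left-ventricular volume so that it has the dimensions of an acceleration. A sketch of that general definition (not necessarily the exact formulation used in this article):

        % General HDF definition as commonly used in the literature (assumption:
        % this article may use a different or more detailed formulation).
        % F(t): hemodynamic force; V(t): LV blood volume; p: pressure; rho: blood density.
        \[
          \mathbf{F}(t) = \int_{V(t)} \nabla p \,\mathrm{d}V ,
          \qquad
          \hat{\mathbf{F}}(t) = \frac{1}{\rho\, V(t)} \int_{V(t)} \nabla p \,\mathrm{d}V
        \]
        % The normalized force \hat{F}(t) is typically reported through its
        % apex-to-base and lateral components over the cardiac cycle.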

    Use of partial least squares regression to impute SNP genotypes in Italian Cattle breeds

    Background: The objective of the present study was to test the ability of the partial least squares regression technique to impute genotypes from low-density single nucleotide polymorphism (SNP) panels, i.e. 3K or 7K, to a high-density 50K SNP panel. No pedigree information was used. Methods: Data consisted of 2093 Holstein, 749 Brown Swiss and 479 Simmental bulls genotyped with the Illumina 50K Beadchip. First, a single-breed approach was applied by using only data from Holstein animals. Then, to enlarge the training population, data from the three breeds were combined and a multi-breed analysis was performed. Accuracies of genotypes imputed using the partial least squares regression method were compared with those obtained by using the Beagle software. The impact of genotype imputation on breeding value prediction was evaluated for milk yield, fat content and protein content. Results: In the single-breed approach, the accuracy of imputation using partial least squares regression was around 90% and 94% for the 3K and 7K platforms, respectively; the corresponding accuracies obtained with Beagle were around 85% and 90%. Moreover, the computing time required by the partial least squares regression method was on average around 10 times lower than that required by Beagle. Using the partial least squares regression method in the multi-breed analysis resulted in lower imputation accuracies than using single-breed data. The impact of SNP-genotype imputation on the accuracy of direct genomic breeding values was small. The correlation between estimates of genetic merit obtained with imputed versus actual genotypes was around 0.96 for the 7K chip. Conclusions: The results of the present work suggest that the partial least squares regression imputation method can be useful for imputing SNP genotypes when pedigree information is not available.
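
    As a rough illustration of the general idea (not the implementation used in the study; panel sizes, sample sizes, and variable names below are hypothetical), partial least squares regression can predict the genotypes missing from a low-density panel from the SNPs shared with the high-density panel, for example with scikit-learn:

        # Sketch of PLS-based genotype imputation (illustrative only, not the study's code).
        # Genotypes are coded as allele counts 0/1/2.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)

        # Hypothetical data: reference animals with the full panel, target animals
        # genotyped only on a low-density subset of the same SNPs.
        n_ref, n_target, n_snp_hd = 500, 100, 2000
        X_hd_ref = rng.integers(0, 3, size=(n_ref, n_snp_hd)).astype(float)
        ld_idx = rng.choice(n_snp_hd, size=300, replace=False)        # "3K-like" subset
        hidden_idx = np.setdiff1d(np.arange(n_snp_hd), ld_idx)        # SNPs to impute

        # Fit one multivariate PLS model: low-density SNPs -> missing high-density SNPs.
        pls = PLSRegression(n_components=50)
        pls.fit(X_hd_ref[:, ld_idx], X_hd_ref[:, hidden_idx])

        # Impute the target animals and round predictions back to genotype codes 0/1/2.
        X_ld_target = rng.integers(0, 3, size=(n_target, ld_idx.size)).astype(float)
        imputed = np.clip(np.rint(pls.predict(X_ld_target)), 0, 2)
        print(imputed.shape)  # (n_target, number of imputed SNPs)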

    Advances in understanding and parameterization of small-scale physical processes in the marine Arctic climate system: a review

    The Arctic climate system includes numerous highly interactive small-scale physical processes in the atmosphere, sea ice, and ocean. During and since the International Polar Year 2007–2009, significant advances have been made in understanding these processes. Here, these recent advances are reviewed, synthesized, and discussed. In atmospheric physics, the primary advances have been in cloud physics, radiative transfer, mesoscale cyclones, and coastal and fjordic processes, as well as in boundary-layer processes and surface fluxes. In sea ice and its snow cover, advances have been made in understanding the surface albedo and its relationship with snow properties, the internal structure of sea ice, heat and salt transfer in ice, the formation of superimposed ice and snow ice, and the small-scale dynamics of sea ice. For the ocean, significant advances relate to exchange processes at the ice–ocean interface, diapycnal mixing, double-diffusive convection, tidal currents and diurnal resonance. Despite this recent progress, some of these small-scale physical processes are still not sufficiently understood: these include wave–turbulence interactions in the atmosphere and ocean, the exchange of heat and salt at the ice–ocean interface, and the mechanical weakening of sea ice. Many other processes are reasonably well understood as stand-alone processes, but the challenge is to understand their interactions with, and impacts and feedbacks on, other processes. Uncertainty in the parameterization of small-scale processes continues to be among the greatest challenges facing climate modelling, particularly at high latitudes. Further improvements in parameterization require new year-round field campaigns on the Arctic sea ice, closely combined with satellite remote sensing studies and numerical model experiments.
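
    As a generic illustration of what such a parameterization can look like in a model (a toy scheme with invented constants, not one of the parameterizations discussed in the review), surface albedo is often blended between a bare-ice and a snow value as a simple function of snow depth:

        # Toy snow-depth-dependent surface albedo parameterization (illustrative only;
        # the constants and the linear blending are assumptions, not from the review).
        def surface_albedo(snow_depth_m: float,
                           albedo_ice: float = 0.55,
                           albedo_snow: float = 0.80,
                           h_ref_m: float = 0.10) -> float:
            """Blend linearly from bare-ice to snow albedo as snow accumulates."""
            weight = min(max(snow_depth_m / h_ref_m, 0.0), 1.0)
            return (1.0 - weight) * albedo_ice + weight * albedo_snow

        print(surface_albedo(0.00))  # bare ice        -> 0.55
        print(surface_albedo(0.05))  # thin snow cover -> 0.675
        print(surface_albedo(0.20))  # thick snow      -> 0.80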

    Properties of Foreshocks and Aftershocks of the Non-Conservative SOC Olami-Feder-Christensen Model: Triggered or Critical Earthquakes?

    Following Hergarten and Neugebauer [2002], who discovered aftershock and foreshock sequences in the Olami-Feder-Christensen (OFC) discrete block-spring earthquake model, we investigate to what degree the simple toppling mechanism of this model is sufficient to account for the properties of earthquake clustering in time and space. Our main finding is that synthetic catalogs generated by the OFC model share practically all properties of real seismicity at a qualitative level, albeit with significant quantitative differences. We find that OFC catalogs can in large part be described by the concept of triggered seismicity, but the properties of foreshocks depend on the mainshock magnitude, in qualitative agreement with the critical earthquake model and in disagreement with simple models of triggered seismicity such as the Epidemic Type Aftershock Sequence (ETAS) model [Ogata, 1988]. Many other features of OFC catalogs can be reproduced with the ETAS model, but with weaker clustering than real seismicity, i.e. with a very small average number of first-generation triggered earthquakes per mother earthquake.
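
    For readers unfamiliar with the toppling mechanism, here is a minimal sketch of the OFC rule on a square lattice (the lattice size, dissipation parameter, and open boundaries are illustrative choices, not the authors' simulation setup):

        # Minimal Olami-Feder-Christensen toppling sketch (illustrative only).
        import numpy as np

        L, ALPHA, F_TH = 32, 0.2, 1.0            # ALPHA < 0.25 -> non-conservative model
        rng = np.random.default_rng(1)
        F = rng.uniform(0.0, F_TH, size=(L, L))  # initial random forces on the blocks

        def one_earthquake(F):
            """Load uniformly to threshold, relax, and return the event size (# topplings)."""
            F += F_TH - F.max()                  # uniform driving until one block reaches F_TH
            size = 0
            while True:
                unstable = np.argwhere(F >= F_TH)
                if unstable.size == 0:
                    return size
                for i, j in unstable:
                    f = F[i, j]
                    F[i, j] = 0.0                # the block topples and relaxes completely
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < L and 0 <= nj < L:   # open boundaries: force lost at edges
                            F[ni, nj] += ALPHA * f        # only a fraction is passed on
                    size += 1

        sizes = [one_earthquake(F) for _ in range(2000)]
        print("largest synthetic event:", max(sizes), "topplings")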

    Three-dimensional echocardiography for left ventricular quantification: fundamental validation and clinical applications

    One of the earliest applications of clinical echocardiography is the evaluation of left ventricular (LV) function and size. Accurate, reproducible and quantitative evaluation of LV function and size is vital for diagnosis, treatment and prediction of prognosis in heart disease. Early three-dimensional (3D) echocardiographic techniques showed better reproducibility than two-dimensional (2D) echocardiography and narrower limits of agreement for assessment of LV function and size in comparison with reference methods, mostly cardiac magnetic resonance (CMR) imaging, but acquisition methods were cumbersome and a lack of user-friendly analysis software initially precluded widespread use. With the advent of matrix transducers enabling real-time three-dimensional echocardiography (3DE) and improvements in analysis software featuring semi-automated volumetric analysis, 3D echocardiography has evolved into a simple and fast imaging modality for everyday clinical use. 3DE makes it possible to evaluate the entire LV in three spatial dimensions during the complete cardiac cycle, offering a more accurate and complete quantitative evaluation of the LV. Improved efficiency in acquisition and analysis may provide clinicians with important diagnostic information within minutes. The current article reviews the methodology and application of 3DE for quantitative evaluation of the LV, provides the scientific evidence for its current clinical use, and discusses its current limitations and potential future directions.
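
    For context, the headline numbers in LV quantification are the end-diastolic and end-systolic volumes and the stroke volume and ejection fraction derived from them; a trivial sketch of that arithmetic (the example volumes are invented, and in practice they would come from the 3DE volumetric analysis):

        # Core LV quantification arithmetic (illustrative; example volumes are hypothetical).
        def lv_metrics(edv_ml: float, esv_ml: float) -> dict:
            """Stroke volume and ejection fraction from end-diastolic/end-systolic volumes."""
            sv = edv_ml - esv_ml                       # stroke volume (mL)
            ef = 100.0 * sv / edv_ml                   # ejection fraction (%)
            return {"stroke_volume_ml": sv, "ejection_fraction_pct": ef}

        print(lv_metrics(edv_ml=120.0, esv_ml=50.0))   # EF ~ 58.3%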

    Performance Analysis of Dataflow Architectures Using Timed Coloured Petri Nets

    We present an approach to modelling dataflow architectures at a high level of abstraction using timed coloured Petri nets, and we examine the value of Petri nets both as a modelling technique for dataflow architectures and as an analysis tool that yields valuable performance data through the execution of Petri net models. Because our aim is to use the models for performance analysis, we focus on representing the timing and communication behaviour of the architecture rather than its functionality. A modular approach is used to model architectures: we identify five basic hardware building blocks from which Petri net models of dataflow architectures can be constructed. In defining the building blocks, we identify strengths and weaknesses of Petri nets for modelling dataflow architectures. A technique called folding is applied to build generic models of dataflow architectures. A timed coloured Petri net model of the Prophid dataflow architecture, which is being developed at Philips Research Laboratories, is presented. This model has been designed in the tool ExSpect, and the performance of the Prophid architecture has been analysed by simulation with this model.
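
    To make the modelling style concrete, here is a minimal discrete-event sketch of a timed Petri net: a generic producer/buffer/consumer fragment in which each processing unit is a single-server resource modelled by cycling an "idle" token (the places, delays, and token counts are invented; this is not the Prophid or ExSpect model):

        # Minimal timed Petri net simulation sketch (illustrative only).
        import heapq

        # Tokens per place; the *_idle places make produce/consume single-server resources.
        places = {"src": 8, "producer_idle": 1, "buffer": 0, "consumer_idle": 1, "done": 0}
        transitions = [
            # (name, input places, output places, firing delay)
            ("produce", ["src", "producer_idle"], ["buffer", "producer_idle"], 2.0),
            ("consume", ["buffer", "consumer_idle"], ["done", "consumer_idle"], 3.0),
        ]

        clock, events = 0.0, []                          # events: (completion time, output places)

        def enabled(t):
            return all(places[p] >= 1 for p in t[1])

        while True:
            fired = False
            for t in transitions:
                if enabled(t):
                    for p in t[1]:
                        places[p] -= 1                             # consume input tokens now
                    heapq.heappush(events, (clock + t[3], t[2]))   # outputs appear after the delay
                    fired = True
            if not fired:
                if not events:
                    break                                          # nothing enabled, nothing in flight
                clock, outputs = heapq.heappop(events)
                for p in outputs:
                    places[p] += 1                                 # tokens arrive, may enable transitions

        print(f"all tokens processed at t={clock}: {places}")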

    One Net Fits All: A unifying semantics of Dynamic Fault Trees using GSPNs

    Dynamic Fault Trees (DFTs) are a prominent model in reliability engineering. They are strictly more expressive than static fault trees, but this comes at a price: their interpretation is non-trivial and leaves considerable freedom. This paper presents a GSPN semantics for DFTs. The semantics is simple and compositional, and its key feature is that it unifies all existing DFT semantics from the literature: all semantic variants can be obtained by choosing appropriate priorities and an appropriate treatment of non-determinism.
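
    As background on what a DFT expresses and why its interpretation is order-sensitive, here is a toy Monte Carlo reading of a few gates (the tree structure, failure rates, and mission time are invented, and this is not the paper's GSPN translation): the dynamic priority-AND gate only fails when its inputs fail in a prescribed order, which is exactly what static fault trees cannot capture.

        # Toy Monte Carlo illustration of (dynamic) fault tree gates (illustrative only).
        import random

        LAMBDA = {"A": 0.002, "B": 0.001, "C": 0.0005}   # exponential failure rates per hour
        MISSION_T = 1000.0                               # mission time in hours

        def sample_failure_times():
            return {c: random.expovariate(lam) for c, lam in LAMBDA.items()}

        def and_gate(*t):  return max(t)                 # fails once all inputs have failed
        def or_gate(*t):   return min(t)                 # fails once any input has failed
        def pand_gate(left, right):
            """Priority-AND: fails only if the inputs fail in left-to-right order."""
            return right if left <= right else float("inf")

        def system_failure_time(t):
            # Hypothetical tree: top = OR( PAND(A, B), AND(B, C) )
            return or_gate(pand_gate(t["A"], t["B"]), and_gate(t["B"], t["C"]))

        N = 100_000
        fails = sum(system_failure_time(sample_failure_times()) <= MISSION_T for _ in range(N))
        print(f"estimated unreliability at t = {MISSION_T:.0f} h: {fails / N:.4f}")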

    Left Atrial Deformation Imaging and Atrial Fibrillation in Patients with Rheumatic Mitral Stenosis

    Background: Atrial fibrillation (AF) is a frequent complication of rheumatic mitral stenosis (MS) and is associated with worse outcomes. Prediction of new-onset AF by assessing left atrial (LA) mechanics with speckle-tracking echocardiography might be useful for risk stratification and for guiding therapeutic strategies. Therefore, the aim of this study was to assess the association of LA reservoir strain (LASr) and strain rate (LASRr) with AF at follow-up in patients with rheumatic MS. Methods: LASr and LASRr measured by speckle-tracking echocardiography were assessed in 125 patients (mean age, 50 ± 15 years; 80.8% female) with rheumatic MS and without a history of AF. Patients were followed up for the occurrence of a first episode of AF after the index echocardiogram. Results: During a median follow-up of 32 (9.5-70) months, 41 patients (32.8%) developed new-onset AF. Patients who developed AF had significantly more impaired LASr (13.4% ± 5.2% vs 18.9% ± 8.2%) and LASRr (vs 0.98 ± 0.36 s-1). More impaired LASr and LASRr were independently associated with the development of AF at follow-up (hazard ratio = 7.03, 95% CI 2.08-23.77, P = .002, and hazard ratio = 3.42, 95% CI 1.59-7.34, P = .002, respectively). Conclusions: LASr and LASRr are impaired in patients with rheumatic MS, and the degree of impairment is associated with new-onset AF at follow-up.
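
    The reported hazard ratios point to a time-to-event (Cox-type) analysis; as a generic sketch of how such an association could be estimated (the data, column names, and model below are hypothetical and are not the authors' analysis), using the lifelines package:

        # Generic Cox proportional-hazards sketch (illustrative only; hypothetical data).
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(0)
        n = 125
        df = pd.DataFrame({
            "lasr_pct": rng.normal(17.0, 7.0, n),        # LA reservoir strain (%)
            "followup_months": rng.uniform(1, 90, n),    # time to AF or censoring
            "new_onset_af": rng.integers(0, 2, n),       # 1 = AF during follow-up
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="followup_months", event_col="new_onset_af")
        cph.print_summary()   # hazard ratio for lasr_pct = exp(coef)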

    Location-Aware Quality of Service Measurements for Service-Level Agreements

    We add specifications of location-aware measurements to performance models in a compositional fashion, promoting precision in performance measurement design. By using immediate actions to send control signals between measurement components, we are able to obtain more accurate measurements from our stochastic models without disturbing their structure. A software tool processes both the model and the measurement specifications to give response-time distributions and quantiles, an essential calculation in determining satisfaction of service-level agreements (SLAs).
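
    As a small illustration of the final step described here (the SLA check against a response-time distribution; the distribution, percentile, and bound below are invented), one compares a high quantile of the response-time distribution with the agreed bound:

        # Sketch of an SLA check on a response-time distribution (illustrative only).
        import numpy as np

        rng = np.random.default_rng(42)
        response_times_s = rng.gamma(shape=2.0, scale=0.15, size=10_000)  # stand-in for model output

        SLA_BOUND_S, SLA_PERCENTILE = 1.0, 95       # e.g. "95% of responses within 1 s"
        q = np.percentile(response_times_s, SLA_PERCENTILE)

        print(f"{SLA_PERCENTILE}th percentile response time: {q:.3f} s")
        print("SLA satisfied" if q <= SLA_BOUND_S else "SLA violated")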