
    Colonic volume in patients with functional constipation or irritable bowel syndrome determined by magnetic resonance imaging

    BACKGROUND: Functional constipation (FC) and irritable bowel syndrome constipation type (IBS‐C) share many similarities, and it remains unknown whether they are distinct entities or part of the same spectrum of disease. Magnetic resonance imaging (MRI) allows quantification of intraluminal fecal volume. We hypothesized that colonic volumes of patients with FC would be larger than those of patients with IBS‐C, and that both patient groups would have larger colonic volumes than healthy controls (HC). METHODS: Based on validated questionnaires, participants were classified into three groups: FC (n = 13), IBS‐C (n = 10), and HC (n = 19). The colonic volume of each subject was determined by MRI. Stool consistency was described by the Bristol stool scale, and colonic transit times were assessed with radiopaque markers. KEY RESULTS: Overall, total colonic volumes differed between the three groups: HC (median 629 ml, interquartile range (IQR) 562–868), FC (864 ml, IQR 742–940), and IBS‐C (520 ml, IQR 489–593) (p = 0.001). Patients with IBS‐C had lower colonic volumes than patients with FC (p = 0.001) and HC (p = 0.019), but there was no difference between FC and HC (p = 0.10). Stool consistency was similar in the two patient groups, but patients with FC had longer colonic transit times than those with IBS‐C (117.6 h versus 43.2 h, p = 0.019). CONCLUSION: Patients with IBS‐C have lower total colonic volumes and shorter colonic transit times than patients with FC. Future studies are needed to confirm that colonic volume allows objective distinction between the two conditions.
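    The results above report medians with interquartile ranges and pairwise p-values across three groups; data of that kind are typically compared with nonparametric tests. The sketch below, assuming SciPy and using illustrative placeholder volumes rather than the study's data, shows how a Kruskal-Wallis test across the three groups followed by pairwise Mann-Whitney U comparisons could be run.

```python
# Hypothetical sketch of a nonparametric group comparison (Kruskal-Wallis across
# HC, FC and IBS-C, then pairwise Mann-Whitney U tests). The volume lists below
# are illustrative placeholders, not the study data.
from scipy.stats import kruskal, mannwhitneyu

hc_ml   = [562, 590, 612, 629, 700, 780, 868]   # healthy controls, total colonic volume (ml)
fc_ml   = [742, 810, 864, 905, 940]             # functional constipation
ibsc_ml = [489, 505, 520, 560, 593]             # IBS, constipation type

h_stat, p_overall = kruskal(hc_ml, fc_ml, ibsc_ml)
print(f"Kruskal-Wallis across the three groups: H = {h_stat:.2f}, p = {p_overall:.3f}")

# Pairwise comparisons (two-sided), mirroring the FC vs IBS-C and HC vs IBS-C contrasts
pairs = {"FC vs IBS-C": (fc_ml, ibsc_ml),
         "HC vs IBS-C": (hc_ml, ibsc_ml),
         "FC vs HC":    (fc_ml, hc_ml)}
for name, (a, b) in pairs.items():
    u_stat, p = mannwhitneyu(a, b, alternative="two-sided")
    print(f"{name}: U = {u_stat:.1f}, p = {p:.3f}")
```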

    Danish study of Non-Invasive testing in Coronary Artery Disease (Dan-NICAD):study protocol for a randomised controlled trial

    BACKGROUND: Coronary computed tomography angiography (CCTA) is an established method for ruling out coronary artery disease (CAD). Most patients referred for CCTA do not have CAD, and only approximately 20–30 % of patients are subsequently referred for further testing by invasive coronary angiography (ICA) or non-invasive perfusion evaluation due to suspected obstructive CAD. In cases with severe calcifications, a discrepancy between CCTA and ICA often occurs, leading to the well-described low diagnostic specificity of CCTA. As ICA is costly and involves a risk of complications, an optimized algorithm would be valuable and could decrease the number of ICAs that do not lead to revascularization. The primary objective of the Dan-NICAD study is to determine the diagnostic accuracy of cardiac magnetic resonance imaging (CMRI) and myocardial perfusion scintigraphy (MPS) as secondary tests after a primary CCTA where CAD could not be ruled out. The secondary objective is to evaluate the diagnostic precision of an acoustic technology that analyses the sound of coronary blood flow; it may potentially provide better stratification prior to CCTA than clinical risk stratification scores alone. METHODS/DESIGN: Dan-NICAD is a multi-centre, randomised, cross-sectional trial that will include approximately 2,000 patients without known CAD, referred for CCTA due to a history of symptoms suggestive of CAD and a low-risk to intermediate-risk profile, as evaluated by a cardiologist. Patient interview, sound recordings, and blood samples are obtained in connection with the CCTA. All patients with suspected obstructive CAD by CCTA are randomised to either stress CMRI or stress MPS, followed by ICA with fractional flow reserve (FFR) measurements. Obstructive CAD is defined as an FFR below 0.80 or as high-grade stenosis (>90 % diameter stenosis) by visual assessment. Diagnostic performance is evaluated as sensitivity, specificity, predictive values, likelihood ratios, and C statistics. Enrolment commenced in September 2014 and is expected to be complete in May 2016. DISCUSSION: Dan-NICAD is designed to assess whether a secondary perfusion examination after CCTA could safely reduce the number of ICAs where revascularization is not required. The results are expected to add knowledge about the optimal algorithm for diagnosing CAD. TRIAL REGISTRATION: Clinicaltrials.gov identifier, NCT02264717. Registered on 26 September 2014. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1186/s13063-016-1388-z) contains supplementary material, which is available to authorized users.
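    For reference, the diagnostic-performance measures named in the protocol (sensitivity, specificity, predictive values, and likelihood ratios) all follow from a 2x2 table of index-test result against the FFR/ICA reference standard. The sketch below is a minimal illustration with made-up counts; it is not part of the trial's analysis plan.

```python
# Hedged illustration of the diagnostic-performance metrics named above, computed
# from a 2x2 table of test result vs. the FFR/ICA reference standard. Counts are made up.
def diagnostic_performance(tp: int, fp: int, fn: int, tn: int) -> dict:
    sens = tp / (tp + fn)        # sensitivity: diseased patients correctly detected
    spec = tn / (tn + fp)        # specificity: non-diseased patients correctly ruled out
    ppv  = tp / (tp + fp)        # positive predictive value
    npv  = tn / (tn + fn)        # negative predictive value
    lr_pos = sens / (1 - spec)   # positive likelihood ratio
    lr_neg = (1 - sens) / spec   # negative likelihood ratio
    return {"sensitivity": sens, "specificity": spec, "PPV": ppv,
            "NPV": npv, "LR+": lr_pos, "LR-": lr_neg}

# Example with arbitrary counts (index test positive/negative vs. obstructive CAD by FFR < 0.80)
print(diagnostic_performance(tp=80, fp=30, fn=10, tn=180))
```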

    Three dimensional three component whole heart cardiovascular magnetic resonance velocity mapping: comparison of flow measurements from 3D and 2D acquisitions

    BACKGROUND: Two-dimensional, unidirectionally encoded, cardiovascular magnetic resonance (CMR) velocity mapping is an established technique for the quantification of blood flow in large vessels. However, it requires an operator to correctly align the planes of acquisition. If all three directional components of velocity are measured for each voxel of a 3D volume through the phases of the cardiac cycle, blood flow through any chosen plane can potentially be calculated retrospectively. The initial acquisition is then more time consuming but relatively operator independent. AIMS: To compare the curves and volumes of flow derived from conventional 2D and comprehensive 3D flow acquisitions in a steady-state flow model, and in vivo through planes transecting the ascending aorta and pulmonary trunk in 10 healthy volunteers. METHODS: Using a 1.5 T Philips Intera CMR system, the 3D acquisitions used an anisotropic 3D segmented k-space phase contrast gradient echo sequence with a short EPI readout, with prospective ECG and diaphragm navigator gating. The 2D acquisitions used segmented k-space phase contrast with prospective ECG and diaphragm navigator gating. Quantitative flow analyses were performed retrospectively with dedicated software for both the in vivo and in vitro acquisitions. RESULTS: Analysis of in vitro data found the 3D technique to have overestimated the continuous flow rate by approximately 5% across the entire applied flow range. In vivo, the 2D and 3D techniques yielded similar volumetric flow curves and measurements. Aortic flow (mean ± SD): 2D = 89.5 ± 13.5 ml and 3D = 92.7 ± 17.5 ml. Pulmonary flow: 2D = 98.8 ± 18.4 ml and 3D = 94.9 ± 19.0 ml. Each in vivo 3D acquisition took about 8 minutes or more. CONCLUSION: Flow measurements derived from the 3D and 2D acquisitions were comparable. Although time consuming, comprehensive 3D velocity acquisition could be relatively operator independent and could potentially yield information on flow through several retrospectively chosen planes, for example in patients with congenital or valvular heart disease.
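    As a rough illustration of how flow through a retrospectively chosen plane can be quantified from velocity data (through-plane velocity summed over the lumen, scaled by pixel area, then integrated over the cardiac cycle), the sketch below uses NumPy and a synthetic velocity field. The array shapes, pixel size, lumen mask, and waveform are assumptions for illustration, not details of the acquisitions or analysis software described above.

```python
# Minimal sketch of retrospective flow quantification, assuming the through-plane
# velocity component has already been resampled onto the chosen analysis plane.
# All dimensions and the synthetic data below are assumptions for illustration.
import numpy as np

def flow_curve(v_through_plane, mask, pixel_area_cm2):
    """Volumetric flow Q(t) in ml/s: velocity (cm/s) times pixel area (cm^2), summed over the lumen."""
    # v_through_plane: (n_phases, ny, nx) in cm/s; mask: (ny, nx) boolean lumen segmentation
    return (v_through_plane * mask).sum(axis=(1, 2)) * pixel_area_cm2

def stroke_volume(q_ml_per_s, rr_interval_s):
    """Approximate the integral of Q(t) over one cardiac cycle (mean flow x cycle length)."""
    return float(np.mean(q_ml_per_s)) * rr_interval_s

# Synthetic example: 30 cardiac phases, 64x64 plane, 1.5 x 1.5 mm pixels
n_phases, ny, nx = 30, 64, 64
pixel_area_cm2 = 0.15 * 0.15
rng = np.random.default_rng(0)
mask = np.zeros((ny, nx), dtype=bool)
mask[28:36, 28:36] = True                      # crude square "vessel lumen"
# Crude systolic waveform: forward flow in the first half of the cycle only
pulsatile = np.clip(np.sin(np.linspace(0, 2 * np.pi, n_phases, endpoint=False)), 0, None)
v = pulsatile[:, None, None] * 120.0 * mask + rng.normal(0, 1, (n_phases, ny, nx))

q = flow_curve(v, mask, pixel_area_cm2)
print(f"Stroke volume ~ {stroke_volume(q, rr_interval_s=1.0):.1f} ml per beat")
```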

    Abstracts from the Food Allergy and Anaphylaxis Meeting 2016


    Sequential Design Process for Screening and Optimization of Robustness and Reliability Based on Finite Element Analysis and Meta-Modelling

    A new medical device can take years to develop from early concept to product launch. Three approaches are often combined to mitigate risks: Failure Modes and Effects Analysis (FMEA), simulation and modeling, and physical test programs. Although widely used, all three approaches are generally time-consuming and have their shortcomings: the risk probabilities in FMEAs are often based on educated guesses, even in later development stages, because data on the distribution of performance are not available. The traditional use of safety factors in structural analysis therefore sits poorly with the probabilistic approach to risk management. As a result, these three approaches are not ideal for addressing the design engineer's key question: how should the design be changed to improve robustness and failure rates? The present work builds upon existing Robust and Reliability-Based Design Optimization (R2BDO) and adjusts it to address this question using Finite Element Analysis (FEA). The two main features of the presented framework are screening feasible design concepts early in the embodiment phase and subsequently optimizing the design's probabilistic performance (i.e., reducing failure rates) while using minimal computational resources. A case study in collaboration with a medical design and manufacturing company demonstrates the new framework. The optimization minimizes the failure rate (and improves design robustness) with respect to three constraint functions (torque, strain, and contact pressure). Furthermore, the study finds that the new framework significantly improves the design's performance function (failure rate) with limited computational resources.
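    To make the idea concrete, the sketch below shows one way a failure rate can be estimated by Monte Carlo sampling over cheap surrogate (meta-model) constraint functions, in the spirit of the screening-and-optimization loop described above. The surrogate functions, limits, and input scatter are invented placeholders standing in for FEA-trained meta-models; they are not the paper's models or data.

```python
# Conceptual sketch: estimate a failure rate by Monte Carlo sampling over cheap
# surrogate constraint functions (stand-ins for FEA-trained meta-models). All
# functions, limits and distributions below are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)

def surrogate_torque(x):    # placeholder meta-model: predicted torque (Nmm)
    return 120.0 + 15.0 * x[:, 0] - 8.0 * x[:, 1]

def surrogate_strain(x):    # placeholder meta-model: predicted peak strain (%)
    return 1.2 + 0.4 * x[:, 0] ** 2 + 0.1 * x[:, 1]

def surrogate_pressure(x):  # placeholder meta-model: predicted contact pressure (MPa)
    return 3.0 + 0.6 * x[:, 0] + 0.9 * x[:, 1] ** 2

def failure_rate(mean, std, n=200_000):
    """Sample design-parameter scatter and count samples violating any constraint."""
    x = rng.normal(mean, std, size=(n, 2))      # two noisy design/process parameters
    fails = (
        (surrogate_torque(x) > 160.0)
        | (surrogate_strain(x) > 2.0)
        | (surrogate_pressure(x) > 5.0)
    )
    return fails.mean()

# Compare two candidate nominal designs: an optimizer would search over the nominal
# values (and possibly tolerances) to drive the estimated failure rate down.
for nominal in ([0.0, 0.0], [-0.5, 0.3]):
    print(nominal, f"estimated failure rate = {failure_rate(nominal, std=[0.3, 0.3]):.4f}")
```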