
    Temporary epicardial cardiac resynchronisation versus conventional right ventricular pacing after cardiac surgery: study protocol for a randomised controlled trial

    Background: Heart failure patients with stable angina, acute coronary syndromes and valvular heart disease may benefit from revascularisation and/or valve surgery. However, the mortality rate is increased, at 5-30%. Biventricular pacing using temporary epicardial wires after surgery is a potential mechanism to improve cardiac function and clinical endpoints. Method/design: A multi-centre, prospective, randomised, single-blinded, intervention-control trial of temporary biventricular pacing versus standard pacing. Patients with ischaemic cardiomyopathy, valvular heart disease or both, an ejection fraction ≤ 35% and a conventional indication for cardiac surgery will be recruited from two cardiac centres. Baseline investigations will include: an electrocardiogram to confirm sinus rhythm and measure QRS duration; an echocardiogram to evaluate left ventricular function and markers of mechanical dyssynchrony; a dobutamine echocardiogram to assess viability; and blood tests for renal function and biomarkers of myocardial injury (troponin T and brain natriuretic peptide). Blood tests will be repeated at 18, 48 and 72 hours. The principal exclusions will be subjects with permanent atrial arrhythmias, permanent pacemakers, infective endocarditis or end-stage renal disease. After surgery, temporary pacing wires will be attached to the postero-lateral wall of the left ventricle, the right atrium and the right ventricle, and connected to a triple-chamber temporary pacemaker. Subjects will be randomised to receive either temporary biventricular pacing or standard pacing (atrial inhibited pacing or atrial-synchronous right ventricular pacing) for 48 hours. The primary endpoint will be the duration of level 3 care, in brief, the requirement for invasive ventilation, multi-organ support or more than one inotrope/vasoconstrictor. Haemodynamic studies will be performed at baseline and at 6, 18 and 24 hours after surgery using a pulmonary arterial catheter. Measurements will be taken in the following pacing modes: atrial inhibited; right ventricular only; atrial synchronous-right ventricular; atrial synchronous-left ventricular; and biventricular pacing. Optimisation of the atrioventricular and interventricular delays will be performed in the biventricular pacing group at 18 hours. The effect of biventricular pacing on myocardial injury, postoperative arrhythmias and renal function will also be quantified.

    Dental profile of patients with Gaucher disease

    BACKGROUND: This study was conducted to determine whether patients with Gaucher disease had significant dental pathology because of abnormal bone structure, pancytopenia, and coagulation abnormalities. METHODS: Each patient received a complete oral and periodontal examination in addition to a routine hematological evaluation. RESULTS: Gaucher patients had significantly fewer carious lesions than otherwise healthy carriers. Despite the prevalence of anemia, there was no increase in gingival disease; despite the high incidence of thrombocytopenia, gingival bleeding was not noted; and despite radiological evidence of bone involvement, there was no greater incidence of tooth loss or clinical tooth mobility. CONCLUSIONS: These data represent the first survey of the oral health of a large cohort of patients with Gaucher disease. It is a pilot study of a unique population, and the results of the investigation indicate directions for further research. Based on our findings, we recommend regular oral examinations with appropriate dental treatment for patients with Gaucher disease, as for other individuals. Consultation between the dentist and a physician, preferably one with experience with Gaucher disease, should be considered when surgical procedures are planned.

    Improving Cancer Classification Accuracy Using Gene Pairs

    Recent studies suggest that the deregulation of pathways, rather than of individual genes, may be critical in triggering carcinogenesis. Pathway deregulation is often caused by the simultaneous deregulation of more than one gene in the pathway. This suggests that robust gene pair combinations may exploit the underlying bio-molecular interactions relevant to pathway deregulation, and could therefore provide better cancer biomarkers than individual genes. To validate this hypothesis, we used gene pair combinations, called doublets, as input to cancer classification algorithms instead of the original expression values, and we showed that classification accuracy was consistently improved across different datasets and classification algorithms. We validated the proposed approach using nine cancer datasets and five classification algorithms: Prediction Analysis for Microarrays (PAM), C4.5 Decision Trees (DT), Naive Bayes (NB), Support Vector Machine (SVM), and k-Nearest Neighbor (k-NN).
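    The doublet idea described above can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: the pairwise-difference encoding of each gene pair, the random data, and the choice of k-NN (one of the five classifiers listed) are all assumptions made for the sketch.

```python
# Sketch: building "doublet" (gene-pair) features for cancer classification.
# Encoding each pair as an expression difference is an assumption for
# illustration; the paper's exact pair combination may differ.
from itertools import combinations

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))       # 60 samples x 8 genes (toy data)
y = rng.integers(0, 2, size=60)    # binary tumour labels (toy data)

def doublet_features(X):
    """Encode each gene pair (i, j) as the difference X[:, i] - X[:, j]."""
    pairs = combinations(range(X.shape[1]), 2)
    return np.column_stack([X[:, i] - X[:, j] for i, j in pairs])

Xd = doublet_features(X)           # 8 genes -> C(8, 2) = 28 doublet features
clf = KNeighborsClassifier(n_neighbors=5)
scores = cross_val_score(clf, Xd, y, cv=5)
print(Xd.shape, scores.mean())
```

    In the study's setting, the same classifier would be trained once on raw expression values and once on doublet features, and the cross-validated accuracies compared.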

    Biomarker discovery in heterogeneous tissue samples -taking the in-silico deconfounding approach

    Background: For heterogeneous tissues, such as blood, measurements of gene expression are confounded by the relative proportions of the cell types involved. Conclusions therefore rely on estimating gene expression signals for homogeneous cell populations, e.g. by applying micro-dissection, fluorescence-activated cell sorting, or in-silico deconfounding. We studied the feasibility and validity of a non-negative matrix decomposition algorithm using experimental gene expression data for blood and sorted cells from the same donor samples. Our objective was to optimise the algorithm for the detection of differentially expressed genes and to enable its use for classification in the difficult scenario of reversely regulated genes. This would be important for the identification of candidate biomarkers in heterogeneous tissues. Results: Experimental data and simulation studies involving noise parameters estimated from these data revealed that quantile normalisation and the use of non-log data are optimal for valid detection of differential gene expression. We demonstrate the feasibility of predicting the proportions of constituent cell types from the gene expression data of single samples, a prerequisite for a deconfounding-based classification approach. Classification cross-validation errors with and without deconfounding are reported, as well as sample-size dependencies. An implementation of the algorithm and the simulation and analysis scripts are available. Conclusions: The deconfounding algorithm without decorrelation, using quantile normalisation on non-log data, is proposed for biomarkers that are difficult to detect and for cases where confounding by varying proportions of cell types is the suspected cause. In this case, a deconfounding ranking approach can be used as a powerful alternative to, or complement of, other statistical learning approaches to define candidate biomarkers for molecular diagnosis and prediction in biomedicine, under realistically noisy conditions and with moderate sample sizes.
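    The core decomposition step can be sketched with a generic non-negative matrix factorisation: the observed bulk expression matrix is modelled as (cell-type signatures) x (mixing proportions). The simulated data, the use of scikit-learn's NMF, and the proportion rescaling are all assumptions of this sketch; the paper's own algorithm, normalisation, and decorrelation options differ in detail.

```python
# Sketch: in-silico deconfounding of heterogeneous-tissue expression via
# non-negative matrix factorisation (NMF). Toy data; not the paper's
# actual algorithm or normalisation pipeline.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
n_genes, n_samples, n_cell_types = 200, 30, 3

# Simulate pure cell-type signatures and per-sample mixing proportions.
signatures = rng.gamma(shape=2.0, scale=1.0, size=(n_genes, n_cell_types))
proportions = rng.dirichlet(alpha=np.ones(n_cell_types), size=n_samples).T
mixed = signatures @ proportions   # observed bulk expression (non-log)

# Factorise back: mixed ~= W @ H, with W ~ signatures, H ~ proportions.
model = NMF(n_components=n_cell_types, init="nndsvda",
            max_iter=500, random_state=0)
W = model.fit_transform(mixed)
H = model.components_

# Rescale the columns of H so estimated proportions sum to 1 per sample.
H_prop = H / H.sum(axis=0, keepdims=True)
print(W.shape, H_prop.shape)
```

    Differential expression would then be tested on the estimated cell-type signatures in W rather than on the confounded bulk measurements.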

    Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV

    The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 inverse picobarns of data collected in pp collisions at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV is above 95% over the whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV is higher than 90% over the full eta range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation.

    Azimuthal anisotropy of charged particles at high transverse momenta in PbPb collisions at sqrt(s[NN]) = 2.76 TeV

    The azimuthal anisotropy of charged particles in PbPb collisions at a nucleon-nucleon center-of-mass energy of 2.76 TeV is measured with the CMS detector at the LHC over an extended transverse momentum (pt) range up to approximately 60 GeV. The data cover both the low-pt region associated with hydrodynamic flow phenomena and the high-pt region where the anisotropies may reflect the path-length dependence of parton energy loss in the created medium. The anisotropy parameter (v2) of the particles is extracted by correlating charged tracks with respect to the event plane reconstructed using the energy deposited in forward-angle calorimeters. For the six bins of collision centrality studied, spanning the range of 0-60% most-central events, the observed v2 values are found to first increase with pt, reaching a maximum around pt = 3 GeV, and then to gradually decrease to almost zero, with the decline persisting up to at least pt = 40 GeV over the full centrality range measured.

    Search for the standard model Higgs boson in the H to ZZ to 2l 2nu channel in pp collisions at sqrt(s) = 7 TeV

    A search for the standard model Higgs boson in the H to ZZ to 2l 2nu decay channel, where l = e or mu, in pp collisions at a center-of-mass energy of 7 TeV is presented. The data were collected at the LHC, with the CMS detector, and correspond to an integrated luminosity of 4.6 inverse femtobarns. No significant excess is observed above the background expectation, and upper limits are set on the Higgs boson production cross section. The presence of the standard model Higgs boson with a mass in the 270-440 GeV range is excluded at 95% confidence level.