
    Efficacy of Different Beta-Blockers in the Treatment of Long QT Syndrome

    Background: In LQTS, β-blocker therapy is effective in reducing the risk of cardiac events (syncope, aborted cardiac arrest, sudden cardiac death). Limited studies have compared the efficacy of different β-blockers. Objectives: The goal of this study was to compare the efficacy of different β-blockers in long QT syndrome (LQTS) and in genotype-positive patients with LQT1 and LQT2. Methods: The study included 1,530 patients from the Rochester, New York–based LQTS Registry who were prescribed common β-blockers (atenolol, metoprolol, propranolol, or nadolol). Time-dependent Cox regression analyses were used to compare the efficacy of the different β-blockers with respect to the risk of cardiac events in LQTS. Results: Relative to being off β-blockers, the hazard ratios and 95% confidence intervals (CIs) for first cardiac events for atenolol, metoprolol, propranolol, and nadolol were 0.71 (0.50 to 1.01), 0.70 (0.43 to 1.15), 0.65 (0.46 to 0.90), and 0.51 (0.35 to 0.74), respectively. In LQT1, the risk reduction for first cardiac events was similar among the 4 β-blockers, but in LQT2, nadolol provided the only significant risk reduction (hazard ratio: 0.40 [0.16 to 0.98]). Among patients who had a prior cardiac event while taking β-blockers, efficacy for recurrent events differed by drug (p = 0.004), and propranolol was the least effective compared with the other β-blockers. Conclusions: Although the 4 β-blockers are equally effective in reducing the risk of a first cardiac event in LQTS, their efficacy differed by genotype; nadolol was the only β-blocker associated with a significant risk reduction in patients with LQT2. Patients experiencing cardiac events during β-blocker therapy are at high risk for subsequent cardiac events, and propranolol is the least effective drug in this high-risk group.
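
    The comparison rests on a time-dependent Cox model in which β-blocker exposure can switch on and off during follow-up. The sketch below is a minimal illustration of that kind of analysis with the lifelines library, assuming a hypothetical long-format registry extract with columns id, start, stop, event, and one indicator per drug; it is not the registry's actual analysis code.

```python
# Minimal time-dependent Cox regression sketch: each row is an interval during
# which the subject's beta-blocker exposure is constant. Column names are
# hypothetical placeholders, not the registry's real variables.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

df = pd.read_csv("lqts_registry_long.csv")  # hypothetical long-format extract

cols = ["id", "start", "stop", "event",
        "atenolol", "metoprolol", "propranolol", "nadolol"]
ctv = CoxTimeVaryingFitter()
ctv.fit(df[cols], id_col="id", start_col="start", stop_col="stop", event_col="event")

# exp(coef) for each drug column is the hazard ratio relative to off-therapy time
ctv.print_summary()
```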

    Ethics of sham surgery: Perspective of patients

    Sham surgery is used as a control condition in neurosurgical clinical trials in Parkinson's disease (PD) but remains controversial. This study aimed to assess the perspective of patients with PD and the general public on the use of sham surgery controls. We surveyed consecutive patients from a university-based neurology outpatient clinic and a community-based general internal medicine practice. Background information was provided regarding PD and two possible methods of testing the efficacy of a novel gene transfer procedure, followed by questions that addressed participants' opinions on the willingness to participate in, and the permissibility of, blinded and unblinded trial designs. Two hundred eighty-eight (57.6%) patients returned surveys. Patients with PD expressed less willingness to participate in the proposed gene transfer surgery trials. Unblinded studies received greater support, but a majority would still allow the use of sham surgery. Those in favor of sham surgery were more educated and more likely to use societal-perspective rationales. Patients with PD are more cautious about surgical research participation than patients without PD. Their policy views were similar to others', with a majority supporting the use of sham controls. Future research needs to determine whether eliciting more considered judgments of laypersons would reveal different levels of support for sham surgery. © 2007 Movement Disorder Society. Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/57916/1/21775_ftp.pd

    Utility of serum procalcitonin values in patients with acute exacerbations of chronic obstructive pulmonary disease: a cautionary note

    Background: Serum procalcitonin levels have been used as a biomarker of invasive bacterial infection and recently have been advocated to guide antibiotic therapy in patients with chronic obstructive pulmonary disease (COPD). However, rigorous studies correlating procalcitonin levels with microbiologic data are lacking. Acute exacerbations of COPD (AECOPD) have been linked to viral and bacterial infection as well as noninfectious causes. Therefore, we evaluated procalcitonin as a predictor of viral versus bacterial infection in patients hospitalized with AECOPD with and without evidence of pneumonia. Methods: Adults hospitalized during the winter with symptoms consistent with AECOPD underwent extensive testing for viral, bacterial, and atypical pathogens. Serum procalcitonin levels were measured on day 1 (admission), day 2, and at one month. Clinical and laboratory features of subjects with viral and bacterial diagnoses were compared. Results: In total, 224 subjects with COPD were admitted for 240 respiratory illnesses. Of these, 56 had pneumonia and 184 had AECOPD alone. A microbiologic diagnosis was made in 76 (56%) of 134 illnesses with reliable bacteriology (26 viral infections, 29 bacterial infections, and 21 mixed viral-bacterial infections). Mean procalcitonin levels were significantly higher in patients with pneumonia than in those with AECOPD alone. However, discrimination between viral and bacterial infection using a 0.25 ng/mL threshold for bacterial infection in patients with AECOPD was poor. Conclusion: Procalcitonin is useful in COPD patients for alerting clinicians to invasive bacterial infections such as pneumonia, but it does not distinguish bacterial from viral and noninfectious causes of AECOPD.
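
    The poor discrimination at the 0.25 ng/mL cutoff is the kind of result produced by a simple threshold evaluation. A minimal sketch of such an evaluation follows; the file and column names (pct_day1, etiology) are assumptions for illustration, not the study's actual data.

```python
# Sketch: sensitivity, specificity, and AUC for a 0.25 ng/mL procalcitonin cutoff
# separating bacterial from viral/other AECOPD. All names are hypothetical.
import pandas as pd
from sklearn.metrics import confusion_matrix, roc_auc_score

df = pd.read_csv("aecopd_procalcitonin.csv")      # hypothetical data set
y = (df["etiology"] == "bacterial").astype(int)   # 1 = bacterial, 0 = viral/other
pred = (df["pct_day1"] >= 0.25).astype(int)       # admission PCT against the cutoff

tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("AUC of continuous PCT:", roc_auc_score(y, df["pct_day1"]))
```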

    Risk Factors for Recurrent Syncope and Subsequent Fatal or Near-Fatal Events in Children and Adolescents With Long QT Syndrome

    Objectives: We aimed to identify risk factors for recurrent syncope in children and adolescents with congenital long QT syndrome (LQTS). Background: Data regarding risk assessment in LQTS after the occurrence of the first syncope episode are limited. Methods: The Prentice-Williams-Peterson conditional gap time model was used to identify risk factors for recurrent syncope from birth through age 20 years among 1,648 patients from the International Long QT Syndrome Registry. Results: Multivariate analysis demonstrated that corrected QT interval (QTc) duration (≥500 ms) was a significant predictor of a first syncope episode (hazard ratio: 2.16), whereas the QTc effect was attenuated when the end points of the second, third, and fourth syncope episodes were evaluated (hazard ratios: 1.29, 0.99, and 0.90, respectively; p < 0.001 for the null hypothesis that all 4 hazard ratios are identical). A genotype-specific subanalysis showed that during childhood (0 to 12 years), males with LQTS type 1 had the highest rate of a first syncope episode (p = 0.001) but exhibited rates of subsequent events similar to those of other genotype-sex subsets (p = 0.63). In contrast, in the age range of 13 to 20 years, females with LQTS type 2 experienced the highest rate of both first and subsequent syncope events (p < 0.001 and p = 0.01, respectively). Patients who experienced ≥1 episode of syncope had a 6- to 12-fold (p < 0.001 for all) increase in the risk of subsequent fatal/near-fatal events, independently of QTc duration. Beta-blocker therapy was associated with a significant reduction in the risk of recurrent syncope and of subsequent fatal/near-fatal events. Conclusions: Children and adolescents who present after an episode of syncope should be considered at high risk of subsequent syncope episodes and fatal/near-fatal events regardless of QTc duration.
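
    The Prentice-Williams-Peterson (PWP) conditional gap-time model is, in practice, a Cox model on the times between successive events, stratified by event order so that a subject is only at risk for the k-th syncope after experiencing the (k-1)-th. A minimal sketch of that structure with lifelines is below; the data frame and column names are hypothetical, not the registry's variables.

```python
# PWP conditional gap-time sketch: one row per subject per at-risk gap,
# Cox model stratified by episode number. Column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("lqts_recurrent_syncope.csv")   # hypothetical data set
# gap_time: time since the previous syncope (or since birth for episode 1)
# event:    1 if syncope observed during the gap, 0 if censored
# episode:  1, 2, 3, ... (order of the at-risk episode)

cols = ["gap_time", "event", "episode", "qtc_ge_500", "female", "beta_blocker"]
cph = CoxPHFitter()
cph.fit(df[cols], duration_col="gap_time", event_col="event", strata=["episode"])
cph.print_summary()   # shared covariate effects, separate baseline hazard per episode
```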

    Survival Methods


    Subset Selection in Explanatory Regression Analyses

    We present a new, data-driven method for automatically choosing a good subset of potential confounding variables to include in an explanatory linear regression model. This same model selection scheme can also be used in a less focused analysis simply to identify those variables that are jointly related to the response. Our procedure differs from most existing subset selection procedures in that it is directly aimed at finding a model family that gives rise to good estimates of specified coefficients rather than of predicted values. The performance of our method is demonstrated in a simulation study where interest is focused on estimating a particular conditional association. The relative impact of selection bias due to dual use of the data, both to select the subset and to estimate the regression parameters, is also investigated.
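
    To make the setting concrete, a common and much simpler confounder-screening heuristic against which such a method would naturally be compared is the change-in-estimate rule: keep a candidate covariate only if adding it moves the coefficient of interest by more than some percentage. The sketch below illustrates that baseline rule, not the procedure proposed in the paper; the data and variable names are made up.

```python
# Change-in-estimate confounder screening (an illustrative baseline, not the
# paper's method): retain covariates whose inclusion shifts the exposure
# coefficient by more than 10%. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("simulated_study.csv")          # hypothetical data set
exposure, candidates = "x", ["z1", "z2", "z3", "z4"]

base = smf.ols(f"y ~ {exposure}", data=df).fit()
kept = []
for z in candidates:
    rhs = " + ".join([exposure] + kept + [z])
    fit = smf.ols(f"y ~ {rhs}", data=df).fit()
    rel_change = abs(fit.params[exposure] - base.params[exposure]) / abs(base.params[exposure])
    if rel_change > 0.10:        # 10% change-in-estimate rule of thumb
        kept.append(z)
        base = fit               # update the reference estimate

print("selected confounders:", kept)
print("adjusted coefficient for", exposure, ":", base.params[exposure])
```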

    Local Cross-Validated Smoothing Parameter Estimation for Linear Smoothers

    Thesis (Ph.D.)--University of Rochester. School of Medicine & Dentistry. Dept. of Biostatistics & Computational Biology, 2017.

    Nonparametrically estimating a regression function with varying degrees of smoothness or heteroscedasticity can benefit from a smoother that uses a data-adaptive smoothing parameter function to efficiently capture local features. Leave-one-out cross-validation (LOO CV) has been used to select global smoothing parameters, as it is expected to estimate the true mean integrated squared error (MISE), but it often leads to undersmoothing in cases with sharp changes in smoothness and heteroscedasticity. Oracle simulations show that simply moving from a globally chosen to a locally chosen smoothing parameter yields a reduction in MISE. We explore LOO CV as a method of estimating the mean squared error as a function of the point of estimation, MSE(x), in order to estimate a smoothing parameter function. We identify a relationship between the squared leave-one-out cross-validated residuals (SLOORs) and MSE(x) for general linear smoothers. We use this identity to estimate MSE(x) and obtain improved smoothing parameter function estimates. This thesis presents a portfolio of smoothers based on local polynomials and natural cubic smoothing splines that estimate and use a data-adaptive smoothing parameter function by employing local cross-validation (LCV). Data are locally weighted by a proposed truncated Gaussian kernel function with sample-size-adaptive truncation thresholds. The proposed local cross-validated polynomial smoothing algorithm (LCVPoly) estimates and uses an adaptive bandwidth function for any specified polynomial order. LCVPoly can further select the preferred global polynomial order, and adaptive orders are explored to permit greater flexibility. The relationship of the variance function estimation problem to the mean function estimation problem is evident in the SLOOR-MSE identity. These methods require only the specification of bandwidth bounds and polynomial orders. Available methods intended to handle underlying functions of varying smoothness are reviewed as competitors to our proposed algorithms. While local polynomials use both the bandwidth and the polynomial order to control smoothness, smoothing splines use a single smoothing parameter. Because of this, we propose a single version of our local cross-validated spline (LCVSpline) smoothing algorithm to estimate and use an adaptive degree-of-freedom function. As smoothing splines are linear smoothers, the SLOOR-MSE relationship holds here as well, and we can use the result for degree-of-freedom function estimation. Electrocardiograms (ECGs) measured over a 24-hour period are heteroscedastic and can be very noisy, which can mask short-term cardiovascular events of interest. This type of data can benefit from a smoother that can pick up both short-term events and long-term changes while appropriately smoothing out the noise. Current techniques to smooth ECG data use a moving median smoother with no guide on the size of the moving window. We show how our proposed methods and other available methods perform on a data set of over 80,000 heart inter-beat intervals. In addition to this data set, we also apply our methods to the well-known motorcycle acceleration data set typically used to demonstrate spatially adaptive smoothers.
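
    The SLOOR-MSE identity mentioned above builds on a standard fact about linear smoothers: if the fitted values are y_hat = H y, then the leave-one-out residual at x_i is (y_i - y_hat_i) / (1 - H_ii), so all squared LOO residuals can be computed from a single fit. The sketch below illustrates this shortcut for a Gaussian-kernel local-linear smoother on simulated data; it is an illustration of the identity, not the thesis's LCVPoly algorithm, and a local criterion would average the SLOORs near each estimation point rather than globally.

```python
# Leave-one-out shortcut for a linear smoother y_hat = H y:
# LOO residual_i = (y_i - y_hat_i) / (1 - H_ii). Squaring gives the SLOORs
# used to compare candidate bandwidths. Simulated data for illustration only.
import numpy as np

def local_linear_hat_matrix(x, h):
    """Hat matrix of a Gaussian-kernel local-linear smoother with bandwidth h."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        w = np.exp(-0.5 * ((x - x[i]) / h) ** 2)         # kernel weights at x_i
        X = np.column_stack([np.ones(n), x - x[i]])      # local design, centered at x_i
        WX = w[:, None] * X
        smoother_rows = np.linalg.solve(X.T @ WX, WX.T)  # (X'WX)^{-1} X'W
        H[i] = smoother_rows[0]                          # fit at x_i = local intercept
    return H

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(8 * x) + rng.normal(0, 0.3, size=200)

for h in (0.02, 0.05, 0.10):
    H = local_linear_hat_matrix(x, h)
    resid = y - H @ y
    sloor = (resid / (1 - np.diag(H))) ** 2              # squared LOO residuals
    print(f"h = {h:.2f}: global CV score = {sloor.mean():.4f}")
```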

    Subset Selection for High-Dimensional Data, with Applications to Gene Array Data

    Thesis (Ph.D.)--University of Rochester. School of Medicine and Dentistry. Dept. of Biostatistics and Computational Biology, 2008.

    Identifying the genes that are differentially expressed in individuals with cancer could lead to new avenues of treatment or prevention. Gene array information can be collected for an individual and paired with information about a specific outcome. We consider binary outcomes and censored time-to-event survival outcomes. Improved construction of prognostic gene signatures (risk scores based on groups of genes that can relatively accurately predict an individual's outcome or risk thereof) would be a step forward in gene array research. We consider the problem of selecting which variables (genes) to include in a prognostic gene signature with high-dimensional data in the logistic regression and Cox proportional hazards model frameworks. Constructing a prognostic gene signature from gene array data is very difficult because the number of candidate genes greatly outnumbers the sample size available in most studies. In this case, commonly used statistical techniques perform poorly at choosing genes that accurately predict the endpoint of interest. We propose a search algorithm that lies between forward selection and searching over all subsets. This method uses an evolving-subgroup paradigm to intelligently select variables over a relatively large model space. We show that our method, applied to the Cox proportional hazards model for censored survival time data or the logistic regression model for binary data, can yield significant improvements in the number of predictive variables selected and in the prediction error compared with the LASSO method, forward selection, and univariate selection, as demonstrated in a simulation study. We also investigate validation methods. Censored survival time data confound many of the common validation methods because the outcome is not always observed, and splitting a data set can lead to a lack of uncensored observations in some partitions. We pursue validation with the goal of selecting the ideal number of variables to include. We propose a robust version of partial likelihood cross-validation and modified versions of the c-index as metrics to select model complexity, along with analogous methods in the logistic model setting. We apply our new methods to publicly available lymphoma gene array data (Rosenwald et al., 2002) with 7,399 genes and 240 individuals. After splitting the full data into training and validation portions, running our method on the training data set leads to predictive results in the validation data set.
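
    As a point of reference for the search strategies the thesis compares against, plain greedy forward selection for a Cox model, scored by the concordance index on a held-out split, can be sketched as follows. This is one of the baseline comparators (forward selection), not the evolving-subgroup algorithm itself, and the file, column names, and split sizes are hypothetical.

```python
# Greedy forward selection of genes for a Cox model, scored by the validation
# c-index. Baseline comparator only; data and dimensions are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

df = pd.read_csv("lymphoma_expression.csv")       # hypothetical: time, event, g0001, ...
genes = [c for c in df.columns if c.startswith("g")]
train, valid = df.iloc[:160], df.iloc[160:]       # hypothetical train/validation split

selected, best_c = [], 0.5
for _ in range(10):                               # cap the signature at 10 genes
    scores = {}
    for g in genes:
        if g in selected:
            continue
        cols = ["time", "event"] + selected + [g]
        try:
            cph = CoxPHFitter(penalizer=0.1).fit(train[cols], "time", "event")
        except Exception:
            continue                              # skip genes that fail to converge
        risk = -cph.predict_partial_hazard(valid) # higher score = longer survival
        scores[g] = concordance_index(valid["time"], risk, valid["event"])
    gene, c = max(scores.items(), key=lambda kv: kv[1])
    if c <= best_c:
        break                                     # stop when the validation c-index stalls
    selected.append(gene)
    best_c = c

print("selected genes:", selected, "| validation c-index:", round(best_c, 3))
```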

    Smooth Estimation and Inference with Interval Censored Data

    In many biostatistical applications one is concerned with estimating the distribution of a survival time T. This time T is interval censored if its current status is observed at several monitoring times, so that one observes an interval containing T. Construction of confidence intervals for smooth functionals of the survival function, and of smooth estimators with confidence limits for the survival function itself, are open problems. In this paper we provide solutions to these problems, assuming that the observed monitoring times are independent of T. Our proposed smooth estimator is a standard univariate kernel regression smoother applied to the pooled sample of N dependent current status observations. The proposed kernel smoother converges at the optimal rate to a normal distribution identical to that of a kernel smoother applied to N independent and identically distributed current status observations with the same set of monitoring times. We show that existing bandwidth selection techniques and confidence limit procedures for standard nonparametric regression yield the correct answers despite the dependence in the pooled sample. Our results give rise to the practical insight that, for large n, each additional monitoring time for subjects already monitored carries as much as or more information for estimation of the survival function than a monitoring time for a new subject. We provide a simple method for constructing confidence intervals for smooth functionals of the survival function. A simulation demonstrates the excellent performance of these confidence intervals, and we apply the method to analyze the HIV seroconversion distribution for a cohort of hemophiliacs.
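
    The estimator described above exploits the fact that for a current status observation (C, delta) with delta = 1{T <= C} and C independent of T, E[delta | C = t] = F(t), so a kernel regression of the indicators on the pooled monitoring times estimates the distribution function of T directly. The sketch below illustrates that idea with a simple Nadaraya-Watson smoother on simulated data with one monitoring time per subject; it is an illustration of the general idea, not the paper's estimator or bandwidth selector.

```python
# Current status data: monitoring times C and indicators delta = 1{T <= C}.
# Since E[delta | C = t] = F(t), smoothing delta against C estimates F.
# Simulated data for illustration only.
import numpy as np

def nadaraya_watson(c, delta, grid, h):
    """Gaussian-kernel regression of delta on c, evaluated on grid."""
    K = np.exp(-0.5 * ((grid[:, None] - c[None, :]) / h) ** 2)
    return (K @ delta) / K.sum(axis=1)

rng = np.random.default_rng(1)
T = rng.exponential(2.0, size=500)          # unobserved event times
C = rng.uniform(0, 6, size=500)             # monitoring times, independent of T
delta = (T <= C).astype(float)              # current status indicators

grid = np.linspace(0.2, 5.8, 50)
F_hat = nadaraya_watson(C, delta, grid, h=0.4)   # estimate of F(t) = P(T <= t)
F_true = 1.0 - np.exp(-grid / 2.0)
print("max abs error:", np.abs(F_hat - F_true).max())
```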

    Can Analysis of Routine Viral Testing Provide Accurate Estimates of Respiratory Syncytial Virus Disease Burden in Adults?

    Respiratory syncytial virus (RSV) is increasingly recognized as a significant cause of adult respiratory illness. We evaluated routine viral testing and discharge diagnoses for identifying RSV and influenza burden. Polymerase chain reaction results from testing performed in adults during emergency room visits or hospitalizations were reviewed. Peak RSV activity preceded influenza activity by 8 weeks. The ratio of the total number of viral tests performed to the total number of respiratory visits was higher during influenza peaks than during RSV peaks (1.31 vs 0.72; P = .0001). Influenza and RSV were listed as primary diagnoses in 56 (30%) vs 7 (6%) cases, respectively (P < .0001). Routine viral testing to estimate adult RSV disease burden has limitations.