18 research outputs found

    X chromosome inactivation does not necessarily determine the severity of the phenotype in Rett syndrome patients

    Rett syndrome (RTT) is a severe neurological disorder usually caused by mutations in the MECP2 gene. Since the MECP2 gene is located on the X chromosome, X chromosome inactivation (XCI) could play a role in the wide range of phenotypic variation among RTT patients; however, classical methylation-based protocols for evaluating XCI cannot determine whether the preferentially inactivated X chromosome carries the mutant or the wild-type allele. We therefore developed an allele-specific methylation-based assay to evaluate methylation at the loci of several recurrent MECP2 mutations. We analyzed the XCI patterns in the blood of 174 RTT patients but did not find a clear correlation between XCI and clinical presentation. We also compared XCI in blood and brain cortex samples from two patients and found differences between the XCI patterns in these tissues. However, because RTT is primarily a neurological disease, it is difficult to establish a correlation between XCI in blood and the patients' clinical presentation. Furthermore, we analyzed MECP2 transcript levels and found that they differed from the levels expected according to XCI. Many factors other than XCI could affect the RTT phenotype, and in combination these could influence the clinical presentation of RTT patients more than slight variations in the XCI pattern.

    Codazzi fields on surfaces immersed in Euclidean 4-space


    MDR1 C3435T polymorphism in Mexican children with acute lymphoblastic leukemia and in healthy individuals

    To determine the influence of the MDR1 C3435T polymorphism on the development of childhood acute lymphoblastic leukemia (ALL), we studied 107 children with ALL and 111 healthy subjects. All subjects were genotyped for the C3435T polymorphism using the polymerase chain reaction-restriction fragment length polymorphism method. The genotype frequencies in the patients were 17% homozygous CC, 61% heterozygous CT, and 22% homozygous TT; in healthy individuals the genotype frequencies were 14% CC, 53% CT, and 33% TT. In patients with ALL the allele frequencies were 0.47 for the C allele and 0.53 for the T allele; in the healthy group these allele frequencies were 0.40 and 0.60 for the C and T alleles, respectively. No significant differences in allele frequency (p > 0.176) or genotype frequency (p > 0.255) were detected between the two groups. These findings suggest that the CT or TT genotype does not increase the risk for childhood ALL in Mexican patients. On the other hand, significant differences in allele frequencies were detected in the comparison of Mexican healthy subjects with other populations. Whether these differences are fortuitous or related to diverse genetic backgrounds remains to be elucidated. © 2008 Wayne State University Press
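The allele frequencies above follow directly from the genotype frequencies: each CC individual carries two C alleles and each CT heterozygote carries one. A minimal sketch of that arithmetic, using the rounded percentages from the abstract (the authors would have counted alleles in the raw genotype data):

```python
def allele_freqs(f_cc, f_ct, f_tt):
    """Estimate allele frequencies from genotype frequencies:
    freq(C) = freq(CC) + freq(CT)/2, and likewise for T."""
    return f_cc + f_ct / 2, f_tt + f_ct / 2

# ALL patients: 17% CC, 61% CT, 22% TT -> C ~0.475, T ~0.525
# (close to the reported 0.47 / 0.53, which came from raw counts)
c_all, t_all = allele_freqs(0.17, 0.61, 0.22)

# Healthy controls: 14% CC, 53% CT, 33% TT -> C ~0.405, T ~0.595
# (close to the reported 0.40 / 0.60)
c_ctl, t_ctl = allele_freqs(0.14, 0.53, 0.33)
```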

    Improving coeliac disease risk prediction by testing non-HLA variants additional to HLA variants

    Background: The majority of coeliac disease (CD) patients are not properly diagnosed and therefore remain untreated, leading to a greater risk of developing CD-associated complications. The major genetic risk heterodimers, HLA-DQ2 and HLA-DQ8, are already used clinically to help exclude disease. However, approximately 40% of the population carry these alleles and the majority never develop CD. Objective: We explored whether CD risk prediction can be improved by adding non-HLA susceptibility variants to common HLA testing. Design: We developed an average weighted genetic risk score with 10, 26 and 57 single nucleotide polymorphisms (SNPs) in 2675 cases and 2815 controls and assessed the improvement in risk prediction provided by the non-HLA SNPs. Moreover, we assessed the transferability of the genetic risk model with 26 non-HLA variants to a nested case-control population (n=1709) and a prospective cohort (n=1245) and then tested how well this model predicted CD outcome for 985 independent individuals. Results: Adding 57 non-HLA variants to HLA testing showed a statistically significant improvement compared with scores from models based on HLA only, HLA plus 10 SNPs, and HLA plus 26 SNPs. With 57 non-HLA variants, the area under the receiver operating characteristic curve reached 0.854, compared with 0.823 for HLA only, and 11.1% of individuals were reclassified to a more accurate risk group. We show that the risk model with HLA plus 26 SNPs is useful in independent populations. Conclusions: Predicting risk with 57 additional non-HLA variants improved the identification of potential CD patients. This demonstrates a possible role for combined HLA and non-HLA genetic testing in the diagnostic work-up for CD.
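A common formulation of an average weighted genetic risk score of the kind described above sums each SNP's risk-allele dosage (0, 1 or 2) times the log odds ratio of its risk allele, then averages over the number of SNPs. A sketch under that assumption (the paper's exact weighting may differ, and the dosages and odds ratios below are purely illustrative):

```python
import math

def weighted_grs(dosages, odds_ratios):
    """Average weighted genetic risk score: mean over SNPs of
    (risk-allele dosage) * log(odds ratio of the risk allele)."""
    weights = [math.log(o) for o in odds_ratios]
    return sum(d * w for d, w in zip(dosages, weights)) / len(weights)

# Hypothetical 3-SNP individual (numbers are made up for illustration):
score = weighted_grs([2, 1, 0], [1.3, 1.2, 1.1])
```

In practice such a score is computed for every individual and entered alongside HLA carrier status into the prediction model; the non-HLA variants shift individuals within the broad HLA risk strata.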

    Impact of sex, age, and risk factors for venous thromboembolism on the initial presentation of first isolated symptomatic acute deep vein thrombosis

    Thrombosis and Hemostasis

    Outcomes in Neurosurgical Patients Who Develop Venous Thromboembolism

    OBJECTIVES: The Registro Informatizado de Enfermedad TromboEmbólica (RIETE) database was used to investigate whether neurosurgical patients with venous thromboembolism (VTE) were more likely to die of bleeding or of VTE, and how anticoagulation influenced these outcomes. METHODS: Clinical characteristics, treatment details, and 3-month outcomes were assessed in patients who developed VTE after neurosurgery. RESULTS: Of 40 663 patients enrolled, 392 (0.96%) had VTE within 60 days after neurosurgery. Most patients in the cohort (89%) received initial therapy with low-molecular-weight heparin (33% received subtherapeutic doses). In the first week, 10 (2.6%) patients died (8 with pulmonary embolism [PE], no bleeding deaths; P = .005). After the first week, 20 (5.1%) patients died (2 with fatal bleeding, none from PE). Overall, this cohort was more likely to develop a fatal PE than a fatal bleed (8 vs 2 deaths, P = .058). CONCLUSIONS: Neurosurgical patients who developed VTE were more likely to die from PE than from bleeding in the first week, despite anticoagulation.
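The reported proportions follow from the raw counts in the abstract; a quick arithmetic check:

```python
# Counts taken directly from the abstract.
enrolled = 40663
vte_cases = 392
deaths_first_week = 10
deaths_after_week = 20

vte_rate = 100 * vte_cases / enrolled                 # ~0.96% of enrolled patients
mortality_week1 = 100 * deaths_first_week / vte_cases # ~2.6% of VTE patients
mortality_later = 100 * deaths_after_week / vte_cases # ~5.1% of VTE patients
```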

    Development and validation of a score to predict postoperative respiratory failure in a multicentre European cohort : A prospective, observational study

    BACKGROUND: Postoperative respiratory failure (PRF) is the most frequent respiratory complication following surgery. OBJECTIVE: The objective of this study was to build a clinically useful predictive model for the development of PRF. DESIGN: A prospective observational study of a multicentre cohort. SETTING: Sixty-three hospitals across Europe. PATIENTS: Patients undergoing any surgical procedure under general or regional anaesthesia during 7-day recruitment periods. MAIN OUTCOME MEASURES: Development of PRF within 5 days of surgery. PRF was defined by a partial pressure of oxygen in arterial blood (PaO2) less than 8 kPa or a new-onset oxyhaemoglobin saturation measured by pulse oximetry (SpO2) less than 90% whilst breathing room air that required conventional oxygen therapy, or noninvasive or invasive mechanical ventilation. RESULTS: PRF developed in 224 patients (4.2% of the 5384 patients studied). In-hospital mortality [95% confidence interval (95% CI)] was higher in patients who developed PRF [10.3% (6.3 to 14.3) vs 0.4% (0.2 to 0.6)]. Regression modelling identified a predictive PRF score that includes seven independent risk factors: low preoperative SpO2; at least one preoperative respiratory symptom; preoperative chronic liver disease; history of congestive heart failure; open intrathoracic or upper abdominal surgery; surgical procedure lasting at least 2 h; and emergency surgery. The area under the receiver operating characteristic curve (c-statistic) was 0.82 (95% CI 0.79 to 0.85) and the Hosmer-Lemeshow goodness-of-fit statistic was 7.08 (P = 0.253). CONCLUSION: A risk score based on seven objective, easily assessed factors was able to predict which patients would develop PRF. The score could potentially facilitate preoperative risk assessment and management and provide a basis for testing interventions to improve outcomes. The study was registered at ClinicalTrials.gov (identifier NCT01346709).
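A score built from a handful of binary risk factors is typically applied by summing points for each factor present. The sketch below uses HYPOTHETICAL one-point-per-factor weights, since the abstract does not give the published point values; only the seven factor names come from the abstract:

```python
# The seven risk factors named in the abstract; the 1-point weights
# are a hypothetical simplification, NOT the published scoring.
PRF_FACTORS = (
    "low_preoperative_spo2",
    "preoperative_respiratory_symptom",
    "chronic_liver_disease",
    "congestive_heart_failure",
    "open_intrathoracic_or_upper_abdominal_surgery",
    "surgery_lasting_2h_or_more",
    "emergency_surgery",
)

def prf_score(patient):
    """Count how many of the seven risk factors are present.
    `patient` maps factor name -> bool."""
    return sum(1 for f in PRF_FACTORS if patient.get(f, False))

score = prf_score({"emergency_surgery": True,
                   "surgery_lasting_2h_or_more": True})
```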

    Human immunodeficiency virus continuum of care in 11 European Union countries at the end of 2016, overall and by key population: Have we made progress?

    Background. High uptake of antiretroviral treatment (ART) is essential to reduce human immunodeficiency virus (HIV) transmission and related mortality; however, gaps in care exist. We aimed to construct the continuum of HIV care (CoC) in 2016 in 11 European Union (EU) countries, overall and by key population and sex. To estimate progress toward the Joint United Nations Programme on HIV/AIDS (UNAIDS) 90-90-90 target, we compared the 2016 estimates with 2013 estimates for the same countries, representing 73% of the population in the region. Methods. A CoC with the following 4 stages was constructed: number of people living with HIV (PLHIV); proportion of PLHIV diagnosed; proportion of those diagnosed who ever initiated ART; and proportion of those ever treated who achieved viral suppression at their last visit. Results. We estimated that 87% of PLHIV were diagnosed; 92% of those diagnosed had ever initiated ART; and 91% of those ever on ART, or 73% of all PLHIV, were virally suppressed. Corresponding figures for men who have sex with men were 86%, 93%, 93% and 74%; for people who inject drugs, 94%, 88%, 85% and 70%; and for heterosexuals, 86%, 92%, 91% and 72%. The proportion of all PLHIV who were suppressed ranged from 59% to 86% across countries. Conclusions. The EU is close to the 90-90-90 target and achieved the UNAIDS target of 73% of all PLHIV virally suppressed, significant progress since 2013, when 60% of all PLHIV were virally suppressed. Strengthening of testing programs and treatment support, along with prevention interventions, is needed to achieve HIV epidemic control.
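The cascade stages multiply: the share of all PLHIV who are virally suppressed is the product of the three conditional proportions, which is how 87%, 92% and 91% yield the reported 73%. A quick check:

```python
# Stage proportions from the abstract (each conditional on the previous stage).
diagnosed = 0.87     # of all PLHIV
ever_on_art = 0.92   # of those diagnosed
suppressed = 0.91    # of those ever on ART

# Overall suppression among all PLHIV is the product of the stages.
overall_suppressed = diagnosed * ever_on_art * suppressed  # ~0.73
```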