
    Development of a decision support tool to facilitate primary care management of patients with abnormal liver function tests without clinically apparent liver disease [HTA03/38/02]. Abnormal Liver Function Investigations Evaluation (ALFIE)

    Liver function tests (LFTs) are routinely performed in primary care and are often the gateway to further invasive and/or expensive investigations. Little is known about the consequences for people who have an initial abnormal liver function (ALF) test in primary care but no obvious liver disease. Further investigations may be risky for the patient and expensive for health services. The aims of this study are to determine the natural history of LFT abnormalities in the population before overt liver disease presents, and to identify those who require minimal further investigation, with the potential to reduce NHS costs.

    More breast cancer genes?

    A new gene associated with a high risk of breast cancer, termed BRCAX, may exist on chromosome 13q. Tumours from multicase Nordic breast cancer families, in which mutations in BRCA1 and BRCA2 had been excluded, were analysed using comparative genomic hybridization to identify a region of interest, which was apparently confirmed and refined using linkage analysis on an independent sample. The present commentary discusses this work. It also asks why there should exist genetic variants associated with susceptibility to breast cancer other than mutations in BRCA1 and BRCA2, and what their modes of inheritance, allele frequencies and risks might be. Replication studies will be needed to clarify whether there really is a tumour suppressor gene other than BRCA2 on chromosome 13q.

    Are the so-called low penetrance breast cancer genes, ATM, BRIP1, PALB2 and CHEK2, high risk for women with strong family histories?

    A woman typically presents for genetic counselling because she has a strong family history and is interested in knowing the probability that she will develop disease in the future; that is, her absolute risk. Relative risk for a given factor refers to risk compared either with the population-average risk (sense a), or with the risk when not having the factor, with all other factors held constant (sense b). Not understanding that these are three distinct concepts can result in failure to correctly appreciate the consequences of studies on clinical genetic testing. Several studies found that the frequencies of mutations in ATM, BRIP1, PALB2 and CHEK2 were many times greater for cases with a strong family history than for controls. To account for the selected case sampling (ascertainment), a statistical model was applied that assumes the effect of any measured variant multiplies the effect of unmeasured variants. This multiplicative polygenic model in effect estimated the relative risk in sense b, not sense a, and found it was in the range of 1.7 to 2.4. The authors concluded that the variants are "low penetrance". They failed to note that their fitted models predicted that, for some women, absolute risk may be as high as for BRCA2 mutation carriers. This is because the relative risk multiplies the polygenic risk, and the latter is predicted by family history. Therefore, mutation testing of these genes for women with a strong family history, especially if it is of early onset, may be as clinically relevant as it is for BRCA1 and BRCA2.
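    A simplified numerical sketch of this argument follows. All figures are hypothetical and not taken from the studies discussed; the point is only that a modest sense-b relative risk, multiplied onto the elevated polygenic baseline implied by a strong family history, can yield a large absolute risk.

```python
# Hypothetical illustration, not the authors' model: under a multiplicative
# polygenic model, a variant's sense-b relative risk scales whatever baseline
# risk the woman's polygenic background (reflected in family history) confers.
population_lifetime_risk = 0.10   # assumed average lifetime risk of breast cancer
polygenic_multiplier = 3.0        # assumed elevation implied by a strong family history
variant_relative_risk = 2.0       # sense-b RR in the reported 1.7-2.4 range

baseline_for_this_woman = population_lifetime_risk * polygenic_multiplier      # 0.30
absolute_risk_with_variant = baseline_for_this_woman * variant_relative_risk   # 0.60
# (crude: risks only multiply approximately while they remain well below 1)

print(f"absolute risk with variant ≈ {absolute_risk_with_variant:.0%}")
# ≈ 60%: comparable to lifetime-risk estimates quoted for BRCA2 carriers,
# even though a sense-b relative risk of 2.0 looks "low penetrance".
```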

    Novel Prognostic and Therapeutic Targets for Oral Squamous Cell Carcinoma

    In oral squamous cell carcinoma (OSCC), metastasis to lymph nodes is associated with a 50% reduction in 5-year survival. To identify a metastatic gene set based on DNA copy number abnormalities (CNAs) of differentially expressed genes, we compared DNA and RNA of OSCC cells laser-microdissected from non-metastatic primary tumors (n = 17) with those from lymph node metastases (n = 20), using Affymetrix 250K Nsp single-nucleotide polymorphism (SNP) arrays and U133 Plus 2.0 arrays, respectively. With a false discovery rate (FDR) < 5%, 1988 transcripts were found to be differentially expressed between primary and metastatic OSCC. Of these, 114 showed a significant correlation between DNA copy number and gene expression (FDR < 0.01). Among these 114 correlated transcripts, the corresponding genomic regions of 95 transcripts showed CNA differences between primary and metastatic OSCC (FDR < 0.01). Using an independent dataset of 133 patients, multivariable analysis showed that the OSCC-specific and overall mortality hazard ratios (HRs) for patients carrying the 95-transcript signature were 4.75 (95% CI: 2.03-11.11) and 3.45 (95% CI: 1.84-6.50), respectively. To determine the degree to which these genes affect cell survival, we compared the growth of five OSCC cell lines before and after knockdown of over-amplified transcripts in a high-throughput siRNA-mediated screen. Knockdown of 18 of the 26 genes tested suppressed growth by ≄ 30% in at least one cell line (P < 0.01). In particular, cell lines derived from late-stage OSCC were more sensitive to knockdown of G3BP1 than cell lines derived from early-stage OSCC, and the growth suppression was likely caused by an increase in apoptosis. Further investigation is warranted to examine the biological role of these genes in OSCC progression and their therapeutic potential.
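    The copy-number/expression correlation step can be illustrated with a minimal sketch. The data are simulated, and the per-gene Pearson correlation with Benjamini-Hochberg FDR control shown here is an assumption for illustration, not the authors' actual pipeline.

```python
# Minimal sketch (simulated data): correlate per-gene DNA copy number with
# expression across samples and keep genes passing an FDR cut-off.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_genes, n_samples = 1000, 37            # 17 primary + 20 metastatic samples, as in the study
copy_number = rng.normal(2.0, 0.4, size=(n_genes, n_samples))
expression = 0.5 * copy_number + rng.normal(0, 1, size=(n_genes, n_samples))

pvals = np.array([stats.pearsonr(copy_number[g], expression[g])[1] for g in range(n_genes)])

def benjamini_hochberg(p, alpha=0.01):
    """Return a boolean mask of tests rejected at the given FDR level."""
    order = np.argsort(p)
    scaled = p[order] * len(p) / (np.arange(len(p)) + 1)
    passed = scaled <= alpha
    keep = np.zeros(len(p), dtype=bool)
    if passed.any():
        keep[order[: np.nonzero(passed)[0].max() + 1]] = True
    return keep

correlated = benjamini_hochberg(pvals, alpha=0.01)
print(f"{correlated.sum()} genes with copy-number/expression correlation at FDR < 0.01")
```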

    How to handle mortality when investigating length of hospital stay and time to clinical stability

    Background: Hospital length of stay (LOS) and time for a patient to reach clinical stability (TCS) have increasingly become important outcomes when investigating ways to combat Community Acquired Pneumonia (CAP). Difficulties arise when deciding how to handle in-hospital mortality. Ad hoc approaches that are commonly used to handle time-to-event outcomes with mortality can give disparate results and provide conflicting conclusions based on the same data. To ensure compatibility among studies investigating these outcomes, this type of data should be handled in a consistent and appropriate fashion. Methods: Using both simulated data and data from the international Community Acquired Pneumonia Organization (CAPO) database, we evaluate two ad hoc approaches for handling mortality when estimating the probability of hospital discharge and clinical stability: 1) restricting analysis to those patients who lived, and 2) assigning individuals who die the "worst" outcome (right-censoring them at the longest recorded LOS or TCS). Estimated probability distributions based on these approaches are compared with right-censoring the individuals who died at the time of death (the complement of the Kaplan-Meier (KM) estimator), and with treating death as a competing risk (the cumulative incidence estimator). Tests for differences in probability distributions based on the four methods are also contrasted. Results: The two ad hoc approaches give different estimates of the probability of discharge and clinical stability. Analysis restricted to patients who survived is conceptually problematic, as estimation is conditioned on events that happen at a future time. Estimation based on assigning those patients who died the worst outcome (longest LOS and TCS) coincides with the complement of the KM estimator based on the subdistribution hazard, which has previously been shown to be equivalent to the cumulative incidence estimator. However, in either case the time to in-hospital mortality is ignored, preventing simultaneous assessment of patient mortality in addition to LOS and/or TCS. The power to detect differences in underlying hazards of discharge between patient populations differs for test statistics based on the four approaches, and depends on the underlying hazard ratio of mortality between the patient groups. Conclusions: Treating death as a competing risk gives estimators which address the clinical questions of interest, and allows for simultaneous modelling of both in-hospital mortality and TCS/LOS. This article advocates treating mortality as a competing risk when investigating other time-related outcomes.
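    To make the contrast concrete, the sketch below compares the cumulative incidence of discharge, treating in-hospital death as a competing risk, with the complement of a Kaplan-Meier curve that censors deaths. The data are simulated for illustration and are not drawn from the CAPO database.

```python
# Sketch on simulated data: probability of hospital discharge by day 14 under
# a competing-risk (Aalen-Johansen/cumulative incidence) estimator versus the
# complement of a Kaplan-Meier curve that censors deaths at their death time.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
t_discharge = rng.exponential(7.0, n)      # hypothetical days to discharge
t_death = rng.exponential(40.0, n)         # hypothetical days to in-hospital death
time = np.minimum(t_discharge, t_death)
event = np.where(t_discharge <= t_death, 1, 2)   # 1 = discharged, 2 = died

order = np.argsort(time)
time, event = time[order], event[order]
at_risk = n - np.arange(n)

# Overall probability of still being in hospital just before each event time.
surv_all = np.cumprod(1 - (event > 0) / at_risk)
still_in_hospital = np.concatenate(([1.0], surv_all[:-1]))

# Cumulative incidence of discharge: discharge hazard weighted by the
# probability of still being in hospital (death removes patients from risk).
cif_discharge = np.cumsum(still_in_hospital * (event == 1) / at_risk)

# KM complement that censors deaths: only discharge events move the curve.
km_discharge_only = 1 - np.cumprod(1 - (event == 1) / at_risk)

i = np.searchsorted(time, 14.0)   # probability of discharge within 14 days
print(f"competing-risk estimate:      {cif_discharge[i - 1]:.3f}")
print(f"KM complement (censor deaths): {km_discharge_only[i - 1]:.3f}")
```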

    Physical fitness and dementia risk in the very old: A study of the Lothian Birth Cohort 1921

    Background: Previous studies have demonstrated that individual measures of fitness, such as reduced pulmonary function, slow walking speed and weak handgrip, are associated with an increased risk of dementia. Only a minority of participants included in these studies were aged over 80. The aim of this study was therefore to investigate the association between physical fitness and dementia in the oldest old. Methods: Subjects (n = 488) were enrolled in the Lothian Birth Cohort 1921 and aged 79 at baseline. Dementia cases arising after enrolment were determined using data from death certificates, electronic patient records and clinical reviews. Fitness measures included grip strength, forced expiratory volume in 1 s (FEV1) and walking speed over 6 m, measured at age 79. Dementia risk associated with each fitness variable was initially determined by logistic regression analysis, followed by Cox regression analysis in which death was considered as a competing risk. APOE Δ4 status, age, sex, height, childhood IQ, smoking, history of cardiovascular or cerebrovascular disease, hypertension and diabetes were included as additional variables. Cumulative incidence curves were calculated using the Aalen-Johansen estimator. Results: Although initial results indicated that greater FEV1 was associated with an increased risk of dementia (OR (odds ratio per unit increase) 1.93, p = 0.03, n = 416), once the competing risk of mortality was taken into account none of the fitness measures was associated with dementia: FEV1 (HR (hazard ratio per unit increase) 1.30, p = 0.37, n = 416), grip strength (HR 0.98, p = 0.35, n = 416), walking speed (HR 0.99, p = 0.90, n = 416). The presence of an APOE Δ4 allele was, however, an important predictor of dementia (HR 2.85, p < 0.001, n = 416). Cumulative incidence curves supported these findings, with an increased risk of dementia for APOE Δ4 carriers compared with non-carriers. While increased FEV1 was associated with a reduced risk of death, there was no reduction in the risk of dementia. Conclusions: In contrast to previous studies, this study found that lower fitness beyond age 79 was not a risk factor for subsequent dementia. This finding is not explained by those with poorer physical fitness, who would have been more likely to develop dementia, dying before the onset of dementia symptoms.
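    One common way to treat death as a competing risk in a Cox analysis is a cause-specific hazard model, in which deaths without dementia are censored at the time of death. The sketch below shows that approach on simulated data with hypothetical variable names; the study's exact modelling choices may differ.

```python
# Cause-specific Cox model for dementia with death as a competing risk:
# participants who die without dementia are censored at their death time.
# Hypothetical data and column names, not the Lothian Birth Cohort data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 488
df = pd.DataFrame({
    "follow_up_years": rng.exponential(8.0, n).clip(0.1, 15.0),
    "fev1": rng.normal(1.8, 0.5, n),           # litres
    "grip_strength": rng.normal(25.0, 8.0, n), # kg
    "apoe_e4": rng.binomial(1, 0.25, n),
})
# 0 = censored/alive, 1 = dementia, 2 = death without dementia (hypothetical labels)
outcome = rng.choice([0, 1, 2], size=n, p=[0.3, 0.25, 0.45])
df["dementia_event"] = (outcome == 1).astype(int)   # deaths count as censoring here

cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_years", event_col="dementia_event")
print(cph.summary[["exp(coef)", "p"]])   # cause-specific hazard ratios
```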

    A molecular analysis of desiccation tolerance mechanisms in the anhydrobiotic nematode Panagrolaimus superbus using expressed sequence tags

    Background: Some organisms can survive extreme desiccation by entering into a state of suspended animation known as anhydrobiosis. Panagrolaimus superbus is a free-living anhydrobiotic nematode that can survive rapid environmental desiccation. The mechanisms that P. superbus uses to combat the potentially lethal effects of cellular dehydration may include the constitutive and inducible expression of protective molecules, along with behavioural and/or morphological adaptations that slow the rate of cellular water loss. In addition, inducible repair and revival programmes may also be required for successful rehydration and recovery from anhydrobiosis. Results: To identify constitutively expressed candidate anhydrobiotic genes we obtained 9,216 ESTs from an unstressed mixed-stage population of P. superbus, from which we derived 4,009 unigenes. These unigene annotations and sequences can be accessed at http://www.nematodes.org/nembase4/species_info.php?species=PSC. We manually annotated a set of 187 constitutively expressed candidate anhydrobiotic genes from P. superbus. Notable among these is a putative lineage expansion of the lea (late embryogenesis abundant) gene family. The most abundantly expressed sequence was a member of the nematode-specific sxp/ral-2 family, which is highly expressed in parasitic nematodes and secreted onto the surface of the nematodes' cuticles. There were 2,059 novel unigenes (51.7% of the total), 149 of which are predicted to encode intrinsically disordered proteins lacking a fixed tertiary structure. One unigene may encode an exo-ÎČ-1,3-glucanase (GHF5 family), most similar to a sequence from Phytophthora infestans. GHF5 enzymes have been reported from several species of plant-parasitic nematodes, with horizontal gene transfer (HGT) from bacteria proposed to explain their evolutionary origin. This P. superbus sequence represents another possible HGT event within the Nematoda. The expression of five of the 19 putative stress response genes tested was upregulated in response to desiccation: the antioxidants glutathione peroxidase, dj-1 and 1-Cys peroxiredoxin, an shsp sequence and an lea gene. Conclusions: P. superbus appears to utilise a strategy of combined constitutive and inducible gene expression in preparation for entry into anhydrobiosis. The apparent lineage expansion of lea genes, together with their constitutive and inducible expression, suggests that LEA3 proteins are important components of the anhydrobiotic protection repertoire of P. superbus.

    Left ventricular regional glucose metabolism in combination with septal scar extent identifies CRT responders.

    Cardiac resynchronization therapy (CRT) is effective in selected heart failure (HF) patients, but the non-response rate remains high. Positron emission tomography (PET) may provide better insight into the pathophysiology of left ventricular (LV) remodeling; however, its role in evaluating and selecting patients for CRT remains uncertain. We investigated whether regional LV glucose metabolism in combination with myocardial scar could predict response to CRT. Consecutive CRT-eligible HF patients underwent echocardiography, cardiac magnetic resonance (CMR), and 18F-fluorodeoxyglucose (FDG) PET within 1 week before CRT implantation. Echocardiography was also performed 12 months after CRT, and an end-systolic volume reduction ≄ 15% was defined as CRT response. The septal-to-lateral wall FDG uptake ratio (SLR) was calculated from static FDG images. Late gadolinium enhancement (LGE) CMR was analyzed semi-quantitatively to define scar extent. We evaluated 88 patients (67 ± 10 years, 72% male). 18F-FDG SLR showed a linear correlation with volumetric reverse remodeling 12 months after CRT (r = 0.41, p = 0.0001). In non-ischemic HF patients, low FDG SLR alone predicted CRT response with sensitivity and specificity of more than 80%; in ischemic HF patients, however, specificity decreased to 46%, suggesting that in this cohort a low SLR can also be caused by the presence of a septal scar. In a multivariate logistic regression model including low FDG SLR, the presence and extent of scar in each myocardial wall, and current CRT guideline parameters, only low FDG SLR and septal scar remained associated with CRT response. Their combination predicted CRT response with a sensitivity, specificity, negative predictive value, and positive predictive value of 80%, 83%, 70%, and 90%, respectively. FDG SLR can be used as a predictor of CRT response, and combined with septal scar extent it distinguishes CRT responders from non-responders with high diagnostic accuracy. Further studies are needed to verify whether this imaging approach can prospectively be used to optimize patient selection.
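    As a brief illustration of how the reported diagnostic accuracy figures are derived, the sketch below computes sensitivity, specificity, and predictive values from a 2x2 table of predicted versus observed CRT response. The counts are hypothetical, chosen only so the resulting values land near those reported.

```python
# Hypothetical 2x2 table (not the study data) showing how sensitivity,
# specificity, and predictive values follow from classification counts.
tp, fn = 45, 11   # responders correctly / incorrectly classified (assumed counts)
tn, fp = 25, 5    # non-responders correctly / incorrectly classified (assumed counts)

sensitivity = tp / (tp + fn)   # proportion of responders identified
specificity = tn / (tn + fp)   # proportion of non-responders identified
ppv = tp / (tp + fp)           # probability of response given a positive test
npv = tn / (tn + fn)           # probability of non-response given a negative test

for name, value in [("sensitivity", sensitivity), ("specificity", specificity),
                    ("PPV", ppv), ("NPV", npv)]:
    print(f"{name}: {value:.0%}")
```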

    Lateral Wall Dysfunction Signals Onset of Progressive Heart Failure in Left Bundle Branch Block

    OBJECTIVES: This study sought to investigate whether contractile asymmetry between the septum and the left ventricular (LV) lateral wall drives heart failure development in patients with left bundle branch block (LBBB), and whether the presence of lateral wall dysfunction affects the potential for recovery of LV function with cardiac resynchronization therapy (CRT). BACKGROUND: LBBB may induce or aggravate heart failure. Understanding the underlying mechanisms is important to optimize the timing of CRT. METHODS: In 76 nonischemic patients with LBBB and 11 controls, we measured strain using speckle-tracking echocardiography and regional work using pressure-strain analysis. Patients with LBBB were stratified according to LV ejection fraction (EF) ≄50% (EF(preserved)), 36% to 49% (EF(mid)), and ≀35% (EF(low)). Sixty-four patients underwent CRT and were re-examined after 6 months. RESULTS: Septal work was successively reduced from controls through EF(preserved), EF(mid), and EF(low) (all p < 0.005), and showed a strong correlation with left ventricular ejection fraction (LVEF; r = 0.84; p < 0.005). In contrast, LV lateral wall work was numerically increased in EF(preserved) and EF(mid) versus controls, and did not significantly correlate with LVEF in these groups. In EF(low), however, LV lateral wall work was substantially reduced (p < 0.005). There was a moderate overall correlation between LV lateral wall work and LVEF (r = 0.58; p < 0.005). In CRT recipients, LVEF was normalized (≄50%) in 54% of patients with preserved LV lateral wall work, but in only 13% of patients with reduced LV lateral wall work (p < 0.005). CONCLUSIONS: In its early stages, LBBB-induced heart failure is associated with impaired septal function but preserved lateral wall function. The advent of LV lateral wall dysfunction may mark an optimal time point for CRT.
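    Regional work from pressure-strain analysis is commonly quantified as the area enclosed by the segmental pressure-strain loop over one cardiac cycle. The sketch below illustrates that calculation on synthetic pressure and strain traces; it is an assumption-laden illustration, not the authors' implementation.

```python
# Minimal sketch (synthetic traces): regional myocardial work approximated as
# the area enclosed by the LV pressure-strain loop over one cardiac cycle.
import numpy as np

t = np.linspace(0, 1, 500)                           # one cardiac cycle (normalised time)
pressure = 80 + 40 * np.sin(np.pi * t) ** 2          # mmHg, crude LV pressure waveform
strain = -0.18 * np.sin(np.pi * (t - 0.05)) ** 2     # segmental shortening (negative strain)

def loop_area(p, s):
    """Area enclosed by the pressure-strain loop (shoelace formula), in mmHg*%."""
    s_pct = 100 * s
    return 0.5 * abs(np.sum(s_pct * np.roll(p, -1) - np.roll(s_pct, -1) * p))

print(f"segmental work index ≈ {loop_area(pressure, strain):.0f} mmHg·%")
```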