
    A provisional database for the silicon content of foods in the United Kingdom

    Si may play an important role in bone formation and connective tissue metabolism. Although biological interest in this element has recently increased, limited literature exists on the Si content of foods. To further our knowledge and understanding of the relationship between dietary Si and human health, a reliable food composition database, relevant for the UK population, is required. A total of 207 foods and beverages, commonly consumed in the UK, were analysed for Si content. Composite samples were analysed using inductively coupled plasma–optical emission spectrometry following microwave-assisted digestion with nitric acid and H2O2. The highest concentrations of Si were found in cereals and cereal products, especially less refined cereals and oat-based products. Fruit and vegetables were highly variable sources of Si, with substantial amounts present in Kenyan beans, French beans, runner beans, spinach, dried fruit, bananas and red lentils, but undetectable amounts in tomatoes, oranges and onions. Of the beverages, beer, a macerated whole-grain cereal product, contained the greatest level of Si, whilst drinking water was a variable source, with some mineral waters relatively high in Si. The present study provides a provisional database for the Si content of UK foods, which will allow the estimation of dietary intakes of Si in the UK population and investigation into the role of dietary Si in human health.

    From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument

    Background: Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field.

    Methods: A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals.

    Results: The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts.

    Conclusions: To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcome measurement; (3) representation of multiple perspectives and the collaborative nature of the work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.

    Measures for assessing practice change in medical practitioners

    BACKGROUND: There are increasing numbers of randomised trials and systematic reviews examining the efficacy of interventions designed to bring about a change in clinical practice. The findings of this research are being used to guide strategies to increase the uptake of evidence into clinical practice. Knowledge of the outcomes measured by these trials is vital not only for the interpretation and application of the work done to date, but also to inform future research in this expanding area of endeavour and to assist in collation of results in systematic reviews and meta-analyses. METHODS: The objective of this review was to identify methods used to measure change in the clinical practices of health professionals following an intervention aimed at increasing the uptake of evidence into practice. All published trials included in a recent, comprehensive Health Technology Assessment of interventions to implement clinical practice guidelines and change clinical practice (n = 228) formed the sample for this study. Using a standardised data extraction form, one reviewer (SH), extracted the relevant information from the methods and/or results sections of the trials. RESULTS: Measures of a change of health practitioner behaviour were the most common, with 88.8% of trials using these as outcome measures. Measures that assessed change at a patient level, either actual measures of change or surrogate measures of change, were used in 28.8% and 36.7% of studies (respectively). Health practitioners' knowledge and attitudes were assessed in 22.8% of the studies and changes at an organisational level were assessed in 17.6%. CONCLUSION: Most trials of interventions aimed at changing clinical practice measured the effect of the intervention at the level of the practitioner, i.e. did the practitioner change what they do, or has their knowledge of and/or attitude toward that practice changed? 
Less than one-third of the trials measured whether any change in practice resulted in a change in the ultimate end-point of patient health status.

    The clinical effectiveness and cost-effectiveness of screening for open angle glaucoma: a systematic review and economic evaluation

    Objectives: To assess whether open angle glaucoma (OAG) screening meets the UK National Screening Committee criteria, to compare screening strategies with case finding, to estimate test parameters, to model estimates of cost and cost-effectiveness, and to identify areas for future research. Data sources: Major electronic databases were searched up to December 2005. Review methods: Screening strategies were developed by wide consultation. Markov submodels were developed to represent screening strategies. Parameter estimates were determined by systematic reviews of epidemiology, economic evaluations of screening, and effectiveness (test accuracy, screening and treatment). Tailored, highly sensitive electronic searches were undertaken. Results: Most potential screening tests reviewed had an estimated specificity of 85% or higher. No test was clearly the most accurate, with only a few, heterogeneous studies for each test. No randomised controlled trials (RCTs) of screening were identified. Based on two treatment RCTs, early treatment reduces the risk of progression. Extrapolating from this, and assuming accelerated progression with advancing disease severity, the mean time to blindness in at least one eye was approximately 23 years without treatment, compared with 35 years with treatment. Prevalence would have to be about 3–4% in 40-year-olds, with a screening interval of 10 years, to approach cost-effectiveness. It is predicted that screening might be cost-effective in a 50-year-old cohort at a prevalence of 4% with a 10-year screening interval. General population screening at any age thus appears not to be cost-effective. Selective screening of groups with higher prevalence (family history, black ethnicity) might be worthwhile, although this would only cover 6% of the population. Extension to include other at-risk cohorts (e.g. myopia and diabetes) would include 37% of the general population, but the prevalence is then too low for screening to be considered cost-effective. Screening using a test with initial automated classification, followed by assessment of test positives by a specialised optometrist, was more cost-effective than initial specialised optometric assessment. The cost-effectiveness of the screening programme was highly sensitive to the perspective on costs (NHS or societal). In the base-case model, the NHS costs of visual impairment were estimated as £669. If annual societal costs were £8800, then screening might be considered cost-effective for a 40-year-old cohort with 1% OAG prevalence, assuming a willingness to pay of £30,000 per quality-adjusted life-year. Of lesser importance were changes to estimates of attendance for sight tests, incidence of OAG, rate of progression and utility values for each stage of OAG severity. Cost-effectiveness was not particularly sensitive to the accuracy of screening tests within the ranges observed; however, a highly specific test is required to reduce the large number of false-positive referrals. The finding that population screening is unlikely to be cost-effective is based on an economic model whose parameter estimates have considerable uncertainty; in particular, if the rate of progression and/or the costs of visual impairment are higher than estimated, then screening could be cost-effective. Conclusions: While population screening is not cost-effective, the targeted screening of high-risk groups may be. Procedures for identifying those at risk, for quality assuring the programme, and adequate service provision for those who screen positive would all be needed. Glaucoma detection can be improved by increasing attendance for eye examination and by improving the performance of current testing, either by refining practice or by adding a technology-based first assessment, the latter being the more cost-effective option. This has implications for any future organisational changes in community eye-care services. Further research should aim to develop and provide quality data to populate the economic model, by conducting a feasibility study of interventions to improve detection, by obtaining further data on the costs of blindness, risk of progression and health outcomes, and by conducting an RCT of interventions to improve the uptake of glaucoma testing.
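    The screening decision rule described in this abstract reduces to comparing an incremental cost-effectiveness ratio (ICER) with a willingness-to-pay threshold. The sketch below is illustrative only: the £30,000-per-QALY threshold is taken from the abstract, but the per-person costs and QALYs are hypothetical, not the review's parameter estimates.

```python
# Illustrative cost-effectiveness decision rule for screening vs. case
# finding. All numbers except the willingness-to-pay threshold are
# hypothetical placeholders.

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost per QALY gained by the new strategy."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

WTP_THRESHOLD = 30_000  # £ per quality-adjusted life-year (from the abstract)

# Hypothetical per-person averages for a screened vs. unscreened cohort:
ratio = icer(cost_new=450.0, qaly_new=14.26, cost_old=150.0, qaly_old=14.24)
print(f"ICER = £{ratio:,.0f} per QALY gained")
print("cost-effective at threshold:", ratio <= WTP_THRESHOLD)
```

    A strategy is conventionally deemed cost-effective when its ICER falls below the threshold; the review's sensitivity analyses correspond to re-running such a calculation while varying the cost, prevalence and QALY inputs.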

    The challenges faced in the design, conduct and analysis of surgical randomised controlled trials

    Randomised evaluations of surgical interventions are rare; some interventions have been widely adopted without rigorous evaluation. Unlike other medical areas, the randomised controlled trial (RCT) design has not become the default study design for the evaluation of surgical interventions. Surgical trials are difficult to undertake successfully and pose particular practical and methodological challenges. However, RCTs have played a role in the assessment of surgical innovations, and there is scope and need for greater use. This article considers the design, conduct and analysis of an RCT of a surgical intervention. The issues are reviewed under three headings: the timing of the evaluation, defining the research question, and trial design issues. Recommendations on the conduct of future surgical RCTs are made. Collaboration between the research and surgical communities is needed to address the distinct issues raised by the assessment of surgical interventions and to enable the conduct of appropriate and well-designed trials. The Health Services Research Unit is funded by the Scottish Government Health Directorates.

    A systematic review of the clinical effectiveness of 64-slice or higher computed tomography angiography as an alternative to invasive coronary angiography in the investigation of suspected coronary artery disease

    Background: This systematic review summarized recent evidence pertaining to the clinical effectiveness of 64-slice or higher computed tomography angiography (CTA) in patients with suspected coronary artery disease (CAD). If CTA offers sufficient diagnostic performance, it could prevent the use of invasive diagnostic procedures in some patients. This would provide multiple health and cost benefits, particularly for under-resourced areas where invasive coronary angiography is not always available.

    Methods: A systematic method of literature searching and selection was employed, with searches limited to December 2006 to March 2009. Included studies were quality assessed using National Health and Medical Research Council (NHMRC) diagnostic levels of evidence and a modified Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool. Individual and pooled diagnostic performance measures were calculated using standard meta-analytic techniques at the patient, vessel and segment level. A positive result was defined as greater than or equal to 50% stenosis.

    Results: Twenty-eight studies examining 3,674 patients were included in the systematic review. The primary meta-analysis at the patient level indicated a sensitivity of 98.2% and a specificity of 81.6%. The median (range) positive predictive value (PPV) was 90.5% (76%-100%) and negative predictive value (NPV) 99.0% (83%-100%). Across all vessels, the pooled sensitivity was 94.9%, specificity 89.5%, and median (range) PPV 75.0% (53%-95%) and NPV 99.0% (93%-100%). At the individual artery level, overall diagnostic accuracy appeared to be slightly higher in the left main coronary artery and slightly lower in the left anterior descending and circumflex arteries. Across all segments, the sensitivity was 91.3%, specificity 94.0%, and median (range) PPV 69.0% (44%-86%) and NPV 99.0% (98%-100%).

    Conclusions: The high sensitivity indicates that CTA can effectively identify the majority of patients with significant coronary artery stenosis. The high NPV at the patient, vessel and segment level establishes CTA as an effective non-invasive alternative to invasive coronary angiography (ICA) for the exclusion of stenosis.
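    As a minimal sketch of how the performance measures quoted above are derived, the function below computes sensitivity, specificity, PPV and NPV from a 2x2 table of index-test results against the reference standard (here, CTA against invasive coronary angiography). The counts are hypothetical, chosen only so that the resulting sensitivity and specificity happen to match the pooled patient-level figures; they are not data from any included study.

```python
# Diagnostic performance measures from a 2x2 table:
#   tp = test-positive, disease-positive    fp = test-positive, disease-negative
#   fn = test-negative, disease-positive    tn = test-negative, disease-negative

def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV as fractions."""
    sensitivity = tp / (tp + fn)  # proportion of diseased patients detected
    specificity = tn / (tn + fp)  # proportion of non-diseased correctly cleared
    ppv = tp / (tp + fp)          # probability of disease given a positive test
    npv = tn / (tn + fn)          # probability of no disease given a negative test
    return sensitivity, specificity, ppv, npv

# Hypothetical counts (illustrative only):
sens, spec, ppv, npv = diagnostic_metrics(tp=110, fp=18, fn=2, tn=80)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} ppv={ppv:.1%} npv={npv:.1%}")
```

    Note how the PPV and NPV, unlike sensitivity and specificity, depend on disease prevalence in the sample, which is why the review reports them as medians and ranges across studies rather than pooled values.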

    Transcriptome analyses of the Giardia lamblia life cycle

    Author Posting. © The Author(s), 2010. This is the author's version of the work. It is posted here by permission of Elsevier B.V. for personal use, not for redistribution. The definitive version was published in Molecular and Biochemical Parasitology 174 (2010): 62-65, doi:10.1016/j.molbiopara.2010.05.010. We quantified mRNA abundance from 10 stages in the Giardia lamblia life cycle in vitro using Serial Analysis of Gene Expression (SAGE). 163 abundant transcripts were expressed constitutively. 71 transcripts were upregulated specifically during excystation and 42 during encystation. Nonetheless, the transcriptomes of cysts and trophozoites showed major differences. SAGE detected co-expressed clusters of 284 transcripts differentially expressed in cysts and excyzoites and 287 transcripts in vegetative trophozoites and encysting cells. All clusters included known genes and pathways as well as proteins unique to Giardia or diplomonads. SAGE analysis of the Giardia life cycle identified a number of kinases, phosphatases, and DNA replication proteins involved in excystation and encystation, which could be important for examining the roles of cell signaling in giardial differentiation. Overall, these data pave the way for directed gene discovery and a better understanding of the biology of Giardia lamblia. BJD, DSR, and FDG were supported by NIH grants AI42488, GM61896, DK35108, and AI051687. DP and SGS were supported by grants from the Swedish Natural Science Research Council, the Swedish Medical Research Council, and the Karolinska Institutet. AGM, SRB, SPP, and MJC were supported by NIH grant AI51089 and by the Marine Biological Laboratory's Program in Global Infectious Diseases, funded by the Ellison Medical Foundation.

    Urinary EpCAM in urothelial bladder cancer patients: characterisation and evaluation of biomarker potential

    Background: Epithelial cell adhesion molecule (EpCAM) is overexpressed in bladder tumours and released from bladder cancer cells in vitro. We tested the hypotheses that urinary EpCAM could act as a biomarker for primary bladder cancer detection and risk stratification. Methods: EpCAM was measured by ELISA in urine from 607 patients with primary bladder tumours and in urine from 53 non-cancer controls. Mann–Whitney tests and ROC analyses were used to determine statistical significance and discrimination between non-cancer controls and different stages and grades of disease. Multivariable modelling and Kaplan–Meier analyses were used to determine prognostic significance. The structure of urinary EpCAM was investigated by western blotting and mass spectrometry. Results: Urinary EpCAM levels increase with stage and grade of bladder cancer. Alongside grade and stage, elevated urinary EpCAM is an independent indicator of poor prognosis, with a hazard ratio of 1.76 for bladder cancer-specific mortality. The soluble form of EpCAM in urine is the extracellular domain, generated by cleavage between Ala243 and Gly244. Further studies are required to define the influence of other urinary tract malignancies and benign urological conditions on urinary EpCAM. Conclusion: The extracellular domain of EpCAM is shed into urine by bladder tumours. Urinary EpCAM is a strong indicator of bladder cancer-specific survival, and may be useful within a multi-marker panel for disease detection or as a stand-alone marker to prioritise the investigation and treatment of patients. The mechanisms and effects of EpCAM cleavage in bladder cancer are worthy of further investigation, and may identify novel therapeutic targets.
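    The pairing of Mann–Whitney tests with ROC analysis in this study's methods reflects a standard equivalence: the area under the ROC curve equals the probability that a randomly chosen case has a higher marker level than a randomly chosen control, i.e. the normalised Mann–Whitney U statistic. A minimal sketch, using made-up urinary EpCAM concentrations rather than study data:

```python
# Empirical AUC as the normalised Mann-Whitney U statistic:
# P(case > control) + 0.5 * P(tie), averaged over all case/control pairs.

def auc_from_samples(cases, controls):
    """Probability that a random case outranks a random control."""
    wins = ties = 0
    for x in cases:
        for y in controls:
            if x > y:
                wins += 1
            elif x == y:
                ties += 1
    return (wins + 0.5 * ties) / (len(cases) * len(controls))

# Hypothetical urinary EpCAM levels (arbitrary units, illustrative only):
cancer = [8.1, 5.6, 9.4, 3.2, 7.7]
control = [1.2, 2.8, 3.2, 0.9, 4.1]
print(f"AUC = {auc_from_samples(cancer, control):.2f}")
```

    An AUC of 0.5 indicates no discrimination and 1.0 perfect discrimination; this is the quantity the study's ROC analyses report when separating cancer patients from non-cancer controls.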