
    The impact of population-based faecal occult blood test screening on colorectal cancer mortality: a matched cohort study

    BACKGROUND: Randomised trials show reduced colorectal cancer (CRC) mortality with faecal occult blood testing (FOBT). This outcome is now examined in a routine, population-based screening programme. METHODS: Three biennial rounds of the UK CRC screening pilot were completed in Scotland (2000–2007) before the roll-out of a national programme. All residents (50–69 years) in the three pilot Health Boards were invited for screening. They received an FOBT kit by post to complete at home and return for analysis. Positive tests were followed up with colonoscopy. Controls, selected from non-pilot Health Boards, were matched by age, gender, and deprivation and assigned the invitation date of the matched invitee. Follow-up was from invitation date to 31 December 2009 or date of death if earlier. RESULTS: There were 379 655 people in each group (median age 55.6 years, 51.6% male). Participation was 60.6%. There were 961 (0.25%) CRC deaths in invitees and 1056 (0.28%) in controls: rate ratio (RR) 0.90 (95% confidence interval (CI) 0.83–0.99) overall and 0.73 (95% CI 0.65–0.82) for participants. Non-participants had increased CRC mortality compared with controls, RR 1.21 (95% CI 1.06–1.38). CONCLUSION: There was a 10% relative reduction in CRC mortality in a routine screening programme, rising to 27% in participants.
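As a rough check on the headline figures above, the crude rate ratio and an approximate confidence interval can be reproduced from the reported counts. The sketch below is a simplification (it ignores person-time and the matched design, so it will not exactly match the published estimates); the function name and the Katz log-scale interval are choices made here purely for illustration.

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Crude risk ratio of group A vs group B with an approximate
    95% CI on the log scale (Katz method)."""
    rr = (events_a / n_a) / (events_b / n_b)
    se_log = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Counts reported in the abstract: CRC deaths among invitees vs matched controls
rr, lo, hi = risk_ratio_ci(961, 379_655, 1056, 379_655)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # ~0.91 (0.83-0.99)
```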

    Peri‐operative cardiac arrest in children as reported to the 7th National Audit Project of the Royal College of Anaesthetists

    The 7th National Audit Project of the Royal College of Anaesthetists studied peri‐operative cardiac arrest. An activity survey estimated UK paediatric anaesthesia annual caseload as 390,000 cases, 14% of the UK total. Paediatric peri‐operative cardiac arrests accounted for 104 (12%) reports giving an incidence of 3 in 10,000 anaesthetics (95%CI 2.2–3.3 per 10,000). The incidence of peri‐operative cardiac arrest was highest in neonates (27, 26%), infants (36, 35%) and children with congenital heart disease (44, 42%) and most reports were from tertiary centres (88, 85%). Frequent precipitants of cardiac arrest in non‐cardiac surgery included: severe hypoxaemia (20, 22%); bradycardia (10, 11%); and major haemorrhage (9, 8%). Cardiac tamponade and isolated severe hypotension featured prominently as causes of cardiac arrest in children undergoing cardiac surgery or cardiological procedures. Themes identified at review included: inappropriate choices and doses of anaesthetic drugs for intravenous induction; bradycardias associated with high concentrations of volatile anaesthetic agent or airway manipulation; use of atropine in the place of adrenaline; and inadequate monitoring. Overall quality of care was judged by the panel to be good in 64 (62%) cases, which compares favourably with adults (371, 52%). The study provides insight into paediatric anaesthetic practice, complications and peri‐operative cardiac arrest
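The quoted incidence can be checked from the counts in the abstract: 104 arrests over an estimated 390,000 anaesthetics, with a simple Poisson-based interval attached. The sketch below uses a normal approximation to the Poisson count, which is an assumption made here; the exact interval method used by the audit is not stated in the abstract.

```python
import math

def rate_per_10000_with_ci(events, denom):
    """Crude incidence per 10,000 with an approximate 95% CI,
    treating the event count as Poisson (normal approximation)."""
    rate = events / denom * 10_000
    se = math.sqrt(events)  # Poisson SE of the count
    lo = (events - 1.96 * se) / denom * 10_000
    hi = (events + 1.96 * se) / denom * 10_000
    return rate, lo, hi

rate, lo, hi = rate_per_10000_with_ci(104, 390_000)
print(f"{rate:.1f} per 10,000 (95% CI {lo:.1f}-{hi:.1f})")  # ~2.7 (2.2-3.2)
```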

    Population screening for colorectal cancer: the implications of an ageing population

    Population screening for colorectal cancer (CRC) has recently commenced in the United Kingdom, supported by the evidence of a number of randomised trials and pilot studies. Certain factors are known to influence screening cost-effectiveness (e.g. compliance), but it remains unclear whether an ageing population (i.e. demographic change) might also have an effect. The aim of this study was to simulate a population-based screening setting using a Markov model and assess the effect of increasing life expectancy on CRC screening cost-effectiveness. A Markov model was constructed that aimed, using a cohort simulation, to estimate the cost-effectiveness of CRC screening in an England and Wales population for two timescales: 2003 (early cohort) and 2033 (late cohort). Four model outcomes were calculated: screened and non-screened cohorts in 2003 and 2033. The screened cohort of men and women aged 60 years was offered biennial unhydrated faecal occult blood testing until the age of 69 years. Life expectancy was assumed to increase by 2.5 years per decade. There were 407 552 fewer people entering the 2033 model due to a lower birth cohort, and population screening saw 30 345 fewer CRC-related deaths over the 50 years of the model. Screening the 2033 cohort cost £96 million, with cost savings of £43 million in detection and treatment and £28 million in palliative care costs. After 30 years of follow-up, the cost per life year saved was £1544. An identical screening programme in the early cohort (2003) gave a cost per life year saved of £1651. Population screening for CRC is costly but enables cost savings in certain areas and a considerable reduction in mortality from CRC. This Markov simulation suggests that the cost-effectiveness of population screening for CRC in the United Kingdom may actually be improved by rising life expectancies.
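The headline cost-effectiveness ratio can be unpacked from the figures quoted above. The paper's estimate comes from a full Markov cohort simulation; the sketch below only reproduces the summary arithmetic (net programme cost divided by life years gained) and backs out the implied life years from the reported £1544 per life year saved.

```python
# Headline arithmetic behind a cost-per-life-year-saved figure, using the
# 2033-cohort numbers quoted in the abstract. The paper itself uses a full
# Markov cohort simulation; this only reproduces the summary ratio.

screening_cost    = 96e6  # £96 million programme cost
treatment_saving  = 43e6  # £43 million saved on detection and treatment
palliative_saving = 28e6  # £28 million saved on palliative care

net_cost = screening_cost - treatment_saving - palliative_saving  # £25 million

# Cost-effectiveness ratio = net cost / life years gained.
# The abstract reports £1544 per life year saved, which implies roughly:
reported_cost_per_ly = 1544
implied_life_years = net_cost / reported_cost_per_ly
print(f"Net cost: £{net_cost:,.0f}")
print(f"Implied life years gained: {implied_life_years:,.0f}")  # ~16,200
```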

    Assessing the role of EO in biodiversity monitoring: options for integrating in-situ observations with EO within the context of the EBONE concept

    The European Biodiversity Observation Network (EBONE) is a European contribution on terrestrial monitoring to GEO BON, the Group on Earth Observations Biodiversity Observation Network. EBONE’s aim is to develop a system of biodiversity observation at regional, national and European levels by assessing existing approaches in terms of their validity and applicability, starting in Europe and then expanding to regions in Africa. The objective of EBONE is to deliver: 1. A sound scientific basis for the production of statistical estimates of stock and change of key indicators; 2. The development of a system for estimating past changes and forecasting and testing policy options and management strategies for threatened ecosystems and species; 3. A proposal for a cost-effective biodiversity monitoring system. There is a consensus that Earth Observation (EO) has a role to play in monitoring biodiversity. Our instinct suggests that, with its capacity to observe detailed spatial patterns and variability across large areas at regular intervals, EO could deliver the type of spatial and temporal coverage that is beyond reach with in-situ efforts. Furthermore, when considering the emerging networks of in-situ observations, the prospect of enhancing the quality of the information whilst reducing cost through integration is compelling. This report gives a realistic assessment of the role of EO in biodiversity monitoring and the options for integrating in-situ observations with EO within the context of the EBONE concept (cf. EBONE-ID1.4). The assessment is mainly based on a set of targeted pilot studies. Building on this assessment, the report then presents a series of recommendations on the best options for using EO in an effective, consistent and sustainable biodiversity monitoring scheme. The issues that we faced were many: 1. Integration can be interpreted in different ways. One possible interpretation is the combined use of independent data sets to deliver a different but improved data set; another is the use of one data set to complement another. 2. The targeted improvement will vary with stakeholder group: some will seek more efficiency, others more reliable estimates (accuracy and/or precision), others more detail in space and/or time, or more of everything. 3. Integration requires a link between the datasets (EO and in-situ). The strength of the link between reflected electromagnetic radiation and the habitats and their biodiversity observed in-situ is a function of many variables, for example: the spatial scale of the observations; the timing of the observations; the adopted nomenclature for classification; the complexity of the landscape in terms of composition, spatial structure and the physical environment; and the habitat and land cover types under consideration. 4. The type of EO data available varies (as a function of, e.g., budget, size and location of the region, cloudiness, and national and/or international investment in airborne campaigns or space technology), which determines its capability to deliver the required output. EO and in-situ data could be combined in different ways, depending on the type of integration we wanted to achieve and the targeted improvement. We aimed for an improvement in accuracy (i.e. the reduction in error of our indicator estimate calculated for an environmental zone). Furthermore, EO would also provide the spatial patterns for correlated in-situ data.
EBONE, in its initial development, focused on three main indicators covering: (i) the extent and change of habitats of European interest in the context of a general habitat assessment; (ii) abundance and distribution of selected species (birds, butterflies and plants); and (iii) fragmentation of natural and semi-natural areas. For habitat extent, we decided that it did not matter how in-situ data were integrated with EO as long as we could demonstrate that acceptable accuracies could be achieved and the precision could consistently be improved. The nomenclature used to map habitats in-situ was the General Habitat Classification. We considered the following options, in which EO and in-situ data play different roles: using in-situ samples to re-calibrate a habitat map independently derived from EO; improving the accuracy of in-situ sampled habitat statistics by post-stratification with correlated EO data; and using in-situ samples to train the classification of EO data into habitat types where the EO data deliver full coverage or a larger number of samples. For some of the above cases we also considered the impact that the sampling strategy employed to deliver the samples would have on the accuracy and precision achieved. Restricted access to Europe-wide species data prevented work on the indicator ‘abundance and distribution of species’. With respect to the indicator ‘fragmentation’, we investigated ways of delivering EO-derived measures of habitat patterns that are meaningful to sampled in-situ observations.
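One of the integration options listed above, improving the accuracy of in-situ sampled habitat statistics by post-stratification with correlated EO data, can be illustrated with a short sketch. The estimator below is a standard post-stratified mean; the habitat classes, areal weights and sample values are invented purely for illustration and are not taken from EBONE data.

```python
import numpy as np

def post_stratified_mean(values, strata, stratum_weights):
    """Post-stratified estimate of a mean: average the in-situ sample values
    within each EO-derived stratum, weighted by the stratum's known areal share."""
    values = np.asarray(values, dtype=float)
    strata = np.asarray(strata)
    estimate = 0.0
    for stratum, weight in stratum_weights.items():
        in_stratum = values[strata == stratum]
        if in_stratum.size == 0:
            raise ValueError(f"no in-situ samples fall in stratum {stratum!r}")
        estimate += weight * in_stratum.mean()
    return estimate

# Hypothetical example: habitat extent (ha per sample square) observed in-situ,
# stratified by an EO-derived land cover class with known areal shares of the zone.
values  = [12.0, 15.0, 14.0, 3.0, 2.0, 4.0]
strata  = ["woodland", "woodland", "woodland", "arable", "arable", "arable"]
weights = {"woodland": 0.3, "arable": 0.7}   # areal shares from the EO map
print(post_stratified_mean(values, strata, weights))  # ~6.2
```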

    Human blood autoantibodies in the detection of colorectal cancer

    Colorectal cancer (CRC) is the second most common malignancy in the western world. Early detection and diagnosis of all cancer types are vital to improved prognosis by enabling early treatment when tumours should be both resectable and curable. Sera from 3 different cohorts (42 sera from New York, USA: 21 CRC and 21 matched controls; 200 sera from Pittsburgh, USA: 100 CRC and 100 controls; and 20 sera from Dundee, UK: 10 CRC and 10 controls) were tested against a panel of multiple tumour-associated antigens (TAAs) using an optimised multiplex microarray system. TAA-specific IgG responses were interpolated against the internal IgG standard curve for each sample. Individual TAA-specific responses were examined in each cohort to determine cutoffs for a robust initial scoring method to establish sensitivity and specificity. Sensitivity and specificity of combinations of TAAs provided good discrimination between cancer-positive and normal serum. The overall sensitivity and specificity of the sample sets tested against a panel of 32 TAAs were 61.1% and 80.9%, respectively, for 6 antigens: p53, AFP, KRAS, Annexin, RAF1 and NY-CO-16. Furthermore, the observed sensitivity in the Pittsburgh sample set in different clinical stages of CRC (stage I, n = 19; stage II, n = 40; stage III, n = 34; and stage IV, n = 6) was similar (73.6%, 75.0%, 73.5% and 83.3%, respectively), with similar levels of sensitivity for right- and left-sided CRC. We identified an antigen panel of sufficient sensitivity and specificity for early detection of CRC, based upon serum profiling of autoantibody response using a robust multiplex antigen microarray technology. This opens the possibility of a blood test for screening and detection of early colorectal cancer. However, this panel will require further validation studies before it can be proposed for clinical practice.
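The abstract describes deriving per-antigen cutoffs and then scoring combinations of TAAs for sensitivity and specificity. The exact scoring rule is not given, so the sketch below assumes a simple "panel-positive if any antigen response exceeds its cutoff" rule; the antigen names in the toy data and all numbers are illustrative only.

```python
import numpy as np

def panel_sensitivity_specificity(responses, labels, cutoffs):
    """Score a sample as panel-positive if any antigen response exceeds its cutoff,
    then compute sensitivity and specificity against known cancer/control labels.
    (An 'any antigen positive' rule is assumed here for illustration.)"""
    responses = np.asarray(responses, dtype=float)   # shape: samples x antigens
    labels = np.asarray(labels, dtype=bool)          # True = CRC, False = control
    panel_positive = (responses > np.asarray(cutoffs)).any(axis=1)
    sensitivity = panel_positive[labels].mean()
    specificity = (~panel_positive[~labels]).mean()
    return float(sensitivity), float(specificity)

# Hypothetical toy data: 4 samples x 3 antigens (e.g. p53, AFP, KRAS), arbitrary units
responses = [[1.2, 0.1, 0.3],   # CRC
             [0.2, 0.9, 0.1],   # CRC
             [0.1, 0.1, 0.2],   # control
             [0.3, 0.2, 0.4]]   # control
labels  = [True, True, False, False]
cutoffs = [0.8, 0.8, 0.8]
print(panel_sensitivity_specificity(responses, labels, cutoffs))  # (1.0, 1.0)
```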

    Slow development of woodland vegetation and bird communities during 33 years of passive rewilding in open farmland

    Passive rewilding is a potential tool for expanding woodland cover and restoring biodiversity by abandoning land management and allowing natural vegetation succession to occur. Land can be abandoned to passive rewilding deliberately or because of socio-economic change. Despite abandonment being a major driver of land use change, few studies have examined the long-term outcomes for vegetation and biodiversity in Western Europe, and those that exist are biased towards sites that are close to seed sources and favourable to woodland colonisation. In this case study, we reconstruct a time series of passive rewilding over 33 years on 25 ha of former farmland that had been subject to soil tipping, far from woodland seed sources. Natural colonisation by shrubs and trees was surveyed at three points during the time series, using field mapping and lidar. Breeding birds were surveyed at three time points and compared with surveys from nearby farmland. Results showed that natural colonisation of woody vegetation was slow, with open grassland dominating the old fields for two decades and small wetlands developing spontaneously. After 33 years, thorny shrub thickets covered 53% of the site and former hedgerows became subsumed or degraded, but trees remained scarce. However, the resulting habitat mosaic of shrubland, grassland and wetland supported a locally distinctive bird community. Farmland bird species declined as passive rewilding progressed, but this was countered by relatively more wetland birds and an increase in woodland birds, particularly songbirds, compared with nearby farmland. Alongside biodiversity benefits, shrubland establishment by passive rewilding could potentially provide ecosystem services via abundant blossom resources for pollinators, and recreation and berry-gathering opportunities for people. Although closed-canopy woodland remained a distant prospect even after 33 years, the habitat mosaic arising from passive rewilding could be considered a valuable outcome, which could contribute to nature recovery and provision of ecosystem services.

    Optimal treatment duration of glyceryl trinitrate for chronic anal fissure: results of a prospective randomized multicenter trial

    Background: Chronic anal fissure (CAF) is a painful condition that is unlikely to resolve with conventional conservative management. Previous studies have reported that topical treatment of CAF with glyceryl trinitrate (GTN) reduces pain and promotes healing, but the optimal treatment duration is unknown. Methods: To assess the effect of different treatment durations on CAF, we designed a prospective randomized trial comparing 40 versus 80 days of twice-daily topical 0.4% GTN treatment (Rectogesic®, Prostrakan Group). Chronicity was defined by the presence of both morphological criteria (fibrosis, skin tag, exposed sphincter, hypertrophied anal papilla) and time criteria (symptoms present for more than 2 months, or pain of shorter duration but similar episodes in the past). A gravity score (1 = no visible sphincter; 2 = visible sphincter; 3 = visible sphincter and fibrosis) was used at baseline. Fissure healing (the primary endpoint of the study), maximum pain at defecation measured with a visual analogue scale (VAS) and maximum anal resting pressure were assessed at baseline and at 14, 28, 40 and 80 days. Data were gathered at the end of the assigned treatment. Results: Of 188 patients with chronic fissure, 96 were randomized to the 40-day group and 92 to the 80-day group. Patients were well matched for sex, age, VAS and fissure score. There were 34 (19%) patients who did not complete treatment, 18 (10%) because of side effects. Of 154 patients who completed treatment, 90 (58%) had their fissures healed and 105 (68%) were pain free. There was no difference in healing or symptoms between the 40- and the 80-day groups. There was no predictor of fissure healing. A low fissure gravity score correlated with increased resolution of pain (P < 0.05) and improvement of the VAS score (P < 0.05) on both univariate and multivariate analysis. A lower baseline resting pressure was associated with better pain resolution on univariate analysis (P < 0.01). VAS at defecation and fissure healing significantly improved until 40 days (P < 0.001), while the difference between 40 and 80 days was not significant. Conclusion: We found no benefit in treating CAF with topical GTN for 80 days compared with 40 days. Fissure healing and VAS improvement continue until 6 weeks of treatment but are unlikely thereafter. © 2010 Springer-Verlag