
    Evaluation of the accuracy of serum MMP-9 as a test for colorectal cancer in a primary care population

    Background Bowel cancer is common and is a major cause of death. Meta-analysis of randomised controlled trials estimates that screening for colorectal cancer using the faecal occult blood (FOB) test reduces mortality from colorectal cancer by 16%. However, FOB testing has a low positive predictive value, with associated unnecessary cost, risk and anxiety from subsequent investigation, and is unacceptable to a proportion of the target population. Increased levels of an enzyme called matrix metalloproteinase 9 (MMP-9) have been found to be associated with colorectal cancer, and this can be measured from a blood sample. Serum MMP-9 is potentially an accurate, low-risk and cost-effective population screening tool. This study aims to evaluate the accuracy of serum MMP-9 as a test for colorectal cancer in a primary care population. Methods/Design People aged 50 to 69 years who are registered with participating general practices in the West Midlands Region will be asked to complete a questionnaire about symptoms. Respondents who describe any colorectal symptoms (except only abdominal bloating and/or anal symptoms) and who are prepared to provide a blood sample for MMP-9 estimation and undergo a colonoscopy (the current gold standard investigation) will be recruited at GP-based clinics by a research nurse. Those unfit for colonoscopy will be excluded. Colonoscopies will be undertaken in dedicated research clinics. The accuracy of MMP-9 will be assessed by comparing the MMP-9 level with the colonoscopy findings, and the combination of factors (e.g. symptoms and MMP-9 level) that best predicts a diagnosis of malignancy (invasive disease or polyps) will be determined. Discussion Colorectal cancer is a major cause of morbidity and mortality. Most colorectal cancers arise from adenomas, so there is a window for early detection by screening, but available tests have risks, are unacceptable to many, have high false positive rates or are expensive. This study will establish the potential of serum MMP-9 as a screening test for colorectal cancer. If it is confirmed as accurate and acceptable, this serum marker has the potential to assist with reducing the morbidity and mortality from colorectal cancer.
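    As a sketch of how the test's accuracy against the colonoscopy reference standard can be summarised, the snippet below computes sensitivity, specificity and predictive values from a 2x2 table. The counts and the idea of a single MMP-9 positivity threshold are illustrative assumptions, not study data or the study's analysis plan.

```python
# Illustrative only: hypothetical 2x2 counts of MMP-9 positivity (at an assumed
# threshold) cross-tabulated against the colonoscopy reference standard.
true_pos, false_pos = 40, 160    # MMP-9 positive: with / without malignancy
false_neg, true_neg = 10, 790    # MMP-9 negative: with / without malignancy

sensitivity = true_pos / (true_pos + false_neg)   # P(test positive | disease)
specificity = true_neg / (true_neg + false_pos)   # P(test negative | no disease)
ppv = true_pos / (true_pos + false_pos)           # P(disease | test positive)
npv = true_neg / (true_neg + false_neg)           # P(no disease | test negative)

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f}")
```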

    Effects of Tillage and Residue Management on Soil Organic Carbon and Total Nitrogen in the North China Plain

    Funding: Chinese Academy of Sciences (XDA050500001, KSCX1-YW-09-06); Ministry of Science and Technology of China (2004CB720501). A suitable tillage-residue management system is needed in the North China Plain (NCP) that sustains soil fertility and agronomic productivity. The objectives of this study were to determine the effects of different tillage-residue managements for a winter wheat (Triticum aestivum L.) and summer maize (Zea mays L.) double-crop system on soil organic carbon (SOC) and total N pools. No-tillage with residue cover (NTR), no-tillage with residue removed and manure applied (NTRRM), and conventional tillage with residue removed (CTRR) were investigated for 6 yr, based on a uniform N application among treatments. Soil samples were collected at six depths and changes in SOC and total N pools were analyzed. The NTRRM and NTR treatments sequestered more SOC and total N in the 0- to 5-cm depth than CTRR. In the subsoil (5-60 cm), annual SOC sequestration was 0.01 and -0.40 Mg ha⁻¹ yr⁻¹ for NTRRM and NTR, respectively, while CTRR exhibited a significantly positive SOC pool trend. Over the whole soil profile (0-60 cm), NTRRM, NTR, and CTRR sequestered SOC at rates of 0.66, 0.27, and 2.24 Mg ha⁻¹ yr⁻¹, respectively. When manure was applied to substitute for the N lost through residue removal, NTRRM tended to accumulate more SOC than NTR, and was similar to NTR in total N pools, grain yield, and aboveground biomass. Crop residue could therefore be substituted by manure in this double-crop, irrigated system. Conventional tillage with residue removed also remained suitable in terms of soil fertility and agronomic productivity relative to NTRRM and NTR in the NCP.
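    For readers unfamiliar with the units, the sketch below shows the standard arithmetic behind a SOC pool (concentration x bulk density x layer depth) and an annual sequestration rate (change in pool divided by duration). The concentration, bulk density and pool values are invented for illustration and are not measurements from this experiment.

```python
# Standard conversion: SOC pool (Mg C ha^-1) =
#   concentration (g kg^-1) * bulk density (g cm^-3) * depth (cm) * 0.1
soc_conc = 8.5        # g C per kg soil (illustrative)
bulk_density = 1.35   # g cm^-3 (illustrative)
depth_cm = 20         # layer thickness
soc_pool = soc_conc * bulk_density * depth_cm * 0.1
print(f"SOC pool in the layer: {soc_pool:.1f} Mg C ha^-1")

# An annual sequestration rate is the change in the pool over the study period.
soc_pool_start, soc_pool_end, years = 42.0, 46.0, 6   # illustrative 0-60 cm pools
rate = (soc_pool_end - soc_pool_start) / years
print(f"annual SOC sequestration rate: {rate:.2f} Mg ha^-1 yr^-1")
```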

    A prospective study to assess the value of MMP-9 in improving the appropriateness of urgent referrals for colorectal cancer

    Background Bowel cancer is common and is a major cause of death. Most people with bowel symptoms who meet the criteria for urgent referral to secondary care will not be found to have bowel cancer, and some people who are found to have cancer will have been referred routinely rather than urgently. If general practitioners could better identify people who are likely to have bowel cancer, or conditions that may lead to bowel cancer, the pressure on hospital clinics might be reduced, enabling these patients to be seen more quickly. Increased levels of an enzyme called matrix metalloproteinase 9 (MMP-9) have been found to be associated with such conditions, and this can be measured from a blood sample. This study aims to find out whether measuring MMP-9 levels could improve the appropriateness of urgent referrals for patients with bowel symptoms. Methods People aged 18 years or older referred to a colorectal clinic will be asked to complete a questionnaire about symptoms, recent injuries or chronic illnesses (these can increase the level of matrix metalloproteinases) and family history of bowel cancer. A blood sample will be taken from people who consent to take part to assess MMP-9 levels, and the results of examination at the clinic and/or investigations arising from the clinic visit will be collected from hospital records. The accuracy of MMP-9 will be assessed by comparing the MMP-9 level with the resulting diagnosis. The combination of factors (e.g. symptoms and MMP-9 level) that best predicts a diagnosis of malignancy (invasive disease or polyps) will be determined. Discussion Although guidelines are in place to facilitate referrals to colorectal clinics, symptoms alone do not adequately distinguish people with malignancy from people with benign conditions. This study will establish whether MMP-9 could assist this process. If so, measurement of MMP-9 levels could be used by general practitioners to help identify the people who are most likely to have bowel cancer, or conditions that may lead to bowel cancer, and who should therefore be referred most urgently to secondary care.
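    A minimal sketch, on synthetic data, of the kind of multivariable model such a protocol points towards: combining the MMP-9 level with symptom indicators to predict a malignant finding. The variable names, coefficients and the sklearn-based workflow are illustrative assumptions, not the study's analysis plan.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
mmp9 = rng.lognormal(mean=5.0, sigma=0.5, size=n)   # synthetic serum MMP-9 levels
rectal_bleeding = rng.integers(0, 2, size=n)        # synthetic symptom flag
change_in_bowel_habit = rng.integers(0, 2, size=n)  # synthetic symptom flag

# Synthetic outcome loosely tied to the predictors (illustrative coefficients).
logit = -6 + 0.02 * mmp9 + 1.0 * rectal_bleeding + 0.7 * change_in_bowel_habit
malignancy = rng.random(n) < 1 / (1 + np.exp(-logit))

# Fit a multivariable logistic regression combining marker and symptoms.
X = np.column_stack([mmp9, rectal_bleeding, change_in_bowel_habit])
model = LogisticRegression(max_iter=1000).fit(X, malignancy)
print("coefficients:", model.coef_.round(3), "intercept:", model.intercept_.round(3))
```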

    A prospective cohort study of long-term cognitive changes in older Medicare beneficiaries

    Background Promoting cognitive health and preventing its decline are longstanding public health goals, but long-term changes in cognitive function are not well documented. Therefore, we first examined long-term changes in cognitive function among older Medicare beneficiaries in the Survey on Assets and Health Dynamics among the Oldest Old (AHEAD), and then identified the risk factors associated with those changes in cognitive function. Methods We conducted a secondary analysis of a prospective, population-based cohort using baseline (1993-1994) interview data linked to 1993-2007 Medicare claims to examine cognitive function at the final follow-up interview, which occurred between 1995-1996 and 2006-2007. Besides traditional risk factors (i.e., aging, age, race, and education) and adjustment for baseline cognitive function, we considered the reason for censoring (entrance into managed care or death), post-baseline continuity of care, and major health shocks (hospital episodes). Residual change score multiple linear regression analysis was used to predict cognitive function at the final follow-up using data from telephone interviews among 3,021 to 4,251 (sample size varied by cognitive outcome) baseline community-dwelling self-respondents who were ≥ 70 years old, not in managed Medicare, and had at least one follow-up interview as self-respondents. Cognitive function was assessed using the 7-item Telephone Interview for Cognitive Status (TICS-7; general mental status) and the 10-item immediate and delayed (episodic memory) word recall tests. Results Mean changes in the number of correct responses on the TICS-7 and the 10-item immediate and delayed word recall tests were -0.33, -0.75, and -0.78, with 43.6%, 54.9%, and 52.3% declining and 25.4%, 20.8%, and 22.9% unchanged. The main and most consistent risks for declining cognitive function were the baseline values of cognitive function (reflecting substantial regression to the mean), aging (a strong linear pattern of increased decline associated with greater aging, but with diminishing marginal returns), older age at baseline, dying before the end of the study period, lower education, and minority status. Conclusions In addition to aging, age, minority status, and low education, substantial and differential risks for cognitive change were associated with sooner vs. later subsequent death, which helps to clarify the terminal drop hypothesis. No readily modifiable protective factors were identified.
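    A minimal sketch of a residual change score regression on synthetic data: the follow-up score is regressed on its baseline value plus covariates, so the covariate coefficients describe change net of baseline (and hence net of regression to the mean). The variable names, scales and coefficients are illustrative, not values from AHEAD.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 1000
baseline_score = rng.normal(20, 3, size=n)      # baseline cognitive score (arbitrary scale)
age = rng.uniform(70, 90, size=n)               # age at baseline
education_years = rng.integers(6, 18, size=n)   # years of schooling

# Synthetic follow-up score loosely tied to the predictors.
followup_score = (0.6 * baseline_score - 0.05 * (age - 70)
                  + 0.1 * education_years + rng.normal(0, 2, size=n))

# Residual change score model: follow-up regressed on baseline plus covariates.
X = np.column_stack([baseline_score, age, education_years])
fit = LinearRegression().fit(X, followup_score)
print(dict(zip(["baseline", "age", "education"], fit.coef_.round(3))))
```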

    Increasing the frequency of hand washing by healthcare workers does not lead to commensurate reductions in staphylococcal infection in a hospital ward

    Hand hygiene is generally considered to be the most important measure that can be applied to prevent the spread of healthcare-associated infection (HAI). Continuous emphasis on this intervention has led to the widespread opinion that HAI rates can be greatly reduced by increased hand hygiene compliance alone. However, this assumes that the effectiveness of hand hygiene is not constrained by other factors and that improved compliance above a given level will, in itself, result in a commensurate reduction in the incidence of HAI. Several researchers have found the law of diminishing returns to apply to hand hygiene, with the greatest benefits occurring in the first 20% or so of compliance, and others have demonstrated that poor cohorting of nursing staff profoundly influences the effectiveness of hand hygiene measures. Collectively, these findings raise intriguing questions about the extent to which increasing compliance alone can further reduce rates of HAI. In order to investigate these issues further, we constructed a deterministic Ross-Macdonald model and applied it to a hypothetical general medical ward. In this model the transmission of staphylococcal infection was assumed to occur after contact with the transiently colonized hands of healthcare workers (HCWs), who, in turn, acquire contamination only by touching colonized patients. The aim of the study was to evaluate the impact of imperfect hand cleansing on the transmission of staphylococcal infection and to identify whether there is a limit above which further hand hygiene compliance is unlikely to be of benefit. The model demonstrated that if transmission is solely via the hands of HCWs, it should, under most circumstances, be possible to prevent outbreaks of staphylococcal infection at hand cleansing frequencies <50%, even with imperfect hand hygiene. The analysis also indicated that the relationship between hand cleansing efficacy and frequency is not linear: as efficacy decreases, the hand cleansing frequency required to ensure R0 < 1 increases disproportionately. Although our study confirmed hand hygiene to be an effective control measure, it demonstrated that the law of diminishing returns applies, with the greatest benefit derived from the first 20% or so of compliance. Indeed, our analysis suggests that there is little benefit to be accrued from very high levels of hand cleansing and that, in most situations, compliance >40% should be enough to prevent outbreaks of staphylococcal infection, if transmission is solely via the hands of HCWs. Furthermore, we identified a non-linear relationship between hand cleansing efficacy and frequency, suggesting that it is important to maximise the efficacy of the hand cleansing process.
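    To make the diminishing-returns argument concrete, here is a much-simplified, hypothetical parameterisation of a Ross-Macdonald-style ward model in which HCW hands act as the vector. The contact rates, probabilities and efficacy below are invented, and the R0 expression follows from the simple two-compartment structure described in the comments rather than from the authors' model.

```python
# Hypothetical two-population (patient / HCW-hand) parameters: patients are
# colonised only via contaminated HCW hands, and hands are contaminated only
# by touching colonised patients.
N_p, N_h = 20, 10      # patients and HCWs on the ward
contacts = 5.0         # patient contacts per HCW per day
p_hand = 0.2           # P(hands contaminated | contact with colonised patient)
p_col = 0.05           # P(patient colonised | contact with contaminated hands)
mu = 0.1               # patient removal rate (discharge/decolonisation) per day
wash_eff = 0.8         # fraction of contamination removed per hand cleansing

# One colonised patient contaminates contacts*p_hand*N_h/(N_p*mu) pairs of hands
# over its stay; each contaminated pair of hands colonises contacts*p_col/kappa
# patients before being cleansed, where kappa is the decontamination rate.
for compliance in (0.2, 0.4, 0.6, 0.8):
    kappa = contacts * compliance * wash_eff
    R0 = (contacts * p_hand * N_h / (N_p * mu)) * (contacts * p_col / kappa)
    status = "sustained spread possible" if R0 > 1 else "outbreak dies out"
    print(f"compliance {compliance:.0%}: R0 = {R0:.2f} ({status})")
```

    With these made-up numbers the threshold falls a little above 30% compliance, and because R0 scales with 1/compliance the marginal benefit of each further increase shrinks, which is the diminishing-returns behaviour the abstract describes.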

    Changes over time in the "healthy soldier effect"

    Background: Death rates in military populations outside of combat are often lower than those in the general population. This study considers how this "healthy soldier effect" changes over time. Methods: Standardized mortality ratios were used to compare changes in death rates relative to the Australian population in two large studies of Australian servicemen of the Korean War (n = 17,381) and the Vietnam War era (n = 83,908). Results: The healthy soldier effect was most consistently observed in deaths from circulatory diseases. A large deficit in these deaths in the initial follow-up period (10-20 years) was observed before rates tended to rise to the level seen in the general population. There was no healthy soldier effect in deaths from external causes in enlisted personnel, and these death rates were significantly higher than expected in the initial follow-up period among Korean War veterans and regular Army veterans of the Vietnam War. Those selected for national service during the Vietnam War exhibited the strongest healthy soldier effect of all cohorts assessed. Conclusions: Patterns of the healthy soldier effect over time varied markedly by study cohort and by cause of death studied. In a number of analyses, the healthy soldier effect was still apparent after more than 30 years of follow-up.
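    As a reminder of the quantity being compared, the standardized mortality ratio contrasts observed cohort deaths with the number expected from applying general-population rates to the cohort's person-years. The worked numbers below are invented for illustration, not results from this study.

```latex
% O = observed deaths in the veteran cohort; E = expected deaths obtained by
% applying Australian age- and period-specific rates lambda_s to the cohort's
% person-years in each stratum s.
\[
  \mathrm{SMR} \;=\; \frac{O}{E},
  \qquad
  E \;=\; \sum_{s} \bigl(\text{person-years}\bigr)_{s}\,\lambda_{s}^{\text{pop}} .
\]
% Illustrative example: O = 80 circulatory-disease deaths against E = 100
% expected gives SMR = 0.80, a 20% deficit of the kind described as the
% healthy soldier effect.
```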

    Long-term declines in ADLs, IADLs, and mobility among older Medicare beneficiaries

    Background Most prior studies have focused on short-term (≤ 2 years) functional declines, but such studies cannot address aging effects inasmuch as all participants have aged the same amount. Therefore, the authors studied the extent of long-term functional decline in older Medicare beneficiaries who were followed for varying lengths of time, and also identified the risk factors associated with those declines. Methods The analytic sample included 5,871 self- or proxy-respondents who had complete baseline and follow-up survey data that could be linked to their Medicare claims for 1993-2007. Functional status was assessed using activities of daily living (ADLs), instrumental ADLs (IADLs), and mobility limitations, with declines defined as the development of two or more new difficulties. Multiple logistic regression analysis was used to focus on the associations involving respondent status, health lifestyle, continuity of care, managed care status, health shocks, and terminal drop. Results The average amount of time between the first and final interviews was 8.0 years. Declines were observed for 36.6% on ADL abilities, 32.3% on IADL abilities, and 30.9% on mobility abilities. Functional decline was more likely to occur when proxy-reports were used, and the effects of baseline function on decline were reduced when proxy-reports were used. Engaging in vigorous physical activity consistently and substantially protected against functional decline, whereas obesity, cigarette smoking, and alcohol consumption were only associated with mobility declines. Post-baseline hospitalizations were the most robust predictors of functional decline, exhibiting a dose-response effect such that the greater the average annual number of hospital episodes, the greater the likelihood of functional status decline. Participants whose final interview preceded their death by one year or less had substantially greater odds of functional status decline. Conclusions Both the additive and interactive (with functional status) effects of respondent status should be taken into consideration whenever proxy-reports are used. Encouraging exercise could broadly reduce the risk of functional decline across all three outcomes, although interventions encouraging weight reduction and smoking cessation would only affect mobility declines. Reducing hospitalization and re-hospitalization rates could also broadly reduce the risk of functional decline across all three outcomes.
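    A small sketch of how the decline outcome in this kind of design can be coded: a decline is counted when two or more difficulties appear at follow-up that were not present at baseline. The function name and the example difficulty sets are illustrative, not the authors' code or item lists.

```python
# Decline = at least two *new* difficulties at follow-up relative to baseline
# (illustrative coding of the outcome definition quoted in the abstract).
def declined(baseline: set[str], followup: set[str]) -> bool:
    new_difficulties = followup - baseline
    return len(new_difficulties) >= 2

# Example: one difficulty at baseline, three at follow-up -> two new -> decline.
print(declined({"bathing"}, {"bathing", "dressing", "walking"}))  # True
```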

    Clinical course, costs and predictive factors for response to treatment in carpal tunnel syndrome: The PALMS study protocol

    Background Carpal tunnel syndrome (CTS) is the most common neuropathy of the upper limb and a significant contributor to hand functional impairment and disability. Effective treatment options include conservative and surgical interventions; however, it is not possible at present to predict the outcome of treatment. The primary aim of this study is to identify which baseline clinical factors predict a good outcome from conservative treatment (by injection) or surgery in patients diagnosed with carpal tunnel syndrome. Secondary aims are to describe the clinical course and progression of CTS, and to describe and predict the UK cost of CTS to the individual, the National Health Service (NHS) and society over a two-year period. Methods/Design In this prospective observational cohort study, patients presenting with clinical signs and symptoms typical of CTS, and in whom the diagnosis is confirmed by nerve conduction studies, are invited to participate. Data on putative predictive factors are collected at baseline and follow-up through patient questionnaires and include standardised measures of symptom severity, hand function, psychological and physical health, comorbidity and quality of life. Resource use and costs over the two-year period, such as prescribed medications and NHS and private healthcare contacts, are also collected through patient self-report at 6, 12, 18 and 24 months. The primary outcome used to classify treatment success or failure will be a 5-point global assessment of change. Secondary outcomes include changes in clinical symptoms, functioning, psychological health, quality of life and resource use. A multivariable model of the factors which predict outcome and cost will be developed. Discussion This prospective cohort study will provide important data on the clinical course and UK costs of CTS over a two-year period and begin to identify predictive factors for treatment success from conservative and surgical interventions.

    Cavity formation on the surface of a body entering water with deceleration

    The two-dimensional water entry of a rigid symmetric body, with account taken of cavity formation on the body surface, is studied. Initially the liquid is at rest and occupies the lower half-plane. The rigid symmetric body touches the liquid free surface at a single point and then suddenly starts to penetrate the liquid vertically with a time-varying speed. We study the effect of the body's deceleration on the pressure distribution in the flow region. It is shown that, in addition to the high pressures expected from the theory of impact, the pressure on the body surface can later decrease to sub-atmospheric levels. The creation of a cavity due to such low pressures is considered. The cavity starts at the lowest point of the body and spreads along the body surface, forming a thin space between a new free surface and the body. Within the linearised hydrodynamic problem, the positions of the two turnover points at the periphery of the wetted area are determined by Wagner's condition. The ends of the cavity's free surface are modelled by the Brillouin-Villat condition. The pressure in the cavity is assumed to be a prescribed constant, which is a parameter of the model. The hydrodynamic problem is reduced to a system of integral and differential equations in several functions of time. Results are presented for constant deceleration of two body shapes: a parabola and a wedge. The general formulation also embraces conditions where the body is free to decelerate under the total fluid force. Contrasts are drawn between results from the present model and a simpler model in which cavity formation is suppressed. It is shown that the expansion of the cavity can be significantly slower than the expansion of the corresponding zone of sub-atmospheric pressure in the simpler model. For forced motion and cavity pressure close to atmospheric, the cavity grows until almost complete detachment of the fluid from the body. In the problem of free motion of the body, cavitation with vapour pressure in the cavity is achievable only for extremely large impact velocities.
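    For context on the Wagner condition mentioned above, the sketch below states its common linearised integral form and works it through for a parabolic contour. This is the classical textbook form and the parabola example is standard; it is not taken from this paper's formulation.

```latex
% h(t) = penetration depth, c(t) = half-width of the wetted area,
% y = f(x) = body shape (symmetric, f(0) = 0). Classical Wagner condition:
\[
  h(t) \;=\; \frac{2}{\pi}\int_{0}^{c(t)} \frac{f(x)}{\sqrt{c^{2}(t)-x^{2}}}\,\mathrm{d}x .
\]
% Worked example for a parabola f(x) = x^2/(2R): the integral gives
% h = c^2/(4R), so the turnover points sit at c(t) = 2*sqrt(R*h(t)),
% i.e. sqrt(2) times further out than the geometric intersection sqrt(2*R*h(t)).
```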

    The Spread of Bluetongue Virus Serotype 8 in Great Britain and Its Control by Vaccination

    Bluetongue (BT) is a viral disease of ruminants transmitted by Culicoides biting midges and has the ability to spread rapidly over large distances. In the summer of 2006, BTV serotype 8 (BTV-8) emerged for the first time in northern Europe, resulting in over 2000 infected farms by the end of the year. The virus subsequently overwintered and has since spread across much of Europe, causing tens of thousands of livestock deaths. In August 2007, BTV-8 reached Great Britain (GB), threatening the large and valuable livestock industry. A voluntary vaccination scheme was launched in GB in May 2008 and, in contrast with elsewhere in Europe, there were no reported cases in GB during 2008. Here, we use carefully parameterised mathematical models to investigate the spread of BTV in GB and its control by vaccination. In the absence of vaccination, the model predicted severe outbreaks of BTV, particularly for warmer temperatures. Vaccination was predicted to reduce the severity of epidemics, with the greatest reduction achieved for high levels (95%) of vaccine uptake. However, even at this level of uptake the model predicted some spread of BTV. The sensitivity of the predictions to vaccination parameters (time to full protection in cattle, vaccine efficacy), the shape of the transmission kernel and temperature dependence in the transmission of BTV between farms was assessed. A combination of lower temperatures and high levels of vaccine uptake (>80%) in the previously affected areas is likely to have been the major contributing factor in the control achieved in England in 2008. However, low levels of vaccination against BTV-8 or the introduction of other serotypes could result in further, potentially severe outbreaks in future.
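    The authors' model is spatially explicit and temperature-dependent; as a far simpler point of reference, the snippet below uses the textbook homogeneous-mixing approximation, in which uptake p with efficacy eps scales the reproduction number to R_v = R0 * (1 - eps * p), giving a critical uptake of (1 - 1/R0) / eps. The R0 and efficacy values are illustrative assumptions, not parameters from the paper.

```python
# Textbook homogeneous-mixing approximation, not the farm-level model in the
# paper: vaccination with uptake p and efficacy eps gives R_v = R0 * (1 - eps*p).
def critical_uptake(R0: float, efficacy: float) -> float:
    """Smallest uptake for which R_v drops below 1."""
    return (1 - 1 / R0) / efficacy

for R0 in (1.5, 3.0, 5.0):   # larger R0 loosely standing in for warmer summers
    print(f"R0={R0}: critical uptake = {critical_uptake(R0, 0.95):.0%}")
```

    Even this crude calculation shows why uptake above 80% becomes necessary when the transmission potential is high, consistent with the role the abstract attributes to high uptake and cooler temperatures in 2008.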