
    Quantifying measures to limit wind driven resuspension of sediments for improvement of the ecological quality in some shallow Dutch lakes

    Although phosphorus loadings are considered the main pressure for most shallow lakes, wind-driven resuspension can cause additional problems for these aquatic ecosystems. We quantified the potential effectiveness of measures to reduce the contribution of wind-induced sediment resuspension to the overall light attenuation for three comparable shallow peat lakes with poor ecological status in the Netherlands: Loosdrecht, Nieuwkoop, and Reeuwijk (1.8–2.7 m depth, 1.6–2.5 km fetch). These measures are: (1) wave-reducing barriers, (2) water level fluctuations, (3) capping of the sediment with sand, and (4) combinations of the above. The critical shear stress of the sediments for resuspension (Vcrit), the size distribution, and the optical properties of the suspended material were quantified in the field (June 2009) and in the laboratory. Water quality monitoring data (2002–2009) showed that light attenuation by organic suspended matter is high in all lakes. Spatial modeling of the impact of these measures showed that limiting wave action can have significant effects in Lake Loosdrecht (reductions from 6% to 2% exceedance of Vcrit), whereas in Lake Nieuwkoop and Lake Reeuwijk it is less effective. The depth distribution and shape of Lake Nieuwkoop and Lake Reeuwijk limit the role of wind-driven resuspension in the total suspended matter concentration. Although the lakes are similar in general appearance (origin, size, and depth range), the measures suitable to improve their ecological status differ. This calls for care when defining the programme of measures to improve the ecological status of a specific lake based on experience from other lakes.
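
    The abstract does not include the model itself, but the chain it describes (wind forcing, fetch-limited wave growth, near-bed orbital motion, exceedance of a critical threshold) can be sketched. The Python sketch below is illustrative only: it uses textbook JONSWAP-type fetch-limited wave relations and linear wave theory, and the wind record, depth, fetch, and Vcrit value are hypothetical, not the study's data or code.

```python
# Illustrative sketch only: fetch-limited wave growth -> near-bed orbital
# velocity -> fraction of time a critical resuspension threshold is exceeded.
# Wind record, depth, fetch, and the Vcrit value are hypothetical.
import numpy as np

G = 9.81  # gravitational acceleration (m/s^2)

def fetch_limited_waves(u10, fetch):
    """JONSWAP-type fetch-limited estimates of significant wave height (m)
    and peak period (s) from wind speed u10 (m/s) and fetch (m)."""
    chi = G * fetch / u10**2                  # dimensionless fetch
    hs = 1.6e-3 * np.sqrt(chi) * u10**2 / G   # g*Hs/U^2 = 1.6e-3 * chi^0.5
    fp = 3.5 * (G / u10) * chi**(-0.33)       # peak frequency (Hz)
    return hs, 1.0 / fp

def wavenumber(T, h, iters=100):
    """Solve the linear dispersion relation omega^2 = g*k*tanh(k*h)
    with a damped fixed-point iteration."""
    omega = 2 * np.pi / T
    k = omega**2 / G                          # deep-water starting guess
    for _ in range(iters):
        k = 0.5 * (k + omega**2 / (G * np.tanh(k * h)))
    return k

def bed_orbital_velocity(hs, T, h):
    """Peak near-bed orbital velocity (m/s) from linear wave theory."""
    return np.pi * hs / (T * np.sinh(wavenumber(T, h) * h))

# Synthetic hourly wind record for one year; lake parameters are examples.
rng = np.random.default_rng(0)
wind = np.maximum(rng.weibull(2.0, 24 * 365) * 6.0, 0.1)  # m/s
depth, fetch, v_crit = 2.0, 2000.0, 0.08                  # m, m, m/s

hs, tp = fetch_limited_waves(wind, fetch)
ub = bed_orbital_velocity(hs, tp, depth)
print(f"Vcrit exceedance: {np.mean(ub > v_crit):.1%}")

# A wave-reducing barrier can be mimicked crudely by a shorter effective fetch.
hs2, tp2 = fetch_limited_waves(wind, fetch / 4)
ub2 = bed_orbital_velocity(hs2, tp2, depth)
print(f"Vcrit exceedance, fetch reduced fourfold: {np.mean(ub2 > v_crit):.1%}")
```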

    Quantitative data management in quality improvement collaboratives

    Background: Collaborative approaches in quality improvement have been promoted since the introduction of the Breakthrough method. The effectiveness of this method is inconclusive, and further independent evaluation of the method has been called for. For any evaluation to succeed, data collection on the interventions performed within the collaborative and on the outcomes of those interventions is crucial. Getting enough data from Quality Improvement Collaboratives (QICs) for evaluation purposes, however, has proved to be difficult. This paper provides a retrospective analysis of the process of data management in a Dutch Quality Improvement Collaborative. From this analysis, general failure and success factors are identified. Discussion: This paper discusses complications and dilemmas observed in the set-up of data management for QICs. An overview is presented of signals that were picked up by the data management team. These signals were used to improve the strategies for data management during the program and have, as far as possible, been translated into practical solutions that have been successfully implemented. The recommendations coming from this study are the following. First, from our experience it is clear that quality improvement programs deviate from experimental research in many ways; it is not only impossible but also undesirable to control processes and standardize data streams, so QICs need to steer clear of data protocols that do not allow for change. At a minimum, quantitative results should therefore be accompanied by qualitative results that can be used to interpret them correctly. Second, monitoring and data acquisition interfere with routine, which makes a database collecting data in a QIC an intervention in itself; it is very important to be aware of this when reporting the results. Using existing databases where possible can overcome some of these problems, but this is often not feasible given the change objective of QICs. Third, introducing a standardized spreadsheet to the teams is a very practical and helpful tool for collecting standardized data within a QIC; it is vital that the spreadsheets are handed out before baseline measurements start.

    A flexible framework for sparse simultaneous component based data integration

    Background: High-throughput data are complex, and methods that reveal the structure underlying the data are most useful. Principal component analysis, frequently implemented as a singular value decomposition, is a popular technique in this respect. Nowadays the challenge is often to reveal the structure in several sources of information (e.g., transcriptomics, proteomics) that are available for the same biological entities under study. Simultaneous component methods are most promising in this respect. However, the interpretation of the principal and simultaneous components is often daunting because the contributions of each of the biomolecules (transcripts, proteins) have to be taken into account. Results: We propose a sparse simultaneous component method that makes many of the parameters redundant by shrinking them to zero. It includes principal component analysis, sparse principal component analysis, and ordinary simultaneous component analysis as special cases. Several penalties can be tuned that account in different ways for the block structure present in the integrated data. This yields known sparse approaches such as the lasso, the ridge penalty, the elastic net, the group lasso, the sparse group lasso, and the elitist lasso. In addition, the algorithmic results can easily be transposed to the context of regression. Metabolomics data obtained with two measurement platforms for the same set of Escherichia coli samples are used to illustrate the proposed methodology and the properties of the different penalties with respect to sparseness across and within data blocks. Conclusion: Sparse simultaneous component analysis is a useful method for data integration: first, simultaneous analyses of multiple blocks offer advantages over sequential and separate analyses and, second, interpretation of the results is highly facilitated by their sparseness. The approach offered is flexible and allows the block structure to be taken into account in different ways. As such, structures can be found that are exclusively tied to one data platform (group lasso approach) as well as structures that involve all data platforms (elitist lasso approach). Availability: The additional file contains a MATLAB implementation of the sparse simultaneous component method.
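
    As a rough illustration of the idea (not the authors' MATLAB implementation, which is in the paper's additional file), the Python sketch below extracts one sparse simultaneous component from two concatenated data blocks using a soft-thresholded alternating scheme with a plain lasso penalty; the toy data and penalty value are made up.

```python
# Toy sketch of a rank-one sparse simultaneous component: two data blocks
# measured on the same samples are concatenated column-wise and one component
# is extracted with a soft-thresholded (lasso) alternating update.
import numpy as np

def soft_threshold(x, lam):
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_sca(blocks, lam, n_iter=200):
    """Return scores t and sparse loadings p for one simultaneous component."""
    X = np.hstack(blocks)
    p = np.linalg.svd(X, full_matrices=False)[2][0]  # dense starting loadings
    for _ in range(n_iter):
        t = X @ p
        t /= np.linalg.norm(t)                # unit-norm score update
        p_new = soft_threshold(X.T @ t, lam)  # lasso step on the loadings
        if not p_new.any():
            break                             # lam too large: all zeroed out
        p = p_new / np.linalg.norm(p_new)
    return X @ p, p

# Toy data: 20 samples, two platforms with 30 and 50 variables sharing
# one underlying component.
rng = np.random.default_rng(1)
scores = rng.normal(size=(20, 1))
b1 = scores @ rng.normal(size=(1, 30)) + 0.5 * rng.normal(size=(20, 30))
b2 = scores @ rng.normal(size=(1, 50)) + 0.5 * rng.normal(size=(20, 50))
blocks = [b - b.mean(axis=0) for b in (b1, b2)]   # column-centre each block

t, p = sparse_sca(blocks, lam=2.0)
print("nonzero loadings per block:",
      np.count_nonzero(p[:30]), "and", np.count_nonzero(p[30:]))
```

    A group lasso or elitist lasso variant, as discussed in the abstract, would replace the plain soft-thresholding step with a penalty that acts on the loadings block-wise rather than element-wise.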

    Renal artery stenosis-when to screen, what to stent?

    Renal artery stenosis (RAS) continues to be a problem for clinicians, with no clear consensus on how to investigate and assess the clinical significance of stenotic lesions and how to manage the findings. RAS caused by fibromuscular dysplasia is probably more common than previously appreciated, should be actively looked for in younger hypertensive patients, and can be managed successfully with angioplasty. Atheromatous RAS is associated with an increased incidence of cardiovascular events and increased cardiovascular mortality, and is likely to be seen with increasing frequency. Evidence from large clinical trials has led clinicians away from recommending interventional revascularisation and towards aggressive medical management. There is now interest in looking more closely at patient selection for intervention, with a focus on intervening only in patients with the highest-risk presentations, such as flash pulmonary oedema, rapidly declining renal function, and severe resistant hypertension. The potential benefits in terms of improving hard cardiovascular outcomes may outweigh the risks of intervention in this group, and further research is needed.

    A nationwide study on reproductive function, ovarian reserve, and risk of premature menopause in female survivors of childhood cancer: design and methodological challenges

    Background: Advances in childhood cancer treatment over the past decades have significantly improved survival, resulting in a rapidly growing group of survivors. However, both chemo- and radiotherapy may adversely affect reproductive function. This paper describes the design and the methodological challenges encountered in a nationwide study in the Netherlands investigating the effects of treatment on reproductive function, ovarian reserve, premature menopause, and pregnancy outcomes in female childhood cancer survivors (CCS): the DCOG LATER-VEVO study. Methods: The study is a retrospective cohort study consisting of two parts: a questionnaire assessing medical, menstrual, and obstetric history, and a clinical assessment evaluating ovarian and uterine function by hormonal analyses and transvaginal ultrasound measurements. The eligible study population consists of adult female 5-year survivors of childhood cancer treated in the Netherlands, whereas the control group consists of age-matched sisters of the participating CCS. To date, study invitations have been sent to 1611 CCS and 429 sister controls, of whom 1215 (75%) and 333 (78%) have responded so far. Of these responders, the majority consented to participate in both parts of the study (53% vs. 65% for CCS and sister controls, respectively). Several challenges were encountered involving the study population: dealing with bias due to the differences in characteristics of several types of (non-)participants, and finding an adequately sized and well-matched control group. Moreover, the challenges related to the data collection process included differences in response rates between web-based and paper-based questionnaires, the validity of self-reported outcomes, the interpretation of clinical measurements in women using hormonal contraceptives, and inter- and intra-observer variation in the ultrasound measurements. Discussion: The DCOG LATER-VEVO study will provide valuable information about the reproductive potential of paediatric cancer patients as well as long-term survivors of childhood cancer. Other investigators planning to conduct large cohort studies on late effects may encounter challenges similar to those encountered during this study. The solutions to these challenges described in this paper may be useful to those investigators. Trial registration: NTR2922; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=2922

    Suppression of HBV by Tenofovir in HBV/HIV coinfected patients : a systematic review and meta-analysis

    Background: Hepatitis B coinfection is common in HIV-positive individuals, and as antiretroviral therapy has made death due to AIDS less common, hepatitis has become increasingly important. Several drugs are available to treat hepatitis B. The most potent, and the one with the lowest risk of resistance, appears to be tenofovir (TDF). However, several questions remain unanswered regarding the use of TDF, including the proportion of patients that achieves suppression of HBV viral load and over what time, whether suppression is durable, and whether prior treatment with other HBV-active drugs such as lamivudine compromises the efficacy of TDF due to possible selection of resistant HBV strains. Methods: A systematic review and meta-analysis following PRISMA guidelines and using multilevel mixed-effects logistic regression, stratified by prior and/or concomitant use of lamivudine (3TC) and/or emtricitabine (FTC). Results: Data were available from 23 studies including 550 HBV/HIV-coinfected patients treated with TDF. Follow-up was up to seven years, but to ensure sufficient power the data analyses were limited to three years. The overall proportion achieving suppression of HBV replication was 57.4%, 79.0%, and 85.6% at one, two, and three years, respectively. No effect of prior or concomitant 3TC/FTC was shown. Virological rebound on TDF treatment was rare. Interpretation: TDF suppresses HBV to undetectable levels in the majority of HBV/HIV-coinfected patients, with the proportion fully suppressed continuing to increase during continued treatment. Prior treatment with 3TC/FTC does not compromise the efficacy of TDF, and combination treatment with 3TC/FTC offers no significant benefit over TDF alone.
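
    The paper pools proportions with multilevel mixed-effects logistic regression; as a simpler illustration of the same idea, the sketch below performs a DerSimonian-Laird random-effects meta-analysis of logit-transformed study-level proportions in Python. The study counts are invented for the example, not taken from the review.

```python
# Hedged sketch: random-effects pooling of study-level proportions on the
# logit scale (DerSimonian-Laird). The paper used multilevel mixed-effects
# logistic regression; this simpler estimator illustrates the same idea.
# The counts below are made up, not the review's data.
import numpy as np

events = np.array([12, 30, 25, 40, 18])   # patients with suppressed HBV DNA
totals = np.array([20, 50, 45, 70, 30])   # patients on TDF per study

# Logit-transformed proportions and within-study variances.
p = events / totals
y = np.log(p / (1 - p))
v = 1 / events + 1 / (totals - events)

# DerSimonian-Laird estimate of the between-study variance tau^2.
w = 1 / v
y_fe = np.sum(w * y) / np.sum(w)              # fixed-effect pooled logit
q = np.sum(w * (y - y_fe) ** 2)               # Cochran's Q
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

# Random-effects pooled estimate, back-transformed to a proportion.
w_re = 1 / (v + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
lo, hi = y_re - 1.96 * se, y_re + 1.96 * se
expit = lambda x: 1 / (1 + np.exp(-x))
print(f"pooled proportion suppressed: {expit(y_re):.1%} "
      f"(95% CI {expit(lo):.1%} to {expit(hi):.1%})")
```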

    The search for stable prognostic models in multiple imputed data sets

    Background: In prognostic studies, model instability and missing data can be troubling factors. Proposed methods for handling these situations are bootstrapping (B) and multiple imputation (MI). The authors examined the influence of these methods on model composition. Methods: Models were constructed using a cohort of 587 patients consulting between January 2001 and January 2003 with a shoulder problem in general practice in the Netherlands (the Dutch Shoulder Study). Outcome measures were persistent shoulder disability and persistent shoulder pain. Potential predictors included socio-demographic variables, characteristics of the pain problem, physical activity, and psychosocial factors. Model composition and performance (calibration and discrimination) were assessed for models using complete case analysis, MI, bootstrapping, or both MI and bootstrapping. Results: The results showed that model composition varied between models as a result of how missing data were handled, and that bootstrapping provided additional information on the stability of the selected prognostic model. Conclusion: In prognostic modeling, missing data need to be handled by MI, and bootstrap model selection is advised in order to provide information on model stability.
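
    A minimal sketch of the combined MI-plus-bootstrap idea is given below, assuming scikit-learn is available: several imputed data sets are generated, a penalised logistic model is refitted on bootstrap resamples of each, and predictor inclusion frequencies summarize model stability. L1 (lasso) selection stands in for the paper's own selection procedure, and the data are synthetic.

```python
# Illustrative sketch of assessing model stability under multiple imputation
# (MI) plus bootstrapping: impute several completed data sets, refit an
# L1-penalised logistic model on bootstrap resamples of each, and tabulate
# how often each predictor is selected. Lasso selection here is a stand-in
# for the paper's own model selection procedure; all data are simulated.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample

rng = np.random.default_rng(0)
n, n_pred = 300, 6
X = rng.normal(size=(n, n_pred))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)
X[rng.random(X.shape) < 0.15] = np.nan      # 15% values missing at random

n_imputations, n_boot = 5, 50
selected = np.zeros(n_pred)
for m in range(n_imputations):
    # Different seeds with posterior sampling give distinct imputations.
    X_imp = IterativeImputer(sample_posterior=True,
                             random_state=m).fit_transform(X)
    for b in range(n_boot):
        Xb, yb = resample(X_imp, y, random_state=m * n_boot + b)
        fit = LogisticRegression(penalty="l1", solver="liblinear",
                                 C=0.5).fit(Xb, yb)
        selected += (fit.coef_[0] != 0)

freq = selected / (n_imputations * n_boot)
for j, f in enumerate(freq):
    print(f"predictor {j}: selected in {f:.0%} of MI x bootstrap fits")
```

    Predictors selected in nearly all fits indicate a stable model; predictors whose inclusion flips between resamples are exactly the instability the paper warns about.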

    A prediction rule for shoulder pain related sick leave: a prospective cohort study

    BACKGROUND: Shoulder pain is common in primary care and has an unfavourable outcome in many patients. Information about predictors of shoulder pain related sick leave in workers is scarce and inconsistent. The objective was to develop a clinical prediction rule for calculating the risk of shoulder pain related sick leave for individual workers during the 6 months following first consultation in general practice. METHODS: A prospective cohort study with 6 months of follow-up was conducted among 350 workers with a new episode of shoulder pain. Potential predictors included the results of a physical examination, sociodemographic variables, disease characteristics (duration of symptoms, sick leave in the 2 months prior to consultation, pain intensity, disability, comorbidity), physical activity, physical work load, psychological factors, and the psychosocial work environment. The main outcome measure was sick leave during the 6 months following first consultation in general practice. RESULTS: The response rate to the follow-up questionnaire at 6 months was 85%. During the 6 months after first consultation, 30% (89/298) of the workers reported sick leave, and 16% (47) reported 10 or more days of sick leave. Sick leave during this period was predicted in a multivariable model by a longer duration of sick leave prior to consultation, more shoulder pain, a perceived cause of strain or overuse during regular activities, and co-existing psychological complaints. The discriminative ability of the prediction model was satisfactory, with an area under the curve of 0.70 (95% CI 0.64–0.76). CONCLUSION: Although 30% of all workers with shoulder pain reported sick leave during follow-up, the duration of sick leave was limited to a few days in most workers. We developed a prediction rule and a score chart that can be used by general practitioners and occupational health care providers to calculate the absolute risk of sick leave in individual workers with shoulder pain, which may help to identify workers who need additional attention. The performance and applicability of our model need to be tested in other working populations with shoulder pain to enable valid and reliable use of the score chart in everyday practice.
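
    As an illustration of how such a rule and score chart can be constructed (a standard rounded-coefficient construction, not necessarily the authors' exact method), the Python sketch below fits a logistic model on simulated data coded after the four predictors named above, reports the AUC, and converts the coefficients into integer points.

```python
# Hypothetical sketch of turning a multivariable logistic model into a score
# chart by scaling and rounding coefficients, mirroring the four predictors
# the abstract names. Data and coefficients are simulated, not the study's.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 350
X = np.column_stack([
    rng.integers(0, 30, n),   # days of sick leave prior to consultation
    rng.integers(0, 11, n),   # shoulder pain intensity (0-10)
    rng.integers(0, 2, n),    # perceived cause: strain/overuse (0/1)
    rng.integers(0, 2, n),    # co-existing psychological complaints (0/1)
])
# Simulated outcome with assumed effect sizes (purely illustrative).
logit = -2.5 + 0.08 * X[:, 0] + 0.15 * X[:, 1] + 0.6 * X[:, 2] + 0.7 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
print("in-sample AUC:",
      round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 2))

# Score chart: scale coefficients so the smallest is ~1 point, then round.
coefs = model.coef_[0]
points = np.rint(coefs / np.abs(coefs).min()).astype(int)
names = ["prior sick leave (per day)", "pain intensity (per unit)",
         "strain/overuse cause", "psychological complaints"]
for name, pt in zip(names, points):
    print(f"{name}: {pt} point(s)")
```

    In practice the point totals would be mapped to absolute risks via the model intercept, and, as the abstract notes, the chart would need external validation before everyday use.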