
    Early intravenous unfractionated heparin and outcome in acute lung injury and acute respiratory distress syndrome: a retrospective propensity matched cohort study.

    BACKGROUND: Acute lung injury (ALI) is characterized by a pro-coagulant state. Heparin is an anticoagulant with anti-inflammatory properties. Unfractionated heparin has been found to be protective in experimental models of ALI. We hypothesized that an intravenous therapeutic dose of unfractionated heparin would favorably influence the outcome of critically ill patients diagnosed with ALI. METHODS: Patients admitted to the Intensive Care Unit (ICU) of a tertiary referral center in the Netherlands between November 2004 and October 2007 were screened. Patients who developed ALI (consensus definition) were included. In this cohort, the impact of heparin use on mortality was assessed by logistic regression analysis in a propensity-matched case-control design. RESULTS: Of 5,561 admitted patients, 2,138 had a length of stay > 48 hours; of these, 723 (34%) were diagnosed with ALI, of whom 164 received intravenous heparin. In a propensity score-adjusted logistic regression analysis, heparin use did not influence 28-day mortality (odds ratio 1.23 [95% confidence interval 0.80-1.89]), nor did it affect ICU length of stay. CONCLUSIONS: Administration of therapeutic doses of intravenous unfractionated heparin was not associated with reduced mortality in critically ill patients diagnosed with ALI. Heparin treatment did not increase transfusion requirements. These results may help in the design of prospective trials evaluating the use of heparin as adjunctive treatment for ALI.
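The propensity-matched design described in this abstract can be sketched in a few lines: fit a logistic model of treatment assignment on baseline covariates, then greedily pair each treated patient with the nearest-scoring control. The covariates, cohort, and parameters below are illustrative assumptions, not data from the study.

```python
import math
import random

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic regression (intercept + coefficients) by plain gradient descent."""
    n, p = len(X), len(X[0])
    w = [0.0] * (p + 1)  # w[0] is the intercept
    for _ in range(epochs):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = 1.0 / (1.0 + math.exp(-z)) - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def propensity(w, xi):
    """Predicted probability of receiving treatment given covariates xi."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

def match_nearest(treated, controls):
    """Greedy 1:1 nearest-neighbour matching on the propensity score, without replacement."""
    pairs, available = [], list(controls)
    for t_idx, t_score in treated:
        best = min(available, key=lambda c: abs(c[1] - t_score))
        available.remove(best)
        pairs.append((t_idx, best[0]))
    return pairs

# Synthetic cohort: covariates (scaled age, severity); sicker patients get treated more often.
random.seed(42)
X, treat = [], []
for _ in range(60):
    age, sev = random.uniform(0.3, 0.9), random.uniform(0.0, 1.0)
    X.append([age, sev])
    treat.append(1 if random.random() < 0.2 + 0.5 * sev else 0)

w = fit_logistic(X, treat)
scores = [propensity(w, xi) for xi in X]
treated = [(i, s) for i, (s, t) in enumerate(zip(scores, treat)) if t == 1]
controls = [(i, s) for i, (s, t) in enumerate(zip(scores, treat)) if t == 0]
k = min(len(treated), len(controls))
pairs = match_nearest(treated[:k], controls)
```

After matching, the outcome comparison (here, 28-day mortality) is run only within the matched pairs, which is what removes the confounding that severity introduces into treatment assignment.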

    Study protocol of a randomized controlled trial comparing Mindfulness-Based Stress Reduction with treatment as usual in reducing psychological distress in patients with lung cancer and their partners: the MILON study

    BACKGROUND: Lung cancer is the leading cause of cancer death worldwide and is characterized by a poor prognosis. It has a major impact on the psychological wellbeing of patients and their partners. Recently, it has been shown that Mindfulness-Based Stress Reduction (MBSR) is effective in reducing anxiety and depressive symptoms in cancer patients. The generalizability of these results is limited, since most participants were female patients with breast cancer. Moreover, only one study has examined the effectiveness of MBSR in partners of cancer patients. Therefore, in the present trial we study the effectiveness of MBSR versus treatment as usual (TAU) in patients with lung cancer and their partners. METHODS/DESIGN: A parallel-group, randomized controlled trial is conducted to compare MBSR with TAU. Lung cancer patients who have received or are still under treatment, and their partners, are recruited. Assessments will take place at baseline, post-intervention and at three-month follow-up. The primary outcome is psychological distress (i.e. anxiety and depressive symptoms). Secondary outcomes are quality of life (patients only), caregiver appraisal (partners only), relationship quality and spirituality. In addition, the cost-effectiveness ratio (patients only) and several process variables are assessed. DISCUSSION: This trial will provide information about the clinical effectiveness and cost-effectiveness of MBSR compared to TAU in patients with lung cancer and their partners. TRIAL REGISTRATION: ClinicalTrials.gov NCT01494883.

    Impacts of savanna trees on forage quality for a large African herbivore

    Recently, the cover of large trees in African savannas has declined rapidly due to elephant pressure, frequent fires and charcoal production. The reduction in large trees could have consequences for large herbivores through a change in forage quality. In Tarangire National Park in Northern Tanzania, we studied the impact of large savanna trees on forage quality for wildebeest by collecting samples of dominant grass species in open grassland and under and around large Acacia tortilis trees. Grasses growing under trees had a much higher forage quality than grasses from the open field, as indicated by a more favourable leaf/stem ratio, higher protein concentrations and lower fibre concentrations. Analysing the grass leaf data with a linear programming model indicated that large savanna trees could be essential for the survival of wildebeest, the dominant herbivore in Tarangire. Due to the high fibre content and low nutrient and protein concentrations of grasses from the open field, maximum fibre intake is reached before nutrient requirements are satisfied. All requirements can only be satisfied by combining forage from open grassland with forage from either under or around tree canopies. Forage quality was also higher around dead trees than in the open field, so forage quality does not decline immediately after trees die, which explains why negative effects of reduced tree numbers probably go unnoticed initially. In conclusion, our results suggest that continued destruction of large trees could affect future numbers of large herbivores in African savannas, and better protection of large trees is probably necessary to sustain high animal densities in these ecosystems.
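The linear-programming logic in the abstract (the fibre ceiling is hit before the protein requirement is met on open-field grass alone, so only a mixed diet is feasible) can be illustrated with a toy feasibility check. All numbers below are assumed for illustration and are not taken from the study.

```python
# Illustrative forage compositions: (protein fraction, fibre fraction) per kg dry matter.
# These values and the daily requirements are assumptions, not study data.
open_field = (0.06, 0.70)
under_tree = (0.15, 0.50)
FIBRE_MAX = 3.5      # kg fibre the gut can process per day (assumed ceiling)
PROTEIN_REQ = 0.45   # kg protein required per day (assumed requirement)

def feasible(mix_under_tree):
    """Can some daily intake of this mixture meet the protein requirement
    without exceeding the fibre ceiling? mix_under_tree is the fraction of
    the diet taken from under/around tree canopies."""
    p = mix_under_tree * under_tree[0] + (1 - mix_under_tree) * open_field[0]
    f = mix_under_tree * under_tree[1] + (1 - mix_under_tree) * open_field[1]
    intake_needed = PROTEIN_REQ / p          # kg dry matter needed for protein
    return intake_needed * f <= FIBRE_MAX    # fibre constraint at that intake

# Open-field grass alone fails; adding tree-canopy grass makes the diet feasible.
print(feasible(0.0), feasible(0.5))
```

A full diet model would optimise the mix over several grass species and nutrients simultaneously, but the feasibility structure (a requirement floor colliding with an intake ceiling) is the same.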

    Frequent burning promotes invasions of alien plants into a mesic African savanna

    Fire is both inevitable and necessary for maintaining the structure and functioning of mesic savannas. Without disturbances such as fire and herbivory, tree cover can increase at the expense of grass cover and over time dominate mesic savannas. Consequently, repeated burning is widely used to suppress tree recruitment and control bush encroachment. However, the effect of regular burning on invasion by alien plant species is little understood. Here, vegetation data from a long-term fire experiment, which began in 1953 in a mesic Zimbabwean savanna, were used to test whether the frequency of burning promoted alien plant invasion. The fire treatments consisted of late season fires, lit at 1-, 2-, 3-, and 4-year intervals, and these regularly burnt plots were compared with unburnt plots. Results show that over half a century of frequent burning promoted the invasion by alien plants relative to areas where fire was excluded. More alien plant species became established in plots that had a higher frequency of burning. The proportion of alien species in the species assemblage was highest in the annually burnt plots followed by plots burnt biennially. Alien plant invasion was lowest in plots protected from fire but did not differ significantly between plots burnt triennially and quadrennially. Further, the abundance of five alien forbs increased significantly as the interval (in years) between fires became shorter. On average, the density of these alien forbs in annually burnt plots was at least ten times as high as the density of unburnt plots. Plant diversity was also altered by long-term burning. Total plant species richness was significantly lower in the unburnt plots compared to regularly burnt plots. These findings suggest that frequent burning of mesic savannas enhances invasion by alien plants, with short intervals between fires favouring alien forbs. 
Therefore, reducing the frequency of burning may be a key to minimising the risk of alien plant spread into mesic savannas, which is important because invasive plants pose a threat to native biodiversity and may alter savanna functioning.

    ATN classification and clinical progression in subjective cognitive decline

    Objective: To investigate the relationship between the ATN classification system (amyloid, tau, neurodegeneration) and risk of dementia and cognitive decline in individuals with subjective cognitive decline (SCD). / Methods: We classified 693 participants with SCD (60 ± 9 years, 41% women, Mini-Mental State Examination score 28 ± 2) from the Amsterdam Dementia Cohort and Subjective Cognitive Impairment Cohort (SCIENCe) project according to the ATN model, as determined by amyloid PET or CSF β-amyloid (A), CSF p-tau (T), and MRI-based medial temporal lobe atrophy (N). All underwent extensive neuropsychological assessment. For 342 participants, follow-up was available (3 ± 2 years). As a control population, we included 124 participants without SCD. / Results: Fifty-six percent (n = 385) of participants had normal Alzheimer disease (AD) biomarkers (A–T–N–), 27% (n = 186) had non-AD pathologic change (A–T–N+, A–T+N–, A–T+N+), and 18% (n = 122) fell within the Alzheimer continuum (A+T–N–, A+T–N+, A+T+N–, A+T+N+). ATN profiles were unevenly distributed, with A–T+N+, A+T–N+, and A+T+N+ containing very few participants. Cox regression showed that, compared to A–T–N–, participants with A+ profiles had a higher risk of dementia, with a dose–response pattern for the number of biomarkers affected. Linear mixed models showed that participants with A+ profiles had a steeper decline on tests addressing memory, attention, language, and executive functions. In the control group, there was no association between ATN and cognition. / Conclusions: Among individuals presenting with SCD at a memory clinic, those with biomarker profiles A–T+N+, A+T–N–, A+T+N–, and A+T+N+ were at increased risk of dementia and showed steeper cognitive decline compared to A–T–N– individuals. These results suggest a future where biomarker results could be used for individualized risk profiling in cognitively normal individuals presenting at a memory clinic.

    Perceived need for mental health care and barriers to care in the Netherlands and Australia

    This study of Australian and Dutch people with anxiety or depressive disorders aims to examine perceived needs and barriers to care, and to identify possible similarities and differences. Data from the Australian National Survey of Mental Health and Well-Being and the Netherlands Study of Depression and Anxiety were combined into one data set. The Perceived Need for Care Questionnaire was administered in both studies. Logistic regression analyses were performed to examine whether similarities or differences between Australia and the Netherlands could be observed. In both countries, a large proportion had unfulfilled needs, and self-reliance was the most frequently named barrier to receiving care. People from the Australian sample (N = 372) were more likely to perceive a need for medication (OR 1.8; 95% CI 1.3-2.5), counselling (OR 1.4; 95% CI 1.0-2.0) and practical support (OR 1.8; 95% CI 1.2-2.7), and overall needs in Australia were more often fully met compared with those of the Dutch sample (N = 610). Australians were more often pessimistic about the helpfulness of medication (OR 3.8; 95% CI 1.4-10.7) and skills training (OR 3.0; 95% CI 1.1-8.2), and more often reported financial barriers as a reason for not having received (enough) information (OR 2.4; 95% CI 1.1-5.5) or counselling (OR 5.9; 95% CI 2.9-11.9). In both countries, the vast majority of mental health care needs are not fulfilled. Solutions could be found in improving professionals' skills or better collaboration. Possible explanations for the observed differences in perceived need and barriers to care are discussed; these illustrate the value of examining perceived need across nations and suggest substantial commonalities of experience across the two countries.

    Increased searching and handling effort in tall swards lead to a Type IV functional response in small grazing herbivores

    Understanding the functional response of a species is important in comprehending the species’ population dynamics and the functioning of multi-species assemblages. A Type II functional response, where instantaneous intake rate increases asymptotically with sward biomass, is thought to be common in grazers. However, in tall, dense swards, food intake might decline due to mechanical limitations or because animals selectively forage on the most nutritious parts of a sward, leading to a Type IV functional response, especially for smaller herbivores. We tested the predictions that bite mass, cropping time, swallowing time and searching time increase, and bite rate decreases, with increasing grass biomass for different-sized Canada geese (Branta canadensis) foraging on grass swards. Bite mass indeed showed an increasing asymptotic relationship with grass biomass. At high biomass, difficulties in handling long leaves and in locating bites were responsible for increasing cropping, swallowing, and searching times. Constant bite mass and decreasing bite rate caused the intake rate to decrease at high sward biomass after reaching an optimum, leading to a Type IV functional response. Grazer body mass affected maximum bite mass and intake rate, but did not change the shape of the functional response. As grass nutrient contents are usually highest in short swards, this Type IV functional response in geese leads to an intake rate that is maximised in these swards. The lower grass biomass at which intake rate was maximised allows resource partitioning between different-sized grazers. We argue that this Type IV functional response is of more importance than previously thought.
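The contrast between a Type II and a Type IV functional response can be sketched with the standard Holling Type II form and a commonly used dome-shaped variant that adds a quadratic term to the denominator. The parameter values here are arbitrary illustrations, not estimates for geese.

```python
def type_ii(B, a=1.0, h=0.5):
    """Holling Type II: intake rate rises asymptotically with sward biomass B
    (a = attack/encounter rate, h = handling time per unit food)."""
    return a * B / (1 + a * h * B)

def type_iv(B, a=1.0, h=0.5, c=0.08):
    """Dome-shaped Type IV: the quadratic term (c * B^2) makes intake decline
    again at high biomass, mimicking extra handling/searching effort in tall swards."""
    return a * B / (1 + a * h * B + c * B * B)

biomass = [0.5 * k for k in range(1, 41)]  # sward biomass grid: 0.5 .. 20.0
ii = [type_ii(B) for B in biomass]
iv = [type_iv(B) for B in biomass]

# Type II is monotone increasing; Type IV peaks at B = 1/sqrt(c) and then declines.
peak_B = biomass[iv.index(max(iv))]
```

With these parameters the Type IV curve peaks at an intermediate biomass (1/sqrt(0.08) ≈ 3.5 on this grid), which is the formal counterpart of the abstract's claim that geese maximise intake on relatively short swards.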

    Estimating radiation effective doses from whole body computed tomography scans based on U.S. soldier patient height and weight

    BACKGROUND: The purpose of this study is to explore how a patient's height and weight can be used to predict the effective dose to a reference phantom of similar height and weight from a chest-abdomen-pelvis computed tomography scan when machine-based parameters are unknown. Since machine-based scanning parameters can be misplaced or lost, a predictive model will enable the medical professional to quantify a patient's cumulative radiation dose. METHODS: One hundred mathematical phantoms of varying heights and weights were defined within an x-ray Monte Carlo-based software code in order to calculate organ absorbed doses and effective doses from a chest-abdomen-pelvis scan. Regression analysis was used to develop an effective dose predictive model. The regression model was experimentally verified using anthropomorphic phantoms and validated against a real patient population. RESULTS: Estimates of the effective doses as calculated by the predictive model were within 10% of the estimates of the effective doses using experimentally measured absorbed doses within the anthropomorphic phantoms. Comparisons of the patient population effective doses show that the predictive model is within 33% of current methods of estimating effective dose using machine-based parameters. CONCLUSIONS: A patient's height and weight can be used to estimate the effective dose from a chest-abdomen-pelvis computed tomography scan. The presented predictive model can be used interchangeably with current effective dose estimating techniques that rely on computed tomography machine-based techniques.
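A minimal version of the height-and-weight regression described above can be sketched as ordinary least squares with an intercept. The training points and the linear functional form below are hypothetical stand-ins for illustration; the study's actual model and coefficients are not reproduced here.

```python
def ols(X, y):
    """Ordinary least squares via the normal equations.
    X is a list of rows that already include an intercept column."""
    p = len(X[0])
    # Build X^T X and X^T y.
    A = [[sum(row[i] * row[j] for row in X) for j in range(p)] for i in range(p)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(p)]
    # Gaussian elimination with partial pivoting.
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, p))) / A[r][r]
    return beta

# Hypothetical training data: (height cm, weight kg) -> effective dose (mSv).
# These numbers are illustrative only, not from the study; heavier patients
# tend to receive a lower effective dose for the same scan settings.
patients = [(160, 55, 14.2), (170, 70, 12.8), (180, 85, 11.9),
            (165, 62, 13.5), (175, 78, 12.3), (185, 95, 11.2)]
X = [[1.0, h, w] for h, w, _ in patients]
y = [d for _, _, d in patients]
b0, b1, b2 = ols(X, y)

# Predict the effective dose for a new patient of 172 cm, 74 kg.
predicted = b0 + b1 * 172 + b2 * 74
```

In practice one would fit on the Monte Carlo phantom doses rather than hand-picked points, and validate the residuals against the anthropomorphic phantom measurements as the study did.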

    The ANKLE TRIAL (ANKLE treatment after injuries of the ankle ligaments): what is the benefit of external support devices in the functional treatment of acute ankle sprain? A randomised controlled trial

    BACKGROUND: Acute lateral ankle ligament injuries are very common problems in present health care. Still, there is no hard evidence about which treatment strategy is superior. Current evidence supports the view that a functional treatment strategy is preferable, but insufficient data are available to prove the benefit of external support devices in these types of treatment. The hypothesis of our study is that external ankle support devices will not result in better outcomes in the treatment of acute ankle sprains, compared to a purely functional treatment strategy. The overall objective is to compare the results of three different strategies of functional treatment for acute ankle sprain, in particular to determine the advantages of external support devices in addition to a functional treatment strategy based on balance and coordination exercises. METHODS/DESIGN: This study is designed as a randomised controlled multi-centre trial with one-year follow-up. Adult and healthy patients (N = 180) with acute, single-sided and first inversion trauma of the lateral ankle ligaments will be included. They will all follow the same schedule of balancing exercises and will be divided into 3 treatment groups: 1. pressure bandage and tape, 2. pressure bandage and brace, and 3. no external support. The primary outcome measure is the Karlsson scoring scale; secondary outcomes are FAOS (subscales), number of recurrent ankle injuries, Visual Analogue Scales of pain and satisfaction, and adverse events. They will be measured after one week, 6 weeks, 6 months and 1 year. DISCUSSION: The ANKLE TRIAL is a randomised controlled trial in which a purely functionally treated control group, without any external support, is investigated. Results of this study could lead to other opinions about the usefulness of external support devices in the treatment of acute ankle sprain. TRIAL REGISTRATION: Netherlands Trial Register (NTR): NTR2151 (http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=2151).

    Lower Blood Calcium Associates with Unfavorable Prognosis and Predicts for Bone Metastasis in NSCLC

    Ionized calcium is involved in various cellular signaling pathways and regulates many cellular processes, including those relevant to tumorigenesis. We hypothesized that an imbalance of calcium homeostasis is correlated with the development of lung carcinomas. We collected the clinical data of 1084 patients with non-small cell lung cancer (NSCLC) treated in Shandong Provincial Hospital, Shandong University. Logistic regression was used to determine the association between calcium levels and clinical characteristics, and Cox regression and Kaplan-Meier models were applied to analyze risk factors for overall survival. Blood electrolytes were tested before treatment; nearly 16% of patients with NSCLC presented with decreased blood calcium, which was more frequent than decreases in other electrolytes. Further, multivariate logistic regression analysis disclosed significant correlations between decreased blood calcium and moderate and poor differentiation (P = 0.012, OR = 1.926 (1.203–4.219)), squamous cell carcinoma (P = 0.024, OR = 1.968 (1.094–3.540)), and bone metastasis (P = 0.032, OR = 0.396 (0.235–0.669)). In multivariate Cox regression analysis, advanced lymph node stage and decreased blood calcium were significant and independent unfavorable prognostic factors (P < 0.001). Finally, the Kaplan-Meier survival curve revealed that decreased blood calcium was associated with shorter survival (log-rank; χ2 = 26.172, P < 0.001). Our findings indicate that lower blood calcium levels are associated with a higher risk of unfavorable prognosis and bone metastasis in NSCLC.
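The Kaplan-Meier estimator behind the survival comparison above can be written compactly: at each event time, multiply the running survival probability by the fraction of at-risk patients who survived that time. The follow-up times below are invented for illustration and are not the study's data.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up time per patient; events: 1 = death observed, 0 = censored.
    Returns a step curve as a list of (event_time, survival_probability)."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = leaving = 0
        while i < len(data) and data[i][0] == t:   # group tied times
            leaving += 1
            deaths += data[i][1]
            i += 1
        if deaths:                                  # censored-only times add no step
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= leaving                        # censored and dead both leave the risk set
    return curve

# Hypothetical follow-up (months) for a low-calcium vs a normal-calcium group.
low_ca_t,  low_ca_e  = [3, 5, 5, 8, 12, 14, 20, 24], [1, 1, 1, 1, 0, 1, 1, 0]
norm_ca_t, norm_ca_e = [6, 10, 15, 22, 30, 36, 40, 48], [1, 0, 1, 1, 0, 1, 0, 0]

low_curve = kaplan_meier(low_ca_t, low_ca_e)
norm_curve = kaplan_meier(norm_ca_t, norm_ca_e)
```

A log-rank test, as used in the abstract, would then compare observed versus expected deaths across the two risk sets at each event time to attach a P value to the gap between the curves.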