
    Variable response of three Trifolium repens ecotypes to soil flooding by seawater.

    BACKGROUND AND AIMS: Despite concerns about the impact of rising sea levels and storm surge events on coastal ecosystems, there is remarkably little information on the response of terrestrial coastal plant species to seawater inundation. The aim of this study was to elucidate responses of a glycophyte (white clover, Trifolium repens) to short-duration soil flooding by seawater and recovery following leaching of salts. METHODS: Using plants cultivated from parent ecotypes collected from a natural soil salinity gradient, the impact of short-duration seawater soil flooding (8 or 24 h) on short-term changes in leaf salt ion and organic solute concentrations was examined, together with longer-term impacts on plant growth (stolon elongation) and flowering. KEY RESULTS: There was substantial Cl− and Na+ accumulation in leaves, especially in plants subjected to 24 h of soil flooding with seawater, but no consistent variation linked to parent plant provenance. Proline and sucrose concentrations also increased in plants following seawater flooding of the soil. Plant growth and flowering were reduced by longer soil immersion times (seawater flooding followed by drainage and freshwater inputs), but plants originating from more saline soil responded less negatively than those from lower-salinity soil. CONCLUSIONS: The accumulation of proline and sucrose indicates a potential for solute accumulation as a response to the osmotic imbalance caused by salt ions, while variation in growth and flowering responses between ecotypes points to a natural adaptive capacity for tolerance of short-duration seawater soil flooding in T. repens. Consequently, it is suggested that selection for tolerant ecotypes is possible should the predicted increase in the frequency of storm surge flooding events occur.

    Assessing Risk of Future Suicidality in Emergency Department Patients

    Background. Emergency Departments (EDs) are the first line of evaluation for patients at risk and in crisis, with or without overt suicidality (ideation, attempts). Currently employed triage and assessment methods miss some of the individuals who subsequently become suicidal. The Convergent Functional Information for Suicidality (CFI-S), a 22-item checklist of risk factors that does not ask directly about suicidal ideation, has demonstrated good predictive ability for suicidality in previous studies in psychiatric outpatients, but has not been tested in the real-world setting of emergency departments. Methods. We administered the CFI-S prospectively to a convenience sample of consecutive ED patients. Patients were also asked at triage about suicidal thoughts or intentions per standard ED suicide clinical screening (SCS), and the treating ED physician was asked to complete a physician-gestalt visual analog scale (VAS) for the likelihood of future suicidality spectrum events (SSE) (ideation, preparatory acts, attempts, completed suicide). We performed structured chart review and telephone follow-up at 6 months post index visit. Results. The median time to complete the CFI-S was three minutes (1st to 3rd quartile 3–6 minutes). Of the 338 patients enrolled, 45 (13.3%) were positive on the initial SCS, and 32 (9.5%) experienced an SSE in the 6-month follow-up. Overall, across genders, the SCS had modest diagnostic discrimination for future SSEs (ROC AUC 0.63). The physician VAS performed better (AUC 0.76, CI 0.66–0.85), and the CFI-S was slightly higher still (AUC 0.81, CI 0.76–0.87). The top differentiating CFI-S items were psychiatric illness, perceived uselessness, and social isolation; family history of suicide, age, and past history of suicidal acts also ranked highly. Conclusions. Using the CFI-S, or some of its items, in busy EDs may help improve the detection of patients at high risk for future suicidality.
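
    A binary screen and a continuous checklist score can be compared on the same ROC AUC scale, which is the comparison the abstract reports. The sketch below uses wholly invented data (only the cohort size and event rate are borrowed from the abstract) and is an illustration of how such AUCs are typically computed, not the study's analysis.

        # Hedged sketch (not the study's code): comparing discrimination of a
        # yes/no triage screen against a multi-item checklist score, as in the
        # SCS vs CFI-S comparison above. All data below are synthetic.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 338                                  # cohort size from the abstract
        outcome = rng.binomial(1, 0.095, n)      # ~9.5% experience an SSE

        # Hypothetical predictors: a binary screen and a continuous 22-item score.
        binary_screen = np.clip(outcome + rng.binomial(1, 0.12, n), 0, 1)
        checklist_score = outcome * 1.5 + rng.normal(0, 1, n)

        print("screen AUC   :", roc_auc_score(outcome, binary_screen))
        print("checklist AUC:", roc_auc_score(outcome, checklist_score))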

    Three-dimensional culture of human meniscal cells: Extracellular matrix and proteoglycan production

    Background: The meniscus is a complex tissue whose cell biology has only recently begun to be explored. Published models rely upon initial culture in the presence of added growth factors. The aim of this study was to test a three-dimensional (3D) collagen sponge microenvironment (without added growth factors) for its ability to provide a microenvironment supportive of meniscal cell extracellular matrix (ECM) production, and to test the responsiveness of cells cultured in this manner to transforming growth factor-β (TGF-β). Methods: Experimental studies were approved prospectively by the authors' Human Subjects Institutional Review Board. Human meniscal cells were isolated from surgical specimens, established in monolayer culture, and seeded into a 3D scaffold, and cell morphology and ECM components were evaluated either under control conditions or with the addition of TGF-β. Outcome variables were evaluation of cultured cell morphology, quantitative measurement of total sulfated proteoglycan production, and immunohistochemical study of the ECM components chondroitin sulfate, keratan sulfate, and types I and II collagen. Results and Conclusion: Meniscal cells attached well within the 3D microenvironment and expanded with culture time. The 3D microenvironment was permissive for production of chondroitin sulfate, types I and II collagen, and, to a lesser degree, keratan sulfate. This microenvironment was also permissive for growth factor responsiveness, as indicated by a significant increase in proteoglycan production when cells were exposed to TGF-β (2.48 ± 1.00 μg/ml, mean ± S.D., vs control levels of 1.58 ± 0.79 μg/ml, p < 0.0001). Knowledge of how culture microenvironments influence meniscal cell ECM production is important; the collagen sponge culture methodology provides a useful in vitro tool for the study of meniscal cell biology.

    Modified bathroom scale and balance assessment: a comparison with clinical tests

    Frailty and detection of fall risk are major issues in preventive gerontology. A simple tool frequently used in daily life, a bathroom scale (balance quality tester: BQT), was modified to obtain information on the balance of 84 outpatients consulting at a geriatric clinic. The results computed from the BQT were compared to the values of three geriatric tests that are widely used to detect either fall risk or frailty (timed get-up-and-go: TUG; 10 m walk, measured as walking speed: WS, and walking time: WT; one-leg stand: OS). The BQT calculates four parameters that are then scored and weighted, creating an overall indicator of balance quality. Raw data, partial scores and the global score were compared with the results of the three geriatric tests. The WT values had the highest correlation with the BQT raw data (r = 0.55), while TUG (r = 0.53) and WS (r = 0.56) had the highest correlations with the BQT partial scores. ROC curves for OS cut-off values (4 and 5 s) were produced, with the best results obtained for a 5 s cut-off, both with the partial scores combined using Fisher's combination (cut-off at 85% specificity: 0.48) and with the empirical score (cut-off at 85% specificity: 8). A BQT empirical score of less than seven can detect fall risk in a community-dwelling population.
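
    Fisher's combination, mentioned above for pooling the BQT partial scores, is conventionally applied to p-values. A minimal sketch under that assumption (the conversion of partial scores to p-values, and the values themselves, are invented for illustration; the paper's exact pooling procedure is not reproduced here):

        # Hedged sketch of Fisher's method for combining k independent p-values,
        # one plausible way partial scores could be pooled into a single statistic.
        from scipy.stats import combine_pvalues

        partial_pvalues = [0.04, 0.20, 0.51, 0.08]  # hypothetical, one per BQT parameter
        stat, p_combined = combine_pvalues(partial_pvalues, method="fisher")
        print(f"chi2 = {stat:.2f}, combined p = {p_combined:.3f}")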

    The use of a Psoroptes ovis serodiagnostic test for the analysis of a natural outbreak of sheep scab

    Background: Sheep scab is a highly contagious disease of sheep caused by the ectoparasitic mite Psoroptes ovis. The disease is endemic in the UK and has a significant economic impact through its effects on performance and welfare. Diagnosis of sheep scab is achieved through observation of clinical signs, e.g. itching, pruritus and wool loss, and ultimately through the detection of mites in skin scrapings. Early stages of infestation are often difficult to diagnose, and sub-clinical animals can be a major factor in disease spread. The development of a diagnostic assay would enable farmers and veterinarians to detect disease at an early stage, reducing the risk of developing clinical disease and limiting spread. Methods: Serum samples were obtained from an outbreak of sheep scab within an experimental flock (n = 480; 3 samples each from 160 sheep), allowing the assessment by ELISA of sheep scab-specific antibody prior to infestation, mid-outbreak (combined with clinical assessment) and post-treatment. Results: Analysis of pre-infestation samples demonstrated low levels of potential false positives (3.8%). Of the 27 animals with clinical or behavioural signs of disease, 25 tested positive at the mid-outbreak sampling period; the remaining 2 sheep tested positive at the subsequent sampling period. Clinical assessment revealed the absence of clinical or behavioural signs of disease in 132 sheep, whilst analysis of mid-outbreak samples showed that 105 of these clinically negative animals were serologically positive, representing potential sub-clinical infestations. Conclusions: This study demonstrates that this ELISA test can effectively diagnose sheep scab in a natural outbreak of disease and, more importantly, highlights its ability to detect sub-clinically infested animals. This ELISA, employing a single recombinant antigen, represents a major step forward in the diagnosis of sheep scab and may prove to be critical in any future control program.
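
    The headline diagnostic figures above follow directly from the reported counts. A small sketch reproducing that arithmetic (the absolute number of pre-infestation positives is back-calculated from the reported 3.8% and is therefore an assumption):

        # Hedged sketch: diagnostic metrics recomputed from counts in the abstract.
        clinical_pos = 27          # sheep with clinical/behavioural signs mid-outbreak
        test_pos_of_clinical = 25  # of those, serologically positive mid-outbreak
        sensitivity = test_pos_of_clinical / clinical_pos
        print(f"sensitivity vs clinical signs: {sensitivity:.1%}")   # ~92.6%

        # Pre-infestation false positives: 3.8% is reported; with 160 sheep each
        # sampled once pre-infestation, that corresponds to ~6 animals (assumed).
        false_positives = 6
        print(f"false-positive rate: {false_positives / 160:.1%}")   # ~3.8%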

    Adoption incentives and environmental policy timing under asymmetric information and strategic firm behaviour

    We consider the incentives of a single firm to invest in a cleaner technology under emission quotas and emission taxation. We assume asymmetric information about the firm's cost of employing the new technology. Policy is set either before the firm invests (commitment) or after (time consistency). Contrary to conventional wisdom, we find that with commitment (time consistency), quotas give higher (lower) investment incentives than taxes. With quotas (taxes), commitment generally leads to higher (lower) welfare than time consistency. Under commitment with quadratic abatement costs and environmental damages, a modified Weitzman rule applies, and quotas usually lead to higher welfare than taxes.
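
    For context, the classic Weitzman prices-versus-quantities rule, which the modified rule above builds on, can be sketched in a few lines. The quadratic forms are the textbook specification (abatement a, marginal abatement cost slope c, marginal damage slope d, cost shock θ with variance σ²), not the paper's exact model:

        C(a) = \frac{c}{2}\,a^{2} - \theta a, \qquad
        D(e) = \frac{d}{2}\,e^{2}, \qquad
        \operatorname{Var}(\theta) = \sigma^{2}

        \mathbb{E}\!\left[W_{\text{quota}}\right] - \mathbb{E}\!\left[W_{\text{tax}}\right]
        = \frac{\sigma^{2}\,(d - c)}{2c^{2}}

    So in the textbook case quotas dominate taxes exactly when marginal damages are steeper than marginal abatement costs (d > c); the paper's contribution is how asymmetric information and strategic investment modify this comparison.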

    Texture analysis- and support vector machine-assisted diffusional kurtosis imaging may allow in vivo gliomas grading and IDH-mutation status prediction: a preliminary study

    We sought to investigate whether texture analysis of diffusional kurtosis imaging (DKI), enhanced by support vector machine (SVM) analysis, may provide biomarkers for glioma grading and detection of the IDH mutation. First-order statistics and texture feature extraction were performed in 37 patients on both conventional (FLAIR) and mean diffusional kurtosis (MDK) images, and a recursive feature elimination (RFE) methodology based on SVM was employed to select the most discriminative diagnostic biomarkers. The first-order statistics demonstrated significantly lower MDK values in the IDH-mutant tumours. This resulted in 81.1% accuracy (sensitivity = 0.96, specificity = 0.45, AUC 0.59) for IDH mutation diagnosis. There were non-significant differences in average MDK and skewness among the different tumour grades. When texture analysis and SVM were utilized, the grading accuracy achieved by DKI biomarkers was 78.1% (sensitivity 0.77, specificity 0.79, AUC 0.79); the prediction accuracy for IDH mutation reached 83.8% (sensitivity 0.96, specificity 0.55, AUC 0.87). For the IDH mutation task, DKI significantly outperformed FLAIR imaging. When using the biomarkers selected by RFE, the prediction accuracy reached 83.8% (sensitivity 0.92, specificity 0.64, AUC 0.88). These findings demonstrate the superiority of DKI enhanced by texture analysis and SVM, compared with conventional imaging, for glioma grading and prediction of IDH mutational status.
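
    SVM-based recursive feature elimination, the selection step described above, iteratively drops the features with the smallest linear-SVM weights. A minimal sketch with synthetic data (the feature matrix, feature count, and use of scikit-learn are assumptions for illustration; in the study the features would be texture statistics from FLAIR/MDK maps):

        # Hedged sketch of SVM-RFE feature selection, not the study's pipeline.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.feature_selection import RFE
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(37, 60))        # 37 patients, 60 synthetic texture features
        y = rng.integers(0, 2, size=37)      # e.g. IDH-mutant vs wild-type labels

        svm = SVC(kernel="linear")           # linear kernel exposes coef_ for RFE
        selector = RFE(svm, n_features_to_select=10, step=1)
        model = make_pipeline(StandardScaler(), selector, SVC(kernel="linear"))

        print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())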

    Reproducibility and responsiveness of the Symptom Severity Scale and the hand and finger function subscale of the Dutch arthritis impact measurement scales (Dutch-AIMS2-HFF) in primary care patients with wrist or hand problems

    BACKGROUND: To determine the clinimetric properties of two questionnaires assessing symptoms (Symptom Severity Scale) and physical functioning (hand and finger function subscale of the AIMS2) in a Dutch primary care population. METHODS: The first 84 participants in a 1-year follow-up study on the diagnosis and prognosis of hand and wrist problems completed the Symptom Severity Scale and the hand and finger function subscale of the Dutch-AIMS2 twice within 1 to 2 weeks. The data were used to assess test-retest reliability (intraclass correlation coefficient, ICC) and the smallest detectable change (SDC, based on the standard error of measurement, SEM). To assess responsiveness, changes in scores between baseline and the 3-month follow-up were related to an external criterion to estimate the minimal important change (MIC). We calculated the group size needed to detect the MIC beyond measurement error. RESULTS: The ICC for the Symptom Severity Scale was 0.68 (95% CI: 0.54–0.78). The SDC was 1.00 at individual level and 0.11 at group level, both on a 5-point scale. The MIC was 0.23, exceeding the SDC at group level. The group size required to detect the MIC beyond measurement error was 19 for the Symptom Severity Scale. The ICC for the hand and finger function subscale of the Dutch-AIMS2 was 0.62 (95% CI: 0.47–0.74). The SDC was 3.80 at individual level and 0.42 at group level, both on an 11-point scale. The MIC was 0.31, which was less than the SDC at group level. The group size required to detect the MIC beyond measurement error was 150. CONCLUSION: In our heterogeneous primary care population the Symptom Severity Scale was found to be a suitable instrument for assessing the severity of symptoms, whereas the hand and finger function subscale of the Dutch-AIMS2 was less suitable for measuring physical functioning in patients with hand and wrist problems.
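
    The SDC figures and required group sizes above are consistent with the standard formulas: SDC at individual level is 1.96·√2·SEM, the group-level SDC shrinks by √n, and the group size needed to detect the MIC beyond measurement error is (SDC_individual / MIC)². A sketch reproducing the Symptom Severity Scale numbers under the assumption that these standard formulas were the ones used:

        # Hedged sketch: clinimetrics recomputed from the abstract's reported values.
        import math

        sdc_individual = 1.00    # reported SDC, 5-point scale
        mic = 0.23               # reported minimal important change
        n = 84                   # study sample size

        sem = sdc_individual / (1.96 * math.sqrt(2))         # back out SEM, ~0.36
        sdc_group = sdc_individual / math.sqrt(n)            # ~0.11, matches abstract
        group_size = math.ceil((sdc_individual / mic) ** 2)  # ~19, matches abstract

        print(f"SEM ~ {sem:.2f}, group-level SDC ~ {sdc_group:.2f}, n ~ {group_size}")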

    Estimating Long-Term Survival of Critically Ill Patients: The PREDICT Model

    BACKGROUND: The long-term survival outcome of critically ill patients is important in assessing the effectiveness of new treatments and in making treatment decisions. We developed a prognostic model for estimating the long-term survival of critically ill patients. METHODOLOGY AND PRINCIPAL FINDINGS: This was a retrospective linked-data cohort study involving 11,930 critically ill patients who survived more than 5 days in a university teaching hospital in Western Australia. Older age, male gender, co-morbidities, severe acute illness (as measured by the Acute Physiology and Chronic Health Evaluation II predicted mortality), and more days of vasopressor or inotropic support, mechanical ventilation, and hemofiltration within the first 5 days of intensive care unit admission were associated with worse long-term survival up to 15 years after the onset of critical illness. Among these seven pre-selected predictors, age (explaining 50% of the variability of the model; hazard ratio [HR] between 80 and 60 years old = 1.95) and co-morbidity (explaining 27% of the variability; HR between Charlson co-morbidity index 5 and 0 = 2.15) were the most important determinants. A nomogram based on the pre-selected predictors is provided to allow estimation of the median survival time and also the 1-year, 3-year, 5-year, 10-year, and 15-year survival probabilities for a patient. The discrimination (adjusted c-index = 0.757, 95% confidence interval 0.745–0.769) and calibration of this prognostic model were acceptable. SIGNIFICANCE: Age, gender, co-morbidities, severity of acute illness, and the intensity and duration of intensive care therapy can be used to estimate the long-term survival of critically ill patients. Age and co-morbidity are the most important determinants of long-term prognosis in critically ill patients.
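
    Survival nomograms of this kind are typically built on a Cox proportional hazards model, which yields both the c-index and median survival estimates reported above. A minimal sketch with invented data (the column names, cohort, and use of the lifelines library are illustrative assumptions; the study's actual covariates and coefficients are not reproduced):

        # Hedged sketch: a Cox model of the kind underlying survival nomograms.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(0)
        n = 500
        df = pd.DataFrame({
            "age": rng.integers(40, 90, n),
            "charlson": rng.integers(0, 6, n),           # co-morbidity index
            "vasopressor_days": rng.integers(0, 6, n),   # within first 5 ICU days
            "years": rng.exponential(8, n),              # follow-up time
            "died": rng.binomial(1, 0.6, n),             # event indicator
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="years", event_col="died")
        print(cph.concordance_index_)                    # discrimination (c-index)
        print(cph.predict_median(df.iloc[:3]))           # median survival estimates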