
    Cardiac troponin I levels in canine pyometra

    BACKGROUND: Myocardial injury may contribute to unexpected deaths due to pyometra. Measurement of cardiac troponin I (cTnI) is currently the most sensitive and specific method for detecting myocardial damage. The aims of the present study were to evaluate the presence of myocardial damage in canine pyometra by analysis of cTnI, to explore whether myocardial injury was associated with systemic inflammatory response syndrome (SIRS), and to evaluate whether other clinical or laboratory parameters were associated with cTnI increase. METHODS: Preoperative plasma levels of cTnI were investigated in 58 female dogs with pyometra and 9 controls. The value of physical examination findings and of haematological, serum biochemical and pro-inflammatory (CRP and TNF-α) parameters as possible predictors of increased cTnI levels was also evaluated. RESULTS: Seven dogs with pyometra (12%) and one control dog (11%) had increased levels of cTnI. In the pyometra group, the levels ranged from 0.3 to 0.9 μg l⁻¹, and in the control dog the level was 0.3 μg l⁻¹. The cTnI levels did not differ significantly between the two groups. No cardiac abnormalities were evident on preoperative physical examinations. Four of the pyometra patients died within two weeks of surgery, of which two were examined post mortem. In one of these cases (later diagnosed with myocarditis and disseminated bacterial infection), the cTnI level increased from 0.9 μg l⁻¹ preoperatively to 180 μg l⁻¹ the following day, when a heart arrhythmia was also detected. The other patient had a cTnI level of 0.7 μg l⁻¹ with no detectable heart pathology post mortem. An increase in cTnI was not associated with the presence of SIRS. There was a trend towards an association between cTnI increase and increased mortality. No preoperative physical examination findings, and only a few non-specific laboratory parameters, were associated with increased cTnI levels. CONCLUSION: Increased cTnI levels were observed in 12% of the dogs with pyometra. The proportion of dogs with increased cTnI did not differ significantly between the pyometra and control groups. An increase in cTnI was not associated with the presence of SIRS. A trend towards an association between cTnI increase and mortality was observed. Preoperative physical examination findings and the laboratory parameters included were poor predictors of increased cTnI levels.
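
    The group comparison above involves small counts (7 of 58 pyometra dogs versus 1 of 9 controls with increased cTnI), for which an exact test is the usual choice. The sketch below simply re-tabulates the counts reported in the abstract; the test shown is an illustration and not necessarily the analysis the authors performed.

```python
# Hypothetical re-check of the proportion comparison using the counts
# quoted in the abstract; the original study may have used a different test.
from scipy.stats import fisher_exact

pyometra_increased, pyometra_total = 7, 58
control_increased, control_total = 1, 9

table = [
    [pyometra_increased, pyometra_total - pyometra_increased],
    [control_increased, control_total - control_increased],
]
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher's exact test: OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```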

    Thyroid function tests in patients taking thyroid medication in Germany: Results from the population-based Study of Health in Pomerania (SHIP)

    BACKGROUND: Studies from iodine-sufficient areas have shown that a high proportion of patients taking medication for thyroid diseases have thyroid stimulating hormone (TSH) levels outside the reference range. Alongside patient compliance, inadequate dose adjustment resulting in under- and over-treatment of thyroid disease is a major cause of poor therapy outcomes. Using thyroid function tests, we aimed to measure the proportion of subjects who are under- or over-treated with thyroid medication in a previously iodine-deficient area. FINDINGS: Data from 266 subjects participating in the population-based Study of Health in Pomerania (SHIP) were analysed. All subjects were taking thyroid medication. Serum TSH levels were measured using immunochemiluminescent procedures. TSH levels of < 0.27 or > 2.15 mIU/L in subjects younger than 50 years, and < 0.19 or > 2.09 mIU/L in subjects 50 years and older, were defined as decreased or elevated, according to the established reference range for the specific study area. Our analysis revealed that 56 of 190 (29.5%) subjects treated with thyroxine had TSH levels outside the reference range (10.0% elevated, 19.5% decreased). Of the 31 subjects taking antithyroid drugs, 12 (38.7%) had TSH levels outside the reference range (9.7% elevated, 29.0% decreased). These proportions were lower in the 45 subjects receiving iodine supplementation (2.2% elevated, 8.9% decreased). Among the 3,974 SHIP participants not taking thyroid medication, TSH levels outside the reference range (2.8% elevated, 5.9% decreased) were less frequent. CONCLUSION: In concordance with previous studies in iodine-sufficient areas, our results indicate that a considerable number of patients taking thyroid medication are either under- or over-treated. Improved monitoring of these patients' TSH levels against the local reference range is recommended.
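
    The age-specific cut-offs quoted above amount to a simple classification rule. The following sketch applies those limits to a few invented TSH values; the numbers in the example list are illustrative, not study data.

```python
# Flag TSH values against the age-specific reference limits quoted above
# (< 50 years: 0.27-2.15 mIU/L; 50 years and older: 0.19-2.09 mIU/L).
# The example values below are invented for illustration.
def classify_tsh(tsh_miu_per_l: float, age_years: int) -> str:
    low, high = (0.27, 2.15) if age_years < 50 else (0.19, 2.09)
    if tsh_miu_per_l < low:
        return "decreased"
    if tsh_miu_per_l > high:
        return "elevated"
    return "within reference range"

for tsh, age in [(0.15, 62), (1.40, 45), (3.20, 55)]:
    print(f"TSH {tsh} mIU/L at age {age}: {classify_tsh(tsh, age)}")
```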

    Comprehensive Analysis of 5-Aminolevulinic Acid Dehydratase (ALAD) Variants and Renal Cell Carcinoma Risk among Individuals Exposed to Lead

    BACKGROUND: Epidemiologic studies have reported associations between lead exposure and human cancers. A polymorphism in the 5-aminolevulinic acid dehydratase (ALAD) gene affects lead toxicokinetics and may modify the adverse effects of lead. METHODS: The objective of this study was to evaluate single-nucleotide polymorphisms (SNPs) tagging the ALAD region among renal cancer cases and controls to determine whether genetic variation alters the relationship between lead and renal cancer. Occupational exposure to lead and risk of cancer were examined in a case-control study of renal cell carcinoma (RCC). Variation across the ALAD gene was comprehensively assessed using a tagging SNP approach among 987 cases and 1,298 controls. Occupational lead exposure was estimated using questionnaire-based exposure assessment and expert review. Odds ratios (OR) and 95% confidence intervals (CI) were calculated using logistic regression. RESULTS: The adjusted risk associated with the ALAD variant rs8177796 (CT/TT) was increased (OR = 1.35, 95% CI = 1.05-1.73, p-value = 0.02) when compared with the major allele, regardless of lead exposure. Joint effects of lead and ALAD rs2761016 suggest an increased RCC risk for the homozygous wild-type and heterozygous genotypes ((GG) OR = 2.68, 95% CI = 1.17-6.12, p = 0.01; (GA) OR = 1.79, 95% CI = 1.06-3.04), with an interaction approaching significance (p(int) = 0.06). No significant modification in RCC risk was observed for the functional variant rs1800435 (K68N). Haplotype analysis identified a region associated with risk, supporting the tagging SNP results. CONCLUSION: Common genetic variation in ALAD may alter the risk of RCC overall and among individuals occupationally exposed to lead. Further work in larger exposed populations is warranted to determine whether ALAD modifies the RCC risk associated with lead exposure.
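
    Adjusted odds ratios of the kind reported above are typically obtained by regressing case status on a genotype indicator plus covariates. The sketch below uses simulated data and hypothetical variable names purely to show how an OR and its 95% CI fall out of a logistic regression fit; it is not the study's actual model or dataset.

```python
# Minimal sketch: adjusted odds ratio and 95% CI for a binary genotype
# indicator from a logistic regression. Data are simulated; column order
# in X is [intercept, genotype, age].
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
genotype = rng.integers(0, 2, n)              # carries variant allele (0/1), hypothetical
age = rng.normal(60, 10, n)                   # adjustment covariate
logit = -1.0 + 0.3 * genotype + 0.01 * (age - 60)
case = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([genotype, age]))
fit = sm.Logit(case, X).fit(disp=False)
or_genotype = np.exp(fit.params[1])
ci_low, ci_high = np.exp(fit.conf_int()[1])
print(f"adjusted OR for genotype = {or_genotype:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```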

    Non-Invasive Imaging of Acute Renal Allograft Rejection in Rats Using Small Animal 18F-FDG-PET

    BACKGROUND: At present, renal grafts are the most common solid organ transplants worldwide. Given the importance of renal transplantation and the limited availability of donor kidneys, detailed analysis of the factors that affect transplant survival is important. Despite the introduction of new and effective immunosuppressive drugs, acute cellular graft rejection (AR) is still a major risk for graft survival. Currently, AR can only be definitively diagnosed by renal biopsy. However, biopsies carry a risk of renal transplant injury and loss. Most importantly, they cannot be performed in patients taking anticoagulant drugs. METHODOLOGY/PRINCIPAL FINDINGS: We present a non-invasive, entirely image-based method to assess AR in an allogeneic rat renal transplantation model using small animal positron emission tomography (PET) and (18)F-fluorodeoxyglucose (FDG). Three hours after i.v. injection of 30 MBq FDG into adult uni-nephrectomized, allogeneically transplanted rats, tissue radioactivity of the renal parenchyma was assessed in vivo with a small animal PET scanner (postoperative day (POD) 1, 2, 4, and 7) and by post mortem dissection. The mean radioactivity (cps/mm³ tissue) as well as the percent injected dose (%ID) was compared between the graft and the native reference kidney. Results were confirmed by histological and autoradiographic analysis. Healthy rats, rats with acute CsA nephrotoxicity, rats with acute tubular necrosis, and syngeneically transplanted rats served as controls. FDG uptake was significantly elevated only in allogeneic grafts from POD 1 onwards when compared with the native kidney (%ID graft POD 1: 0.54 ± 0.06; POD 2: 0.58 ± 0.12; POD 4: 0.81 ± 0.06; POD 7: 0.77 ± 0.1; CTR: 0.22 ± 0.01, n = 3-28). Renal FDG uptake in vivo correlated with the results obtained by micro-autoradiography and with the degree of inflammatory infiltrates observed in histology. CONCLUSIONS/SIGNIFICANCE: We propose that graft FDG-PET imaging is a new option for the non-invasive, specific and early detection and follow-up of acute renal rejection. This method is potentially useful for improving post-transplant rejection monitoring.
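
    The percent injected dose used above is simply the activity measured in a kidney volume of interest expressed as a fraction of the administered activity. The sketch below runs that arithmetic on assumed, decay-corrected example values (not measurements from the study).

```python
# Percent injected dose (%ID) for a kidney volume of interest:
# %ID = 100 * (decay-corrected activity in the VOI) / (injected activity).
# The activity values below are assumed examples, not study data.
injected_activity_mbq = 30.0      # administered i.v. dose (MBq), as in the protocol above
graft_activity_mbq = 0.16         # assumed decay-corrected activity in the graft VOI (MBq)
native_activity_mbq = 0.07        # assumed activity in the native reference kidney VOI (MBq)

percent_id_graft = 100.0 * graft_activity_mbq / injected_activity_mbq
percent_id_native = 100.0 * native_activity_mbq / injected_activity_mbq
print(f"graft %ID = {percent_id_graft:.2f}, native kidney %ID = {percent_id_native:.2f}")
```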

    Stretch-Induced Stress Fiber Remodeling and the Activations of JNK and ERK Depend on Mechanical Strain Rate, but Not FAK

    BACKGROUND: Cells within tissues are subjected to mechanical forces caused by extracellular matrix deformation. Cells sense and dynamically respond to stretching of the matrix by reorienting their actin stress fibers and by activating intracellular signaling proteins, including focal adhesion kinase (FAK) and the mitogen-activated protein kinases (MAPKs). Theoretical analyses predict that stress fibers can relax perturbations in tension depending on the rate of matrix strain. We therefore hypothesized that stress fiber organization and MAPK activities are altered to an extent that depends on stretch frequency. PRINCIPAL FINDINGS: Bovine aortic endothelial cells and human osteosarcoma cells expressing GFP-actin were cultured on elastic membranes and subjected to various patterns of stretch. Cyclic stretching resulted in strain rate-dependent increases in stress fiber alignment, cell retraction, and the phosphorylation of the MAPKs JNK, ERK and p38. Transient step changes in strain rate caused proportional transient changes in the levels of JNK and ERK phosphorylation without affecting stress fiber organization. Disrupting stress fiber contractile function with cytochalasin D or Y27632 decreased the levels of JNK and ERK phosphorylation. Previous studies indicate that FAK is required for stretch-induced cell alignment and MAPK activation. However, cyclic uniaxial stretching induced stress fiber alignment and the phosphorylation of JNK, ERK and p38 to comparable levels in FAK-null and FAK-expressing mouse embryonic fibroblasts. CONCLUSIONS: These results indicate that cyclic stretch-induced stress fiber alignment, cell retraction, and MAPK activation occur as a consequence of perturbations in fiber strain. These findings thus shed new light on the roles of stress fiber relaxation and reorganization in the maintenance of tensional homeostasis in a dynamic mechanical environment.
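
    Since the responses described above scale with strain rate rather than with strain amplitude alone, it may help to note how the peak strain rate of a sinusoidal stretch depends on both the amplitude and the frequency of the cycle. The sketch below works that relation through for assumed amplitude/frequency combinations, which are illustrative and not the protocol used in the study.

```python
# For a sinusoidal stretch epsilon(t) = eps0 * sin(2*pi*f*t), the strain
# rate d(epsilon)/dt peaks at 2*pi*f*eps0, so doubling the frequency at a
# fixed amplitude doubles the peak strain rate. The amplitude/frequency
# pairs below are assumed for illustration.
import math

def peak_strain_rate(amplitude: float, frequency_hz: float) -> float:
    return 2.0 * math.pi * frequency_hz * amplitude

for eps0, f in [(0.10, 0.5), (0.10, 1.0), (0.05, 1.0)]:
    rate = peak_strain_rate(eps0, f)
    print(f"amplitude {eps0:.2f}, {f:.1f} Hz -> peak strain rate {rate:.2f} 1/s")
```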

    Whole blood lead levels are associated with radiographic and symptomatic knee osteoarthritis: a cross-sectional analysis in the Johnston County Osteoarthritis Project

    Introduction: Lead (Pb) is known to affect bone, and recent evidence suggests that it has effects on cartilage as well. As osteoarthritis (OA) is a highly prevalent disease affecting bone and cartilage, we undertook the present analysis to determine whether whole blood Pb levels are associated with radiographic and symptomatic OA (rOA and sxOA, respectively) of the knee. Methods: The analysis was conducted using cross-sectional data from the Johnston County Osteoarthritis Project, a rural, population-based study, including whole blood Pb levels, bilateral posteroanterior weight-bearing knee radiography and knee symptom data. rOA assessment included joint-based presence (Kellgren-Lawrence (K-L) grade 2 or higher) and severity (none, K-L grade 0 or 1; mild, K-L grade 2; moderate or severe, K-L grade 3 or 4), as well as person-based laterality (unilateral or bilateral). SxOA was deemed present (joint-based) in a knee on the basis of K-L grade 2 or higher with symptoms, with symptoms rated based on severity (0, rOA without symptoms; 1, rOA with mild symptoms; 2, rOA with moderate or severe symptoms), and in person-based analyses was either unilateral or bilateral. Generalized logit or proportional odds regression models were used to examine associations between the knee OA status variables and natural log-transformed blood Pb (ln Pb), continuously and in quartiles, controlling for age, race, sex, body mass index (BMI), smoking and alcohol drinking. Results: Those individuals with whole blood Pb data (N = 1,669) had a mean (±SD) age of 65.4 (±11.0) years and a mean BMI of 31.2 (±7.1) kg/m², included 66.6% women and 35.4% African-Americans, and had a median blood Pb level of 1.8 μg/dl (range, 0.3 to 42.0 μg/dl). In joint-based analyses, for every 1-unit increase in ln Pb, the odds of prevalent knee rOA were 20% higher (aOR, 1.20; 95% CI, 1.01 to 1.44), while the odds of more severe rOA were 26% higher (aOR, 1.26; 95% CI, 1.05 to 1.50, under proportional odds). In person-based analyses, the odds of bilateral rOA were 32% higher for each 1-unit increase in ln Pb (aOR, 1.32; 95% CI, 1.03 to 1.70). Similarly for knee sxOA, for each 1-unit increase in ln Pb, the odds of having sxOA were 16% higher, the odds of having more severe symptoms were 17% higher, and the odds of having bilateral knee symptoms were 25% higher. Similar findings were obtained with ln Pb in quartiles. Conclusions: Increases in the prevalence and severity measures for both radiographically and symptomatically confirmed knee OA (although statistically significant only for rOA) were observed with increasing levels of blood Pb, suggesting that Pb may be a potentially modifiable environmental risk factor for OA.
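
    The odds ratios above are expressed per 1-unit increase in ln Pb, i.e. per multiplication of blood Pb by e ≈ 2.72, which is not an intuitive step size. The sketch below re-expresses the reported point estimates as odds ratios per doubling of blood Pb; this is a simple algebraic rescaling of the abstract's numbers, not an additional analysis.

```python
# If log-odds = beta * ln(Pb), doubling Pb adds beta * ln(2) to the
# log-odds, so OR_per_doubling = OR_per_ln_unit ** ln(2). The ORs below
# are the point estimates quoted in the abstract (CIs are not rescaled here).
import math

reported_or_per_ln_unit = {
    "prevalent knee rOA": 1.20,
    "rOA severity (proportional odds)": 1.26,
    "bilateral rOA": 1.32,
}
for outcome, or_ln in reported_or_per_ln_unit.items():
    or_doubling = or_ln ** math.log(2)
    print(f"{outcome}: OR {or_ln:.2f} per ln-unit ≈ {or_doubling:.2f} per doubling of blood Pb")
```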

    Quantifying the effectiveness of climate change mitigation through forest plantations and carbon sequestration with an integrated land-use model

    BACKGROUND: Carbon plantations are introduced in climate change policy as an option to slow the build-up of the atmospheric carbon dioxide (CO2) concentration. Here we present a methodology to evaluate the potential effectiveness of carbon plantations. The methodology explicitly considers future long-term land-use change around the world and all relevant carbon (C) fluxes, including all natural fluxes. Both issues have generally been ignored in earlier studies. RESULTS: Two different baseline scenarios up to 2100 indicate that uncertainties in future land-use change lead to a nearly 100% difference in estimates of carbon sequestration potentials. Moreover, social, economic and institutional barriers preventing carbon plantations in natural vegetation areas decrease the physical potential by 75–80% or more. Nevertheless, carbon plantations can still contribute considerably to slowing the increase in the atmospheric CO2 concentration, but only in the long term. The most conservative set of assumptions lowers the increase of the atmospheric CO2 concentration in 2100 by 27 ppm and compensates for 5–7% of the total energy-related CO2 emissions. The net sequestration up to 2020 is limited, given the short-term increased need for agricultural land in most regions and the long period needed to compensate for emissions through the establishment of the plantations. The potential is highest in the tropics, despite projections that most of the agricultural expansion will be in these regions. Plantations at high latitudes, such as in Northern Europe and Northern Russia, should only be established if the objective of sequestering carbon is combined with other activities. CONCLUSION: Carbon sequestration in plantations can play an important role in mitigating the build-up of atmospheric CO2. The actual magnitude depends on natural and management factors, social barriers, and the time frame considered. In addition, there are a number of ancillary benefits for local communities and the environment. Carbon plantations are, however, particularly effective in the long term. Furthermore, plantations do not offer the ultimate solution towards stabilizing CO2 concentrations but should be part of a broader package of options with clear energy emission reduction measures.
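
    To put the 27 ppm figure into mass terms, atmospheric CO2 concentration changes are commonly converted using a factor of roughly 2.13 GtC per ppm. The sketch below applies that approximate, externally sourced factor to the abstract's number; it is a back-of-the-envelope illustration, not a result from the study, and it describes the carbon kept out of the atmosphere rather than the total amount sequestered.

```python
# Rough conversion of the reported 27 ppm reduction in atmospheric CO2
# into carbon and CO2 mass, using the commonly cited approximation of
# ~2.13 GtC per ppm (an assumption, not a figure from the study).
GTC_PER_PPM = 2.13            # approximate GtC per ppm of atmospheric CO2
reduction_ppm = 27.0          # concentration reduction in 2100 quoted above

carbon_gtc = reduction_ppm * GTC_PER_PPM
co2_gt = carbon_gtc * 44.0 / 12.0   # molecular-to-atomic mass ratio of CO2 to C
print(f"~{carbon_gtc:.0f} GtC (~{co2_gt:.0f} Gt CO2) less in the atmosphere by 2100")
```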

    Small molecule activators of the Trk receptors for neuroprotection

    The neurotrophin signaling network is critical to the development and survival of many neuronal populations. Especially sensitive to imbalances in the neurotrophin system, cholinergic neurons in the basal forebrain are progressively lost in Alzheimer's disease. Therapeutic use of neurotrophins to prevent this loss is hampered, however, by a number of pharmacological challenges. These include a lack of transport across the blood-brain barrier, rapid degradation in the circulation, and difficulty in production. In this review, we discuss the evidence supporting the neurotrophin system's role in preventing neurodegeneration and survey some of the pharmacological strategies being pursued to develop effective therapeutics targeting neurotrophin function.