NEAR-SURFACE SOIL NITROGEN AND VEGETATION RESPONSE TO INVASIVE EMERALD ASH BORER IN FORESTED BLACK ASH WETLANDS OF THE WESTERN UPPER PENINSULA, MICHIGAN, USA
Invasive emerald ash borer (EAB; Agrilus planipennis Fairmaire) poses an imminent threat to the structure and function of North American hardwood forests, particularly black ash (Fraxinus nigra Marshall), and alters the hydrologic and ecological services of black ash wetlands. Black ash trees regularly grow in seasonally saturated soils and are responsible for hydrologic regulation and nutrient cycling. In this study, a gradient of black ash wetlands impacted by EAB was monitored to assess vegetation changes and near-surface soil nitrogen availability. Vegetation community changes were intertwined with nitrogen cycle disturbances following EAB infestation. As black ash died and fell to the wetland floor, more total organic nitrogen was returned to the environment and promptly incorporated into the growing shrub and sapling layers. Assessing vegetation and biogeochemical changes along an EAB gradient improves our understanding of the ecological ramifications of a future landscape without black ash wetlands as they presently exist.
Differential Gene Expression in Liver, Gill, and Olfactory Rosettes of Coho Salmon (Oncorhynchus kisutch) After Acclimation to Salinity.
Most Pacific salmonids undergo smoltification and transition from freshwater to saltwater, making various adjustments in metabolism, catabolism, and osmotic and ion regulation. The molecular mechanisms underlying this transition are largely unknown. In the present study, we acclimated coho salmon (Oncorhynchus kisutch) to four different salinities and assessed gene expression through microarray analysis of gills, liver, and olfactory rosettes. Gills are involved in osmotic regulation, liver plays a role in energetics, and olfactory rosettes are involved in behavior. Across all salinity treatments, liver had the highest number of differentially expressed genes at 1616, gills had 1074, and olfactory rosettes had 924, using a 1.5-fold cutoff and a false discovery rate of 0.5. Higher responsiveness of liver to metabolic changes after salinity acclimation, providing energy for other osmoregulatory tissues such as the gills, may explain the differences in the number of differentially expressed genes. Differentially expressed genes were tissue- and salinity-dependent. No known differentially expressed genes were common to all salinity treatments and all tissues. Gene ontology term analysis revealed biological processes, molecular functions, and cellular components that were significantly affected by salinity, a majority of which were tissue-dependent. For liver, oxygen binding and transport terms were highlighted. For gills, muscle- and cytoskeleton-related terms predominated, and for olfactory rosettes, immune response-related genes were accentuated. Interaction networks were examined in combination with GO terms to determine similarities between tissues for potential osmosensors, signal transduction cascades, and transcription factors.
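The filtering criteria above (a 1.5-fold cutoff combined with false discovery rate control) can be sketched in a few lines. This is a generic illustration, not the study's actual analysis pipeline, and the gene values below are invented; the Benjamini–Hochberg procedure is one common way to control the FDR, though the abstract does not name the method used.

```python
# Illustrative sketch of selecting differentially expressed genes with a
# fold-change cutoff after Benjamini-Hochberg FDR control. Hypothetical data.
def benjamini_hochberg(pvals, fdr):
    """Return a boolean list marking p-values significant at the given FDR."""
    n = len(pvals)
    order = sorted(range(n), key=lambda i: pvals[i])
    significant = [False] * n
    threshold_rank = 0
    # Largest rank k with p_(k) <= (k/n) * FDR; all smaller ranks pass too.
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / n * fdr:
            threshold_rank = rank
    for rank, i in enumerate(order, start=1):
        if rank <= threshold_rank:
            significant[i] = True
    return significant

# Made-up per-gene fold changes and p-values, for illustration only.
fold_changes = [2.1, 1.2, 1.8, 0.5, 1.6]
pvals = [0.001, 0.40, 0.01, 0.002, 0.03]

passed_fdr = benjamini_hochberg(pvals, fdr=0.05)
# Keep genes passing the FDR filter AND at least 1.5-fold up or down.
de_genes = [i for i in range(len(pvals))
            if passed_fdr[i]
            and (fold_changes[i] >= 1.5 or fold_changes[i] <= 1 / 1.5)]
```

On this toy input, gene 1 fails both filters and the remaining four are retained.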
Pre-orchiectomy tumor marker levels should not be used for International Germ Cell Consensus Classification (IGCCCG) risk group assignment
PURPOSE
To investigate whether the use of pre-orchiectomy instead of pre-chemotherapy tumor marker (TM) levels has an impact on the International Germ Cell Consensus Classification (IGCCCG) risk group assignment in patients with metastatic germ cell tumors (GCT).
METHODS
Demographic and clinical information for all patients treated for primary metastatic testicular non-seminomatous GCT in our tertiary care academic center was extracted from medical charts. IGCCCG risk group assignment was performed with pre-chemotherapy marker levels, as the classification requires, and additionally with pre-orchiectomy marker levels. Agreement between pre-chemotherapy and pre-orchiectomy risk group assignments was assessed using Cohen's kappa.
RESULTS
Our cohort consisted of 83 patients. The use of pre-orchiectomy TMs resulted in an IGCCCG risk group upstaging in 12 patients (16%; 8 patients from good to intermediate risk and 4 patients from intermediate to poor risk) and a downstaging in 1 patient (1.2%; from intermediate to good risk). The agreement between pre-orchiectomy and pre-chemotherapy IGCCCG risk groups resulted in a Cohen's kappa of 0.888 (p < 0.001).
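The agreement statistic used here, Cohen's kappa, compares the observed agreement between two raters (or, as here, two assignment rules) against the agreement expected by chance. A minimal sketch of the unweighted calculation follows; the label sequences are hypothetical, not the study's patient data.

```python
# Minimal sketch of unweighted Cohen's kappa for two categorical
# assignments; the labels below are hypothetical, not the study's data.
from collections import Counter

def cohens_kappa(a, b):
    """Unweighted Cohen's kappa for two equal-length label sequences."""
    assert len(a) == len(b)
    n = len(a)
    # Observed proportion of exact agreements.
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(a), Counter(b)
    categories = set(a) | set(b)
    expected = sum(freq_a[c] * freq_b[c] for c in categories) / n ** 2
    return (observed - expected) / (1 - expected)

pre_chemo = ["good"] * 6 + ["intermediate"] * 3 + ["poor"]
pre_orchi = ["good"] * 5 + ["intermediate"] * 4 + ["poor"]
kappa = cohens_kappa(pre_chemo, pre_orchi)
```

On this toy pair of assignments (one patient shifted from good to intermediate), 9 of 10 labels agree and kappa works out to roughly 0.82.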
CONCLUSIONS
Using pre-orchiectomy TMs can result in incorrect IGCCCG risk group assignment, which in turn can affect the clinical management and follow-up of patients with metastatic GCT. Thus, adherence to the IGCCCG standard of using pre-chemotherapy TM levels is recommended.
Symbolic Computation via Program Transformation
Symbolic computation is an important approach in automated program analysis. Most state-of-the-art tools perform symbolic computation as interpreters and directly maintain symbolic data. In this paper, we show that it is feasible, and in fact practical, to use a compiler-based strategy instead. Using compiler tooling, we propose and implement a transformation which takes a standard program and outputs a program that performs semantically equivalent, but partially symbolic, computation. The transformed program maintains symbolic values internally and operates directly on them; hence the program can be processed by a tool without support for symbolic manipulation. The main motivation for the transformation is symbolic verification, but there are many other possible use cases, including test generation and concolic testing. Moreover, using the transformation simplifies tools, since the symbolic computation is handled by the program directly. We have implemented the transformation at the level of LLVM bitcode. The paper includes an experimental evaluation, based on an explicit-state software model checker as a verification backend.
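The core idea, a program that carries symbolic values internally so that an ordinary execution engine needs no symbolic support, can be illustrated with a toy analogy. The sketch below uses Python operator overloading purely for illustration; the paper's actual transformation operates on LLVM bitcode, not on source-level dynamic dispatch.

```python
# Toy analogy (in Python, not LLVM bitcode) of a program that operates
# directly on symbolic values: the unchanged "standard program" runs
# either a concrete or a partially symbolic computation depending only
# on the values it is handed.
class Sym:
    """A symbolic value that records operations as an expression string."""
    def __init__(self, expr):
        self.expr = expr

    def __add__(self, other):
        rhs = other.expr if isinstance(other, Sym) else str(other)
        return Sym(f"({self.expr} + {rhs})")

    def __mul__(self, other):
        rhs = other.expr if isinstance(other, Sym) else str(other)
        return Sym(f"({self.expr} * {rhs})")

def original(x, y):
    # The standard program: semantically the same computation whether its
    # inputs are concrete integers or Sym objects.
    return x * 2 + y

concrete = original(3, 4)                 # plain concrete computation
symbolic = original(Sym("x"), Sym("y"))   # partially symbolic computation
```

In the paper's setting the same effect is achieved statically, by rewriting the bitcode so that no special interpreter is needed at run time.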
Competition, predation, and migration: individual choice patterns of Serengeti migrants captured by hierarchical models
Large-herbivore migrations occur across gradients of food quality or food abundance that are generally determined by underlying geographic patterns in rainfall, elevation, or latitude, in turn causing variation in the degree of interspecific competition and the exposure to predators. However, the role of top-down effects of predation, as opposed to the bottom-up effects of competition for resources, in shaping migrations is not well understood. We studied 30 GPS radio-collared wildebeest and zebra migrating seasonally in the Serengeti-Mara ecosystem to ask how predation and food availability differentially affect the individual movement patterns of these co-migrating species. A hierarchical analysis of movement trajectories (directions and distances) in relation to grass biomass, high-quality food patches, and predation risk shows that wildebeest tend to move in response to food quality, with little attention to predation risk. In contrast, individual zebra movements reflect a balance between the risk of predation and access to high-quality food of sufficient biomass. Our analysis shows how two migratory species move in response to different attributes of the same landscape. Counterintuitively, and in contrast to most other animal movement studies, we find that both species move farther each day when resources are locally abundant than when they are scarce. During the wet season, when the quality of grazing is at its peak, both wildebeest and zebra move the greatest distances and do not settle in localized areas to graze for extended periods. We propose that this punctuated movement in high-quality patches is explained by density dependency, whereby large groups of competing individuals (up to 1.65 million grazers) rapidly deplete the localized grazing opportunities. These findings capture the roles of predation and competition in shaping animal migrations, which are often claimed but rarely measured.
Toxicity of Pb‐Contaminated Soil to Japanese Quail (Coturnix japonica) and the Use of the Blood–dietary Pb Slope in Risk Assessment
This study relates tissue concentrations and toxic effects of Pb in Japanese quail (Coturnix japonica) to the dietary exposure of soil‐borne Pb associated with mining and smelting. From 0% to 12% contaminated soil, by weight, was added to 5 experimental diets (0.12–382 mg Pb/kg, dry wt) and fed to the quail for 6 weeks. Benchmark doses associated with a 50% reduction in delta‐aminolevulinic acid dehydratase activity were 0.62 mg Pb/kg in the blood, dry wt, and 27 mg Pb/kg in the diet. Benchmark doses associated with a 20% increase in the concentration of erythrocyte protoporphyrin were 2.7 mg Pb/kg in the blood and 152 mg Pb/kg in the diet. The quail showed no other signs of toxicity (histopathological lesions, alterations in plasma–testosterone concentration, and body and organ weights). The relation of the blood Pb concentration to the soil Pb concentration was linear, with a slope of 0.013 mg Pb/kg of blood (dry wt) per mg Pb/kg of diet. We suggest that this slope is potentially useful in ecological risk assessments on birds in the same way that the intake slope factor is an important parameter in risk assessments of children exposed to Pb. The slope may also be used in a tissue‐residue approach as an additional line of evidence in ecological risk assessment, supplementary to an estimate of hazard based on dietary toxicity reference values.
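The blood–dietary slope reported above is an ordinary least-squares slope of blood Pb on dietary Pb. A minimal sketch of the arithmetic follows; the data points are invented for illustration (constructed to lie on a line with the study's reported slope of 0.013), not measurements from the study.

```python
# Sketch of the blood-dietary Pb slope as an ordinary least-squares fit.
# The data points are invented toy values, not the study's measurements.
def ols_slope(xs, ys):
    """Least-squares slope of ys regressed on xs (with intercept)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

diet_pb = [0.12, 50, 100, 200, 382]      # mg Pb/kg diet, spanning the study range
blood_pb = [0.013 * x for x in diet_pb]  # toy data constructed to be linear
slope = ols_slope(diet_pb, blood_pb)     # recovers 0.013 on this toy data
```

With such a slope in hand, an assessor can translate a measured soil or dietary Pb concentration into an expected blood Pb concentration, which is the use the authors propose.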
Bioaccessibility Tests Accurately Estimate Bioavailability Of Lead To Quail
Hazards of soil-borne lead (Pb) to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, the authors measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from 5 Pb-contaminated Superfund sites had relative bioavailabilities from 33% to 63%, with a mean of approximately 50%. Treatment of 2 of the soils with phosphorus (P) significantly reduced the bioavailability of Pb. Bioaccessibility of Pb in the test soils was then measured in 6 in vitro tests and regressed on bioavailability: the relative bioavailability leaching procedure at pH 1.5, the same test conducted at pH 2.5, the Ohio State University in vitro gastrointestinal method, the urban soil bioaccessible lead test, the modified physiologically based extraction test, and the waterfowl physiologically based extraction test. All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the relative bioavailability leaching procedure at pH 2.5 and Ohio State University in vitro gastrointestinal tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Additional Pb was associated with P (chloropyromorphite, hydroxypyromorphite, and tertiary Pb phosphate) and with Pb carbonates, leadhillite (a lead sulfate carbonate hydroxide), and Pb sulfide. The formation of chloropyromorphite reduced the bioavailability of Pb, and the amendment of Pb-contaminated soils with P may be a thermodynamically favored means to sequester Pb.
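The selection criteria named above, slope and coefficient of determination, come from fitting a regression line between the in vitro and in vivo measures. A minimal sketch with hypothetical paired measurements (not the study's data) shows both quantities being computed for one such test:

```python
# Sketch of judging an in vitro bioaccessibility test by the slope and
# R^2 of its regression against in vivo bioavailability. Toy data only.
def linear_fit(xs, ys):
    """Return (slope, intercept, r_squared) of an OLS line of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    slope = sxy / sxx
    intercept = my - slope * mx
    r2 = sxy ** 2 / (sxx * syy)  # squared Pearson correlation
    return slope, intercept, r2

# Hypothetical paired measurements for five soils (percent scales).
bioaccessibility = [20, 35, 45, 60, 75]
bioavailability = [33, 40, 48, 55, 63]
slope, intercept, r2 = linear_fit(bioaccessibility, bioavailability)
```

A test that "performed very well" in the study's sense would show a strongly positive slope and an R² near 1, as this toy data does by construction.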
Foot Injuries in Michigan, USA, Gray Wolves (Canis lupus), 1992–2014
The range of gray wolves (Canis lupus) in the contiguous US is expanding. Research and monitoring to support population recovery and management often involves capture via foothold traps. A population-level epidemiologic assessment of the effect of trap injuries on wolf survival remains needed to inform management. We describe the baseline rate, type, and severity of foot injuries of wolves born 1992–2013 in Michigan's Upper Peninsula, evaluate the reliability of field scoring of trap-related injuries, and assess the effect of injuries on wolf survival. We assessed foot injuries by physical and radiographic exam at postmortem and/or time of capture for 351 wolves using the International Organization for Standardization 10990-5 standard, and we evaluated the effects of injuries, sex, age, previous capture, and body condition on survival using proportional hazards regression. We used ordinal regression to evaluate epidemiologic associations between sex, age, previous capture, body condition, cause of death, and injury severity. Most wolves (53%) experienced no physically or radiographically discernible foot injuries over their lifetimes. Among those wolves that did experience injuries, 33% scored as mild. Foot injuries had little epidemiologically discernible effect on survival rates. Wolves with higher foot trauma scores did experience an increased risk of dying, but the magnitude of the increase was modest. Most limb injuries occurred below the carpus or tarsus, and scoring upper-limb injuries added little predictive information to population-level epidemiologic measures of survival and injury severity. There was little association between injury severity and cause of death. Based on necropsy exams, previous trap injuries likely contributed to death in only four wolves (1.1%). Our results suggest that injuries resulting from foothold traps are unlikely to be a limiting factor in recovery and ongoing survival of the Michigan gray wolf population.
Impact of differing methodologies for serum miRNA-371a-3p assessment in stage I testicular germ cell cancer recurrence
INTRODUCTION
Current evidence shows that serum miR-371a-3p can identify disease recurrence in testicular germ cell tumour (TGCT) patients and correlates with tumour load. Despite convincing evidence showing the advantages of including miR-371a-3p testing to complement and overcome the limitations of the classical serum tumour markers, the successful introduction of a serum miRNA-based test into clinical practice has been impeded by a lack of consensus regarding optimal methodologies and the lack of a universal protocol and thresholds. Herein, we investigate two quantitative real-time PCR (qRT-PCR) based pipelines for detecting disease recurrence in stage I TGCT patients under active surveillance, and compare the sensitivity and specificity of each method.
METHODS
Sequential serum samples collected from 33 stage I TGCT patients undergoing active surveillance were analysed for miR-371a-3p via qRT-PCR with and without an amplification step included.
RESULTS
Using a pre-amplified protocol, all known recurrences were detected via elevated miR-371a-3p expression, while without pre-amplification, we failed to detect recurrence in 3/10 known recurrence patients. For the pre-amplified analysis, sensitivity and specificity were 90% and 94.4%, respectively. Without amplification, sensitivity dropped to 60%, but specificity was 100%.
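Sensitivity and specificity reduce to simple confusion-matrix arithmetic. The sketch below uses counts chosen to reproduce the reported pre-amplified percentages (9 of 10 recurrences detected, 17 of 18 non-recurrent samples correctly negative); these are illustrative assumptions, not the study's raw confusion matrix.

```python
# Minimal sketch of the sensitivity/specificity arithmetic. The counts are
# illustrative choices that reproduce the reported percentages, not the
# study's raw data.
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # true positives among all recurrences
    specificity = tn / (tn + fp)  # true negatives among all non-recurrences
    return sensitivity, specificity

# Hypothetical pre-amplified protocol counts: 9/10 recurrences detected,
# 17/18 non-recurrent samples correctly called negative.
sens, spec = sens_spec(tp=9, fn=1, tn=17, fp=1)
```

The trade-off the discussion describes is visible in these formulas: pre-amplification raises the true-positive count (sensitivity) at the cost of extra false positives (lower specificity).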
DISCUSSION
We conclude that incorporating pre-amplification increases the sensitivity of miR-371a-3p detection, but produces more false positive results. The ideal protocol for quantification of miR-371a-3p still needs to be determined. TGCT patients undergoing active surveillance may benefit from serum miR-371a-3p quantification through earlier detection of recurrences compared with current standard methods. However, larger cross-institutional studies in which samples are processed and data are analysed in a standardised manner are required prior to its routine clinical implementation.