The ENHANCE System: Creating Meaningful Sub-Types in a Database Knowledge Representation for Natural Language Generation
The knowledge representation is an important factor in natural language generation since it limits the semantic capabilities of the generation system. It is, however, a tedious task to hand-code a knowledge representation which reflects both a user's view of a domain and the way that domain is modelled in the database. A system is presented which uses the contents of the database to form part of a database knowledge representation automatically. It augments a database schema depicting the database structure used for natural language generation. Computational solutions are presented for deriving the information types contained in the schema. Three types of world knowledge axioms are used to ensure that the representation formed is meaningful and contains salient information.
Interactions of Fire and Herbivory on the Management of Ilex glabra in Longleaf Pine Forests
The longleaf pine forest is characterized by the high levels of biodiversity and species richness that give it both ecological and economical importance. Decades of clear-cutting, habitat fragmentation, and natural fire restriction have reduced this once great forest system to a mere fraction of its original range. One of the specific causes of the decline of the longleaf pine is understory domination by woody shrubs. These shrubs limit the growth of characteristic grasses and forbs that are responsible for the majority of the biodiversity. The overall purpose of this study was to identify a new longleaf pine reforestation technique that reduces the pervasive presence of Ilex glabra, a woody shrub that can be commonly found dominating the forest floor of many longleaf pine systems. In order to decrease I. glabra ground cover, this study combined prescribed fire regimes with cattle grazing. It was hypothesized that prescribed fire, followed shortly by cattle grazing, would reduce the stem density of I. glabra in a manner that is more successful than utilizing fire or cattle alone. The hypothesis was tested by establishing three treatment sites and one control site throughout 160 acres of longleaf pine forest in Hattiesburg, Mississippi. The three treatment sites included prescribed fire only, prescribed fire combined with cattle grazing, and cattle grazing only. Each site was subdivided into a series of permanent, 2 m² sampling plots. I. glabra stem counts, maximum stem height measurements, and stem diameter measurements were completed prior to a prescribed burn in 2016. Two weeks post-burn, cattle were moved onto their designated sites and left to graze for two months. The number of new sprouts that grew from each burned stem was then quantified and recorded for each corresponding stem. After statistical analysis, it was found that prescribed burning significantly increases I. glabra stem density and that the addition of cattle grazing has no significant effect on stem density. It was concluded that prescribed fire combined with cattle grazing is not an effective reforestation technique in regard to understory management of I. glabra.
Using Design of Experiments in Finite Element Modeling to Identify Critical Variables for Laser Powder Bed Fusion
Input of accurate material and simulation parameters is critical for accurate predictions in Laser Powder Bed Fusion (L-PBF) Finite Element Analysis (FEA). It is challenging and resource-consuming to run experiments that measure and control all possible material properties and process parameters. In this research, we developed a 3-dimensional thermal L-PBF FEA model for a single-track laser scan on one layer of metal powder above a solid metal substrate. We applied a design of experiments (DOE) approach which varies simulation parameters to identify critical variables in L-PBF. DOE is an exploratory tool for examining a large number of factors and alternative modeling approaches. It also determines which approaches can best predict L-PBF process performance.
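The enumeration step of such a DOE is simple to sketch. A minimal full-factorial example is shown below; the factor names and levels are illustrative assumptions, not the study's actual parameter set:

```python
from itertools import product

# Hypothetical L-PBF simulation factors with two levels each.
# These names and values are invented for illustration.
factors = {
    "laser_power_W": [150, 200],
    "scan_speed_mm_s": [600, 1000],
    "powder_conductivity_W_mK": [0.2, 0.3],
}

def full_factorial(factors):
    """Enumerate every combination of factor levels: 2^k runs for k two-level factors."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

runs = full_factorial(factors)
print(len(runs))  # 8 simulation runs for 3 two-level factors
```

Each dictionary in `runs` would parameterize one FEA simulation; screening designs (fractional factorials) reduce the run count when the factor list grows.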
Experimental Design for the INL Sample Collection Operational Test
This document describes the test events and numbers of samples comprising the experimental design that was developed for the contamination, decontamination, and sampling of a building at the Idaho National Laboratory (INL). This study is referred to as the INL Sample Collection Operational Test. Specific objectives were developed to guide the construction of the experimental design. The main objective is to assess the relative abilities of judgmental and probabilistic sampling strategies to detect contamination in individual rooms or on a whole floor of the INL building. A second objective is to assess the use of probabilistic and Bayesian (judgmental + probabilistic) sampling strategies to make clearance statements of the form “X% confidence that at least Y% of a room (or floor of the building) is not contaminated”. The experimental design described in this report includes five test events. The test events (i) vary the floor of the building on which the contaminant will be released, (ii) provide for varying or adjusting the concentration of contaminant released to obtain the ideal concentration gradient across a floor of the building, and (iii) investigate overt as well as covert release of contaminants. The ideal contaminant gradient would have high concentrations of contaminant in rooms near the release point, with concentrations decreasing to zero in rooms at the opposite end of the building floor. For each of the five test events, the specified floor of the INL building will be contaminated with BG, a stand-in for Bacillus anthracis. The BG contaminant will be disseminated from a point-release device located in the room specified in the experimental design for each test event. Then judgmental and probabilistic samples will be collected according to the pre-specified sampling plan. Judgmental samples will be selected based on professional judgment and prior information.
Probabilistic samples will be selected in sufficient numbers to provide desired confidence for detecting contamination or clearing uncontaminated (or decontaminated) areas. Following sample collection for a given test event, the INL building will be decontaminated using ClO2 gas. For possibly contaminated areas (individual rooms or the whole floor of a building), the numbers of probabilistic samples were chosen to provide 95% confidence of detecting contaminated areas of specified sizes. The numbers of judgmental samples were chosen based on guidance from experts in judgmental sampling. For rooms that may be uncontaminated following a contamination event, or for whole floors after decontamination, the numbers of judgmental and probabilistic samples were chosen using a Bayesian approach that provides for combining judgmental and probabilistic samples to make a clearance statement of the form “95% confidence that at least 99% of the room (or floor) is not contaminated”. The experimental design also provides for making 95%/Y% clearance statements using only probabilistic samples, where Y < 99. For each test event, the numbers of samples were selected for a minimal plan (containing fewer samples) and a preferred plan (containing more samples). The preferred plan is recommended over the minimal plan. The preferred plan specifies a total of 1452 samples, 912 after contamination and 540 after decontamination. The minimal plan specifies a total of 1119 samples, 744 after contamination and 375 after decontamination. If the advantages of the “after decontamination” portion of the preferred plan are judged to be small compared to the “after decontamination” portion of the minimal plan, it is an option to combine the “after contamination” portion of the preferred plan (912 samples) with the “after decontamination” portion of the minimal plan (375 samples). This hybrid plan would involve a total of 1287 samples.
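The probabilistic sample counts behind such 95%/Y% statements follow the standard acceptance-sampling argument: if a fraction 1 − Y of the area were contaminated and samples landed at random, all n samples would miss it with probability Y^n. A minimal sketch of that calculation (the report's actual design may account for additional factors such as sample area and spatial correlation):

```python
import math

def clearance_sample_size(confidence=0.95, clean_fraction=0.99):
    """Smallest n such that n random samples detect contamination with
    probability >= confidence whenever more than (1 - clean_fraction) of
    the area is contaminated: solve clean_fraction**n <= 1 - confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(clean_fraction))

print(clearance_sample_size())  # 299 samples for a "95%/99%" clearance statement
```

Relaxing Y toward 95% drops the requirement to a few dozen samples, which is why the design allows 95%/Y% statements with Y < 99 when sample budgets are tight.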
Using light scattering to evaluate the separation of polydisperse nanoparticles
Supplementary data related to this article can be found at http://dx.doi.org/10.1016/j.aca.2015.06.027. The analysis of natural and otherwise complex samples is challenging and yields uncertainty about the accuracy and precision of measurements. Here we present a practical tool to assess relative accuracy among separation protocols for techniques using light scattering detection. Due to the highly non-linear relationship between particle size and the intensity of scattered light, a few large particles may obfuscate greater numbers of small particles. Therefore, insufficiently separated mixtures may result in an overestimate of the average measured particle size. Complete separation of complex samples is needed to mitigate this challenge. A separation protocol can be considered improved if the average measured size is smaller than a previous separation protocol. Further, the protocol resulting in the smallest average measured particle size yields the best separation among those explored. If the differential in average measured size between protocols is less than the measurement uncertainty, then the selected protocols are of equivalent precision. As a demonstration, this assessment metric is applied to optimization of cross flow (Vx) protocols in asymmetric flow field flow fractionation (AF4) separation interfaced with online quasi-elastic light scattering (QELS) detection using mixtures of polystyrene beads spanning a large size range. Using this assessment metric, the Vx parameter was modulated to improve separation until the average measured size of the mixture was in statistical agreement with the calculated average size of particles in the mixture.
While we demonstrate this metric by improving AF4 Vx protocols, it can be applied to any given separation parameters for separation techniques that employ dynamic light scattering detectors.
Highlights:
• We present a tool to assess relative accuracy among separation protocols.
• This metric can be applied to any technique using light scattering detection.
• An improved separation protocol minimizes the average measured particle size.
• The protocol with the smallest average measured particle size gives the best separation.
• The metric is demonstrated by improving AF4 cross flow protocols for polystyrene beads.
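The size bias the metric exploits can be seen in a toy calculation: in the Rayleigh regime, scattered intensity scales roughly as diameter to the sixth power, so an intensity-weighted mean size is dominated by the largest particles. A hedged sketch, with particle counts and sizes invented for illustration:

```python
# Invented mixture: 99 small particles (20 nm) and 1 large one (200 nm).
sizes_nm = [20] * 99 + [200]

# Number-weighted mean size.
number_mean = sum(sizes_nm) / len(sizes_nm)

# Intensity-weighted mean: Rayleigh scattering intensity scales roughly
# as diameter**6, so the single large particle dominates the signal.
weights = [d ** 6 for d in sizes_nm]
intensity_mean = sum(w * d for w, d in zip(weights, sizes_nm)) / sum(weights)

print(number_mean)            # 21.8
print(round(intensity_mean))  # 200: one large particle swamps 99 small ones
```

This is why incomplete separation inflates the average measured size, and why driving that average down across protocols signals better separation.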
On the variability of cold region flooding
Cold region hydrological systems exhibit complex interactions with both climate and the cryosphere. Improving knowledge on that complexity is essential to determine drivers of extreme events and to predict changes under altered climate conditions. This is particularly true for cold region flooding, where independent shifts in both precipitation and temperature can have significant influence on high flows. This study explores changes in the magnitude and the timing of streamflow in 18 Swedish Sub-Arctic catchments over their full available periods of record and a common period (1990-2013). The Mann-Kendall trend test was used to estimate changes in several hydrological signatures (e.g. annual maximum daily flow, mean summer flow, snowmelt onset). Further, trends in the flood frequency were determined by fitting an extreme value type I (Gumbel) distribution to test selected flood percentiles for stationarity using a generalized least squares regression approach. Results highlight shifts from snowmelt-dominated to rainfall-dominated flow regimes with all significant trends (at the 5% significance level) pointing toward (1) lower magnitudes in the spring flood; (2) earlier flood occurrence; (3) earlier snowmelt onset; and (4) decreasing mean summer flows. Decreasing trends in flood magnitude and mean summer flows suggest widespread permafrost thawing and are supported by increasing trends in annual minimum daily flows. Trends in selected flood percentiles showed an increase in extreme events over the full periods of record (significant for only four catchments), while trends were variable over the common period of data among the catchments. An uncertainty analysis emphasizes that the observed trends are highly sensitive to the period of record considered. As such, no clear overall regional hydrological response pattern could be determined, suggesting that catchment response to regionally consistent changes in climatic drivers is strongly influenced by their physical characteristics.
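The core of the Mann-Kendall trend test is easy to state: the S statistic sums the signs of all pairwise later-minus-earlier differences in the series. A minimal sketch (the full test also computes the variance of S and a normalized Z score, omitted here):

```python
def mann_kendall_s(x):
    """Mann-Kendall S: sum of sign(x[j] - x[i]) over all pairs i < j.
    S > 0 hints at an increasing trend, S < 0 at a decreasing one; ties
    contribute zero."""
    return sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(len(x) - 1)
        for j in range(i + 1, len(x))
    )

print(mann_kendall_s([5, 4, 4, 3, 1]))  # -9: near-monotone decline (one tied pair)
print(mann_kendall_s([1, 2, 3, 4, 5]))  # 10: monotone rise
```

Being rank-based, the test needs no distributional assumption, which is why it is a standard choice for hydrological signatures like those listed above.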
Development and validation of a visual grading scale for assessing image quality of AP pelvis radiographic images
OBJECTIVE: Apply psychometric theory to develop and validate a visual grading scale for assessing visual perception of AP pelvis digital image quality.
METHODS: Psychometric theory was used to guide scale development. Seven phantom and seven cadaver images of visually and objectively predetermined quality were used to help assess scale reliability and validity. 151 volunteers scored phantom images; 184 volunteers scored cadaver images. Factor analysis and Cronbach's alpha were used to assess scale validity and reliability.
RESULTS: A 24-item scale was produced. Aggregated mean volunteer scores for each image correlated with the rank order of the visually and objectively predetermined image qualities. Scale items had good inter-item correlation (≥0.2) and high factor loadings (≥0.3). Cronbach's alpha (reliability) revealed that the scale has acceptable levels of internal reliability for both phantom and cadaver images (α = 0.8 and 0.9, respectively). Factor analysis suggested the scale is multidimensional (assessing multiple quality themes).
CONCLUSION: This study represents the first full development and validation of a visual image quality scale using psychometric theory. It is likely that this scale will have clinical, training and research applications.
ADVANCES IN KNOWLEDGE: This article presents data to create and validate visual grading scales for radiographic examinations. The visual grading scale, for AP pelvis examinations, can act as a validated tool for future research, teaching and clinical evaluations of image quality.
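Cronbach's alpha, the reliability measure used above, compares the sum of per-item score variances with the variance of respondents' total scores. A minimal sketch with invented response data (population variances, no handling of missing responses):

```python
def cronbach_alpha(items):
    """items: one list of scores per scale item, aligned across respondents.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)),
    using population variances throughout."""
    k, n = len(items), len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Three items scored identically by four respondents: perfect internal consistency.
print(round(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]), 3))  # 1.0
```

Values around 0.8-0.9, as reported for the phantom and cadaver images, are conventionally read as acceptable-to-good internal reliability.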
Chemical combination effects predict connectivity in biological systems
Efforts to construct therapeutically useful models of biological systems require large and diverse sets of data on functional connections between their components. Here we show that cellular responses to combinations of chemicals reveal how their biological targets are connected. Simulations of pathways with pairs of inhibitors at varying doses predict distinct response surface shapes that are reproduced in a yeast experiment, with further support from a larger screen using human tumour cells. The response morphology yields detailed connectivity constraints between nearby targets, and synergy profiles across many combinations show relatedness between targets in the whole network. Constraints from chemical combinations complement genetic studies, because they probe different cellular components and can be applied to disease models that are not amenable to mutagenesis. Chemical probes also offer increased flexibility, as they can be continuously dosed, temporally controlled, and readily combined. After extending this initial study to cover a wider range of combination effects and pathway topologies, chemical combinations may be used to refine network models or to identify novel targets. This response surface methodology may even apply to non-biological systems where responses to targeted perturbations can be measured.
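One common way to quantify a pairwise combination effect (a simplified stand-in for the response-surface models the study uses, not the authors' actual analysis) is Bliss independence: two independently acting inhibitors should combine multiplicatively, and deviation from that expectation flags synergy or antagonism:

```python
def bliss_excess(fa, fb, fab):
    """Deviation of observed combined fractional inhibition fab from the
    Bliss independence expectation fa + fb - fa*fb (all values in [0, 1]).
    Positive excess suggests synergy; negative suggests antagonism."""
    return fab - (fa + fb - fa * fb)

print(round(bliss_excess(0.3, 0.4, 0.8), 2))  # 0.22: stronger than independence predicts
print(round(bliss_excess(0.3, 0.4, 0.5), 2))  # -0.08: weaker than independence predicts
```

Scoring such excesses across many dose pairs gives the kind of synergy profile that, per the abstract, reveals relatedness between targets in the network.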
Identification of functional, endogenous programmed −1 ribosomal frameshift signals in the genome of Saccharomyces cerevisiae
In viruses, programmed −1 ribosomal frameshifting (−1 PRF) signals direct the translation of alternative proteins from a single mRNA. Given that many basic regulatory mechanisms were first discovered in viral systems, the current study endeavored to: (i) identify −1 PRF signals in genomic databases, (ii) apply the protocol to the yeast genome and (iii) test selected candidates at the bench. Computational analyses revealed the presence of 10 340 consensus −1 PRF signals in the yeast genome. Of the 6353 yeast ORFs, 1275 contain at least one strong and statistically significant −1 PRF signal. Eight out of nine selected sequences promoted efficient levels of PRF in vivo. These findings provide a robust platform for high throughput computational and laboratory studies and demonstrate that functional −1 PRF signals are widespread in the genome of Saccharomyces cerevisiae. The data generated by this study have been deposited into a publicly available database called the PRFdb. The presence of stable mRNA pseudoknot structures in these −1 PRF signals, and the observation that the predicted outcomes of nearly all of these genomic frameshift signals would direct ribosomes to premature termination codons, suggest two possible mRNA destabilization pathways through which −1 PRF signals could post-transcriptionally regulate mRNA abundance.
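The consensus signal such scans look for is built around the canonical slippery heptamer X XXY YYZ: a run of three identical nucleotides, then AAA or TTT, then any base except G. A minimal sketch of that scanning step (real pipelines also require a downstream spacer and mRNA pseudoknot, which this sketch omits):

```python
import re

# Canonical -1 PRF slippery heptamer on the DNA sense strand: X XXY YYZ,
# where XXX is any single-nucleotide run, YYY is AAA or TTT, and Z != G.
# The lookahead lets overlapping candidates all be reported.
SLIPPERY = re.compile(r"(?=(([ACGT])\2\2(AAA|TTT)[ATC]))")

def find_slippery_sites(seq):
    """Return (0-based offset, heptamer) for each candidate slippery site."""
    return [(m.start(), m.group(1)) for m in SLIPPERY.finditer(seq.upper())]

print(find_slippery_sites("gggAAACtt"))  # [(0, 'GGGAAAC')]
```

Candidates found this way would then be filtered for a plausible pseudoknot and scored statistically, which is how a genome-wide list of consensus hits is narrowed to strong signals.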