Application of multidisciplinary analysis to gene expression.
Molecular analysis of cancer, at the genomic level, could lead to individualized patient diagnostics and treatments. The developments to follow will signal a significant paradigm shift in the clinical management of human cancer. Despite our initial hopes, however, it seems that simple analysis of microarray data cannot elucidate clinically significant gene functions and mechanisms. Extracting biological information from microarray data requires a complicated path involving multidisciplinary teams of biomedical researchers, computer scientists, mathematicians, statisticians, and computational linguists. The integration of the diverse outputs of each team is the limiting factor in progress toward discovering candidate genes and pathways associated with the molecular biology of cancer. Specifically, one must deal with sets of significant genes identified by each method and extract whatever useful information may be found by comparing these different gene lists. Here we present our experience with such comparisons, and share methods developed in the analysis of an infant leukemia cohort studied on Affymetrix HG-U95A arrays. In particular, spatial gene clustering, hyper-dimensional projections, and computational linguistics were used to compare different gene lists. In spatial gene clustering, different gene lists are grouped together and visualized on a three-dimensional expression map, where genes with similar expression profiles are co-located. In another approach, projections from gene expression space onto a sphere clarify how groups of genes can jointly have more predictive power than groups of individually selected genes. Finally, online literature is automatically rearranged to present information about genes common to multiple groups, or to contrast the differences between the lists. The combination of these methods has improved our understanding of infant leukemia.
While the complicated reality of the biology dashed our initial, optimistic hopes for simple answers from microarrays, we have made progress by combining very different analytic approaches.
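As a purely hypothetical illustration of the list-comparison step described above, the sketch below computes the pairwise overlap (Jaccard index) between significant-gene lists produced by different methods. The gene names and list contents are invented for the example, not taken from the leukemia cohort.

```python
# Sketch, not the study's pipeline: quantify agreement between
# significant-gene lists from different analysis methods.

def compare_gene_lists(lists):
    """Pairwise common genes and Jaccard index between named gene lists."""
    names = sorted(lists)
    results = {}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            sa, sb = set(lists[a]), set(lists[b])
            union = sa | sb
            jaccard = len(sa & sb) / len(union) if union else 0.0
            results[(a, b)] = (sorted(sa & sb), jaccard)
    return results

# Invented gene lists standing in for the outputs of the three methods.
lists = {
    "clustering": ["HOXA9", "MEIS1", "FLT3", "CD34"],
    "projection": ["HOXA9", "FLT3", "MLL", "CD34"],
    "linguistics": ["HOXA9", "MLL", "MEIS1"],
}
for (a, b), (common, j) in compare_gene_lists(lists).items():
    print(f"{a} vs {b}: common={common}, Jaccard={j:.2f}")
```

A high Jaccard index between two methods' lists signals convergent evidence for the shared genes, which is the kind of cross-method information the abstract describes extracting.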
Continuous-wave radar to detect defects within heat exchangers and steam generator tubes.
A major cause of failures in heat exchangers and steam generators in nuclear power plants is degradation of the tubes within them. Tube failure is often caused by cracks that begin on the outer surface of the tube and propagate both inward and laterally. A new technique using a continuous-wave radar method was investigated for detecting defects within metal tubing. The experimental program resulted in a completed product development schedule and the design of an experimental apparatus for studying probe handling and data acquisition. These tests were completed as far as the prototype probe's performance allowed. The prototype probe design did not have sufficient sensitivity to detect a defect signal using the defined radar technique and did not allow successful completion of all of the project milestones. Even at its best, the prototype probe could not detect a tube defect using the radar principle. Though a more precise probe may be possible, the cost of its design and construction was beyond the scope of the project. This report describes the probe development and the status of the design at the termination of the project.
Predicting Faculty Intentions to Assign Writing in Their Classes
Teachers who offer undergraduate courses agree widely on the importance of writing assignments in furthering undergraduate education. And yet, there is a great deal of variance among teachers in their writing assignments; some teachers assign no writing whatsoever. To determine the variables that influence teachers' decisions about whether to assign writing, we predicted their intentions to assign writing from attitudes, subjective norms, perceived control, and perceived difficulty pertaining to assigning writing. Zero-order correlations and hierarchical regression analyses implicate attitude and perceived difficulty as the most important predictors of teachers' intentions to assign writing in two studies. We also obtained open-ended belief statements in Study 1 and used them to obtain quantitative belief data in Study 2, identifying and validating the impact of specific beliefs on intentions to assign writing.
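The hierarchical-regression logic described above can be sketched as follows: predictors are entered in steps, and the increment in R-squared shows what each step adds. The data here are synthetic, and the variable names (attitude, difficulty, intention) are illustrative assumptions, not the authors' survey measures.

```python
# Illustrative sketch of hierarchical regression on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n = 200
attitude = rng.normal(size=n)
difficulty = rng.normal(size=n)
# Assumed data-generating process, chosen only for the demonstration.
intention = 0.6 * attitude - 0.4 * difficulty + rng.normal(scale=0.5, size=n)

def r_squared(predictors, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

# Step 1: attitude alone; Step 2: add perceived difficulty.
r2_step1 = r_squared([attitude], intention)
r2_step2 = r_squared([attitude, difficulty], intention)
print(f"R2 step 1: {r2_step1:.3f}, step 2: {r2_step2:.3f} "
      f"(increment {r2_step2 - r2_step1:.3f})")
```

The R-squared increment from step 1 to step 2 is the quantity a hierarchical analysis reads as the unique contribution of the newly entered predictor.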
Statistical validation of engineering and scientific models: bounds, calibration, and extrapolation.
Numerical models of complex phenomena often contain approximations due to our inability to fully model the underlying physics, the excessive computational resources required to fully resolve the physics, the need to calibrate constitutive models, or, in some cases, an ability only to bound behavior. Here we illustrate the relationship between approximation, calibration, extrapolation, and model validation through a series of examples that use the linear transient convective/dispersion equation to represent the nonlinear behavior of Burgers equation. While the use of these models represents a simplification relative to the types of systems we normally address in engineering and science, the present examples support the tutorial nature of this document without obscuring the basic issues with unnecessarily complex models.
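A minimal numerical sketch of the surrogate relationship described above: a linear convection/dispersion (advection-diffusion) equation stepped alongside the nonlinear Burgers equation from the same initial condition, so their divergence can be measured. The grid, coefficients, and initial condition are illustrative choices, not the report's actual setup.

```python
# Sketch: linear advection-diffusion vs. viscous Burgers on a
# periodic grid, explicit Euler in time, central differences in space.
import numpy as np

nx, nt = 200, 400
dx, dt, nu = 1.0 / nx, 5.0e-4, 0.01
c = 1.0                                  # fixed convection speed (linear model)
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u_lin = np.sin(2 * np.pi * x)            # linear convection/dispersion state
u_bur = u_lin.copy()                     # Burgers state

def ddx(u):
    """Central first difference on a periodic grid."""
    return (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)

def d2dx2(u):
    """Central second difference on a periodic grid."""
    return (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2

for _ in range(nt):
    # Explicit Euler step; the diffusion term keeps the central
    # advection scheme stable at this dt.
    u_lin = u_lin + dt * (-c * ddx(u_lin) + nu * d2dx2(u_lin))
    u_bur = u_bur + dt * (-u_bur * ddx(u_bur) + nu * d2dx2(u_bur))

gap = float(np.abs(u_lin - u_bur).max())
print(f"max |linear - Burgers| at t = {nt * dt:.2f}: {gap:.4f}")
```

The growing gap between the two solutions is exactly the kind of model-form error that calibration can hide inside a parameter range and that extrapolation outside the validated regime can amplify.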
Case study for model validation : assessing a model for thermal decomposition of polyurethane foam.
A case study is reported to document the details of a validation process to assess the accuracy of a mathematical model representing experiments on thermal decomposition of polyurethane foam. The focus of the report is to work through a validation process, which addresses the following activities. The intended application of the mathematical model is discussed to better understand the pertinent parameter space. The parameter space of the validation experiments is mapped to the application parameter space. The mathematical models, the computer code used to solve them, and the code's verification are presented. Experimental data from two activities are used to validate the mathematical models: the first experiment assesses the chemistry model alone, and the second assesses the model of coupled chemistry, conduction, and enclosure radiation. The model results of both experimental activities are summarized, and the uncertainty of the model in representing each experimental activity is estimated. The comparison between the experimental data and model results is quantified with various metrics. After addressing these activities, an assessment of the process for the case study is given; weaknesses in the process are discussed and lessons learned are summarized.
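The quantitative-comparison step mentioned above can be sketched with a few common validation metrics between experimental data and model predictions. The temperature arrays below are synthetic placeholders, not measurements from the foam experiments.

```python
# Sketch: typical metrics for quantifying experiment-vs-model agreement.
import numpy as np

temp_exp = np.array([300.0, 340.0, 390.0, 450.0, 520.0])  # "experiment" (K)
temp_mod = np.array([305.0, 332.0, 398.0, 446.0, 530.0])  # "model" (K)

err = temp_mod - temp_exp
metrics = {
    "RMSE": float(np.sqrt(np.mean(err**2))),
    "max abs error": float(np.max(np.abs(err))),
    "relative L2": float(np.linalg.norm(err) / np.linalg.norm(temp_exp)),
}
for name, value in metrics.items():
    print(f"{name}: {value:.4g}")
```

No single metric suffices: RMSE summarizes typical error, while the maximum absolute error exposes localized disagreement that an average would smooth over.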
High throughput instruments, methods, and informatics for systems biology.
High throughput instruments and analysis techniques are required in order to make good use of the genomic sequences that have recently become available for many species, including humans. These instruments and methods must work with tens of thousands of genes simultaneously, and must be able to identify the small subsets of those genes that are implicated in the observed phenotypes or, for instance, in responses to therapies. Microarrays are one such high throughput method and continue to find increasingly broad application. This project has improved microarray technology in several important areas. First, we developed the hyperspectral scanner, which has discovered and diagnosed numerous flaws in techniques broadly employed by microarray researchers. Second, we used a series of statistically designed experiments to identify and correct errors in our microarray data, dramatically improving the accuracy, precision, and repeatability of the microarray gene expression data. Third, our research developed new informatics techniques to identify genes with significantly different expression levels. Finally, natural language processing techniques were applied to improve our ability to make use of online literature annotating the important genes. In combination, this research has improved the reliability and precision of laboratory methods and instruments, while also enabling substantially faster analysis and discovery.
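The differential-expression task mentioned above (finding the small subset of significant genes among tens of thousands) can be sketched with a per-gene Welch t statistic on synthetic data. This is an assumed illustration of the general technique, not the project's actual informatics pipeline, and the data, group sizes, and cutoff are invented.

```python
# Sketch: per-gene two-sample Welch t statistics with a conservative
# cutoff to account for many simultaneous tests.
import numpy as np

rng = np.random.default_rng(1)
n_genes, n_a, n_b = 1000, 8, 8
group_a = rng.normal(size=(n_genes, n_a))      # e.g. control samples
group_b = rng.normal(size=(n_genes, n_b))      # e.g. disease samples
group_b[:20] += 4.0                            # plant 20 truly differential genes

# Welch t statistic per gene (unequal variances allowed).
ma, mb = group_a.mean(axis=1), group_b.mean(axis=1)
va, vb = group_a.var(axis=1, ddof=1), group_b.var(axis=1, ddof=1)
t = (ma - mb) / np.sqrt(va / n_a + vb / n_b)

# Conservative threshold standing in for exact p-values plus a
# multiple-testing correction: with 1000 simultaneous tests, a plain
# |t| > 2 rule would flag dozens of false positives.
hits = np.flatnonzero(np.abs(t) > 6.0)
print(f"{len(hits)} genes flagged; first few indices: {hits[:5]}")
```

The key point the sketch makes concrete is the multiple-testing problem: with thousands of genes tested at once, the significance threshold must be far stricter than for a single test.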