
    BMGT 410.01: Sustainable Business Practices


    The Application Of Impact Dampers To Continuous Systems

    A study has been made of the application of impact dampers to two types of continuous systems, a simply supported beam and a clamped beam. Experimental models were tested in the laboratory, and computer programs were developed to calculate the response by two separate approaches. Results from the calculations agreed well with the experimental tests. Curves presented show the response to be expected for values of the significant system parameters and enable the user to apply impact dampers to these types of continuous systems. © 1975 by ASME.

    Microarray background correction: maximum likelihood estimation for the normal–exponential convolution

    Background correction is an important preprocessing step for microarray data that attempts to adjust the data for the ambient intensity surrounding each feature. The “normexp” method models the observed pixel intensities as the sum of two random variables, one normally distributed and the other exponentially distributed, representing background noise and signal, respectively. Using a saddle-point approximation, Ritchie and others (2007) found normexp to be the best background correction method for 2-color microarray data. This article develops the normexp method further by improving the estimation of the parameters. A complete mathematical development is given of the normexp model and the associated saddle-point approximation. Some subtle numerical programming issues are solved which caused the original normexp method to fail occasionally when applied to unusual data sets. A practical and reliable algorithm is developed for exact maximum likelihood estimation (MLE) using high-quality optimization software, with the saddle-point estimates as starting values. MLE is shown to outperform the heuristic estimators proposed by other authors, both in terms of estimation accuracy and in terms of performance on real data. The saddle-point approximation remains an adequate replacement in most practical situations. The performance of normexp for assessing differential expression is improved by adding a small offset to the corrected intensities.
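The normexp convolution has a closed-form density, so the exact MLE described above reduces to a small numerical optimization. A minimal sketch, not the published implementation: the model is x = b + s with b ~ N(mu, sigma^2) and s ~ Exp(mean alpha); the crude moment-based starting values below stand in for the paper's saddle-point estimates, and all names are illustrative.

```python
# Sketch of exact MLE for the normexp model: observed x = b + s,
# background b ~ N(mu, sigma^2), signal s ~ Exp(mean alpha).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def normexp_loglik(params, x):
    """Exact log-likelihood of the normal-exponential convolution."""
    mu, log_sigma, log_alpha = params       # log scale keeps sigma, alpha > 0
    sigma, alpha = np.exp(log_sigma), np.exp(log_alpha)
    # log f(x) = -log(alpha) + (mu - x)/alpha + sigma^2/(2 alpha^2)
    #            + log Phi((x - mu)/sigma - sigma/alpha)
    z = (x - mu) / sigma - sigma / alpha
    ll = -np.log(alpha) + (mu - x) / alpha + sigma**2 / (2 * alpha**2)
    return float(np.sum(ll + norm.logcdf(z)))

def normexp_fit(x):
    """Maximize the exact likelihood from crude moment-based starts."""
    mu0 = np.quantile(x, 0.05)              # background level guess
    alpha0 = max(np.mean(x) - mu0, 1e-2)    # mean signal guess
    sigma0 = max(np.std(x[x < np.median(x)]), 1e-2)
    res = minimize(lambda p: -normexp_loglik(p, x),
                   x0=[mu0, np.log(sigma0), np.log(alpha0)],
                   method="Nelder-Mead")
    mu, log_sigma, log_alpha = res.x
    return mu, np.exp(log_sigma), np.exp(log_alpha)

# Simulated intensities with known parameters (mu=100, sigma=10, alpha=50).
rng = np.random.default_rng(0)
x = rng.normal(100, 10, 5000) + rng.exponential(50, 5000)
mu, sigma, alpha = normexp_fit(x)
```

Working with log(sigma) and log(alpha) avoids the positivity-boundary failures the abstract alludes to; the authors' own estimator ships in the limma Bioconductor package, which this sketch does not reproduce.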

    Enhanced sequential carrier capture into individual quantum dots and quantum posts controlled by surface acoustic waves

    Individual self-assembled quantum dots and quantum posts are studied under the influence of a surface acoustic wave. In optical experiments we observe an acoustically induced switching of the occupancy of the nanostructures, along with an overall increase of the emission intensity. For quantum posts, switching occurs continuously from predominantly charged excitons (dissimilar numbers of electrons and holes) to neutral excitons (equal numbers of electrons and holes) and is independent of whether the surface acoustic wave amplitude is increased or decreased. For quantum dots, switching is non-monotonic and shows a pronounced hysteresis with respect to the amplitude sweep direction. Moreover, emission of positively charged and neutral excitons is observed at high surface acoustic wave amplitudes. These findings are explained by carrier trapping and localization in the thin and disordered two-dimensional wetting layer on top of which quantum dots nucleate. This limitation can be overcome for quantum posts, where acoustically induced charge transport is highly efficient in a wide lateral matrix quantum well.

    Mapping Monkeypox Transmission Risk through Time and Space in the Congo Basin

    Monkeypox is a major public health concern in the Congo Basin, with changing patterns of human case occurrences reported in recent years. Whether this trend results from better surveillance and detection methods, reduced proportions of vaccinated vs. non-vaccinated human populations, or changing environmental conditions remains unclear. Our objective is to examine potential correlations between the environment and monkeypox transmission in the Congo Basin. We created ecological niche models based on human cases reported in the Congo Basin by the World Health Organization at the end of the smallpox eradication campaign, in relation to remotely sensed Normalized Difference Vegetation Index datasets from the same time period. These models predicted independent spatial subsets of monkeypox occurrences with high confidence; the models were then projected onto parallel environmental datasets for the 2000s to create present-day monkeypox suitability maps. Recent trends in human monkeypox infection are associated with broad environmental changes across the Congo Basin. Our results demonstrate that ecological niche models provide useful tools for identifying areas suitable for transmission, even for poorly known diseases like monkeypox. This research was supported by National Institutes of Health grant 1R01TW008859-01 ("Sylvatic Reservoirs of Human Monkeypox"). Use of trade, product, or firm names does not imply endorsement by the United States Government. The findings and conclusions in this report are those of the authors and do not necessarily represent the views of the Centers for Disease Control and Prevention.

    A robust measure of correlation between two genes on a microarray

    Background: The underlying goal of microarray experiments is to identify gene expression patterns across different experimental conditions. Genes that are contained in a particular pathway, or that respond similarly to experimental conditions, may be co-expressed and show similar patterns of expression on a microarray. Using any of a variety of clustering methods or gene network analyses, we can partition genes of interest into groups, clusters, or modules based on measures of similarity. Typically, Pearson correlation is used to measure distance (or similarity) before implementing a clustering algorithm. Pearson correlation is, however, quite susceptible to outliers, an unfortunate characteristic when dealing with microarray data, which are well known to be quite noisy. Results: We propose a resistant similarity metric based on Tukey's biweight estimate of multivariate scale and location. The resistant metric is simply the correlation obtained from a resistant covariance matrix of scale. We give results which demonstrate that our correlation metric is much more resistant than the Pearson correlation, while being more efficient than other nonparametric measures of correlation (e.g., Spearman correlation). Additionally, our method gives a systematic gene-flagging procedure which is useful when dealing with large amounts of noisy data. Conclusion: When dealing with microarray data, which are known to be quite noisy, robust methods should be used. Specifically, robust distances, including the biweight correlation, should be used in clustering and gene network analysis.
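The biweight idea can be illustrated with the closely related biweight midcorrelation. This sketch uses univariate Tukey-biweight weights rather than the authors' full multivariate estimate of scale and location, and the tuning constant c = 9 is a conventional choice, not taken from the paper.

```python
# Robust correlation via Tukey's biweight: observations far from the
# median (|u| >= 1 after scaling by c * MAD) get zero weight.
import numpy as np

def biweight_midcorrelation(x, y, c=9.0):
    x = np.asarray(x, float)
    y = np.asarray(y, float)

    def weighted_dev(v):
        med = np.median(v)
        mad = max(np.median(np.abs(v - med)), 1e-12)  # guard zero MAD
        u = (v - med) / (c * mad)
        w = (1.0 - u**2) ** 2 * (np.abs(u) < 1)       # Tukey biweight
        return (v - med) * w

    a, b = weighted_dev(x), weighted_dev(y)
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

# One gross outlier wrecks Pearson but barely moves the biweight version.
rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = x + 0.1 * rng.normal(size=100)
y_out = y.copy()
y_out[0] = 50.0
r_robust = biweight_midcorrelation(x, y_out)
r_pearson = np.corrcoef(x, y_out)[0, 1]
```

Because the outlier's weight drops to exactly zero, the robust estimate on the contaminated pair stays close to the clean-data correlation, which is the resistance property the abstract claims for its metric.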

    Optimizing the noise versus bias trade-off for Illumina whole genome expression BeadChips

    Five strategies for pre-processing intensities from Illumina expression BeadChips are assessed from the point of view of precision and bias. The strategies include a popular variance stabilizing transformation and model-based background corrections that either use or ignore the control probes. Four calibration data sets are used to evaluate precision, bias and false discovery rate (FDR). The original algorithms are shown to have operating characteristics that are not easily comparable. Some tend to minimize noise while others minimize bias. Each original algorithm is shown to have an innate intensity offset, by which unlogged intensities are bounded away from zero, and the size of this offset determines its position on the noise–bias spectrum. By adding extra offsets, a continuum of related algorithms with different noise–bias trade-offs is generated, allowing direct comparison of the performance of the strategies on equivalent terms. Adding a positive offset is shown to decrease the FDR of each original algorithm. The potential of each strategy to generate an algorithm with an optimal noise–bias trade-off is explored by finding the offset that minimizes its FDR. The use of control probes as part of the background correction and normalization strategy is shown to achieve the lowest FDR for a given bias.

    Reliability and validity of a short form household food security scale in a Caribbean community

    BACKGROUND: We evaluated the reliability and validity of the short form household food security scale in a different setting from the one in which it was developed. METHODS: The scale was interviewer-administered to 531 subjects from 286 households in north central Trinidad in Trinidad and Tobago, West Indies. We evaluated the six items by fitting item response theory models to estimate item thresholds, estimating agreement among respondents in the same households and estimating the slope index of income-related inequality (SII) after adjusting for age, sex and ethnicity. RESULTS: Item-score correlations ranged from 0.52 to 0.79 and Cronbach's alpha was 0.87. Item responses gave within-household correlation coefficients ranging from 0.70 to 0.78. Estimated item thresholds (standard errors) from the Rasch model ranged from -2.027 (0.063) for the 'balanced meal' item to 2.251 (0.116) for the 'hungry' item. The 'balanced meal' item had the lowest threshold in each ethnic group even though there was evidence of differential functioning for this item by ethnicity. Relative thresholds of other items were generally consistent with US data. Estimation of the SII, comparing those at the bottom with those at the top of the income scale, gave relative odds for an affirmative response of 3.77 (95% confidence interval 1.40 to 10.2) for the lowest-severity item, and 20.8 (2.67 to 162.5) for the highest-severity item. Food insecurity was associated with reduced consumption of green vegetables after additionally adjusting for income and education (0.52, 0.28 to 0.96). CONCLUSIONS: The household food security scale gives reliable and valid responses in this setting. Differing relative item thresholds compared with US data do not require alteration to the cut-points for classification of 'food insecurity without hunger' or 'food insecurity with hunger'. The data provide further evidence that re-evaluation of the 'balanced meal' item is required.
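Of the statistics reported above, Cronbach's alpha has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch on simulated Rasch-type responses; the thresholds, sample size, and latent-trait distribution are illustrative, not the study's data.

```python
# Cronbach's alpha for a k-item scale, computed from an
# (n_subjects, k_items) matrix of item scores.
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total score
    return k / (k - 1) * (1 - item_var / total_var)

# Simulate six dichotomous items from a Rasch-type model:
# P(affirmative) = logistic(theta - threshold_j).
rng = np.random.default_rng(3)
theta = rng.normal(size=500)                              # latent severity
thresholds = np.array([-2.0, -1.0, -0.3, 0.3, 1.0, 2.3])  # illustrative
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - thresholds)))
responses = (rng.random((500, 6)) < p).astype(int)
alpha = cronbach_alpha(responses)
```

Items with low thresholds (like the 'balanced meal' item in the study) are affirmed by most respondents, while high-threshold items (like 'hungry') are affirmed only at severe levels of the latent trait.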