
    Stability comparison of dimensionality reduction techniques attending to data and parameter variations

    The analysis of large volumes of data requires efficient and robust dimension reduction techniques to represent data in lower-dimensional spaces that ease human understanding. This paper presents a study of the stability, robustness, and performance of several of these dimension reduction algorithms with respect to algorithm and data parameters, which usually have a major influence on the resulting embeddings. The analysis covers the performance of a large panel of techniques on both artificial and real datasets, focusing on the geometrical variations experienced when different parameters are changed. The results are presented by identifying the visual weaknesses of each technique and providing suitable data-processing tasks to enhance their stability.
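    To make this kind of stability comparison concrete, here is a minimal sketch (not taken from the paper) that measures how much an embedding shifts when a single algorithm parameter changes. It assumes scikit-learn's Isomap and SciPy's Procrustes disparity as illustrative stand-ins for the techniques and the geometric-variation measure the study actually uses.

```python
# Minimal sketch (illustrative, not the paper's protocol): quantify how much an
# embedding changes when an algorithm parameter varies, using Procrustes
# disparity as a simple stability proxy. Isomap's n_neighbors stands in for a
# generic parameter.
import numpy as np
from scipy.spatial import procrustes
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=800, random_state=0)

def embed(n_neighbors):
    return Isomap(n_neighbors=n_neighbors, n_components=2).fit_transform(X)

reference = embed(10)
for k in (5, 15, 30):
    # procrustes removes translation, scale, and rotation before comparing,
    # so the disparity reflects genuine geometric change in the embedding
    _, _, disparity = procrustes(reference, embed(k))
    print(f"n_neighbors={k:2d}  disparity vs. k=10: {disparity:.4f}")
```

    A stable technique would show small disparities across the parameter sweep; large jumps flag the kind of visual weaknesses the paper catalogues.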

    An analytically linearized helicopter model with improved modeling accuracy

    A recently developed, analytically linearized model for helicopter flight response, including rotor blade dynamics and dynamic inflow, was studied with the objective of increasing the understanding, ease of use, and accuracy of the model. The mathematical model is described, along with the UH-60A Black Hawk helicopter and the flight tests used to validate the model. To aid in using the model for sensitivity analysis, a new, faster, and more efficient implementation was developed. Several errors in the mathematical modeling of the system are shown to have reduced accuracy; errors in rotor force resolution, trim force and moment calculation, and rotor inertia terms were corrected, along with improvements to the programming style and documentation. Use of a trim input file to drive the model is examined, and trim-file errors in blade twist, control input phase angle, coning and lag angles, main and tail rotor pitch, and uniform induced velocity were corrected. Finally, through direct comparison of the original and corrected model responses to flight test data, the effect of the corrections on the overall model output is shown.
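    As an illustration of what comparing a linearized model's response against flight-test data involves, the sketch below simulates a toy linear state-space system. The helicopter model itself, its state dimensions, and its matrices are not reproduced here; the A, B, C matrices shown are placeholders only.

```python
# Minimal sketch (illustrative only, not the paper's helicopter model):
# response of a small linear state-space system x' = A x + B u, y = C x,
# of the kind obtained by analytic linearization.
import numpy as np
from scipy.signal import StateSpace, lsim

# Placeholder 2-state dynamics; the real model couples body, rotor-flap,
# and inflow states and is far larger.
A = np.array([[-0.02,  0.10],
              [-0.50, -1.20]])
B = np.array([[0.05],
              [2.00]])
C = np.eye(2)
D = np.zeros((2, 1))
sys = StateSpace(A, B, C, D)

t = np.linspace(0.0, 10.0, 501)
u = np.where((t > 1.0) & (t < 2.0), 1.0, 0.0)   # short pulse control input
_, y, _ = lsim(sys, u, t)

# In practice one would overlay y against recorded flight-test time histories
# to judge whether a modeling correction improved the match.
print(y[:5])
```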

    Rapid Targeted Gene Disruption in Bacillus anthracis

    Anthrax is a zoonotic disease recognized to affect herbivores since Biblical times, and it has the widest range of susceptible host species of any known pathogen. The ease with which the bacterium can be weaponized, and its recent deliberate use as an agent of terror, have highlighted the importance of gaining a deeper understanding of, and effective countermeasures against, this important pathogen. High-quality sequence data have opened the possibility of systematically dissecting how genes distributed on both the bacterial chromosome and the associated plasmids have made it such a successful pathogen. However, low transformation efficiency and relatively few genetic tools for chromosomal manipulation have hampered full interrogation of its genome. Results: Group II introns have been developed into an efficient tool for site-specific gene inactivation in several organisms. We have adapted group II intron targeting technology for application in Bacillus anthracis and generated vectors that permit gene inactivation through group II intron insertion. The vectors permit screening for the desired insertion by PCR, or direct selection of intron insertions using a selection scheme that activates a kanamycin resistance marker upon successful intron insertion. Conclusions: The design and vector construction described here provide a useful tool for high-throughput experimental interrogation of the Bacillus anthracis genome and will benefit efforts to develop improved vaccines and therapeutics. Funding: Chem-Bio Diagnostics program from the Department of Defense Chemical and Biological Defense program through the Defense Threat Reduction Agency (DTRA) B102387M; NIH GM037949; Welch Foundation F-1607; Cellular and Molecular Biology.

    Data-worth analysis through probabilistic collocation-based Ensemble Kalman Filter

    We propose a new and computationally efficient data-worth analysis and quantification framework keyed to the characterization of target state variables in groundwater systems. We focus on dynamically evolving plumes of dissolved chemicals migrating in randomly heterogeneous aquifers. Accurate prediction of the detailed features of solute plumes requires collecting a substantial amount of data. However, constraints dictated by the availability of financial resources and by ease of access to the aquifer system make it important to assess the expected value of data before they are actually collected. Data-worth analysis is targeted at quantifying the impact of new potential measurements on the expected reduction of predictive uncertainty based on a given process model. Integrating the Ensemble Kalman Filter method within a data-worth analysis framework enables us to assess data worth sequentially, a key desirable feature for designing a monitoring scheme in a contaminant transport scenario. It is, however, remarkably challenging because of the (typically) high computational cost involved, considering that repeated solutions of the inverse problem are required. As a computationally efficient scheme, we embed in the data-worth analysis framework a modified version of the Probabilistic Collocation Method-based Ensemble Kalman Filter proposed by Zeng et al. (2011), so that we can take advantage of the ability to assimilate data sequentially in time through a surrogate model constructed via polynomial chaos expansion. We illustrate our approach on a set of synthetic scenarios involving a solute migrating in a two-dimensional random permeability field. Our results demonstrate the computational efficiency of the approach and its ability to quantify the impact of the design of the monitoring network on the reduction of uncertainty associated with the characterization of a migrating contaminant plume.
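    The sequential-assimilation idea at the core of this framework can be illustrated with a standard stochastic Ensemble Kalman Filter analysis step, sketched below. This is not the probabilistic-collocation-based variant of Zeng et al. (2011); the state and observation sizes, the observation operator, and the data are synthetic placeholders chosen purely for illustration.

```python
# Minimal sketch of a stochastic Ensemble Kalman Filter analysis step
# (the standard perturbed-observation update, not the paper's
# probabilistic-collocation variant). States could stand in for
# log-permeabilities or concentrations at grid cells.
import numpy as np

rng = np.random.default_rng(0)
n_state, n_obs, n_ens = 50, 5, 100

X = rng.normal(size=(n_state, n_ens))            # prior ensemble of states
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), np.arange(n_obs) * 10] = 1.0 # observation operator (picks 5 cells)
R = 0.1 * np.eye(n_obs)                          # observation-error covariance
d = rng.normal(size=n_obs)                       # observed data (synthetic here)

# Ensemble anomalies and the ensemble-based Kalman gain
Xm = X.mean(axis=1, keepdims=True)
A_ = (X - Xm) / np.sqrt(n_ens - 1)
HA = H @ A_
K = A_ @ HA.T @ np.linalg.inv(HA @ HA.T + R)

# Perturbed observations give each member its own innovation
D = d[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, size=n_ens).T
X_post = X + K @ (D - H @ X)

# Data worth can be gauged by the drop in ensemble spread after assimilation
print("prior mean variance    :", X.var(axis=1).mean())
print("posterior mean variance:", X_post.var(axis=1).mean())
```

    Repeating such updates for candidate measurement locations, and comparing the resulting uncertainty reductions, is the expensive step that the surrogate model in the paper is designed to accelerate.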

    IMPLEMENTATION OF THE AT-TASHIL APPLICATION FOR AL-MAWARIS MATERIAL AT THE BALAI DIKLAT KEAGAMAAN ACEH IN 2020

    A problem in learning today is that teachers' skills in using varied media are not optimal, so the learning process tends to be less effective and efficient. In Islamic Education lessons on mawaris (inheritance) material, teachers predominantly use conventional media, and student learning outcomes are consequently low. This study was intended to determine the effectiveness of At-Tashil-based learning media in Islamic Studies lessons on mawaris material, according to alumni of the Substantive Technical Training for High School Teachers at the Aceh Religious Education and Training Center. The research uses a qualitative approach with a descriptive design. Data were collected through instruments, interviews, and observations, and analyzed through data reduction, data presentation, and conclusion drawing. The findings indicate that the use of At-Tashil application-based learning media in Fiqh mawaris learning for high school teachers is effective in terms of usability, ease, and accuracy of results.

    Randomized Matrix Decompositions Using R

    Matrix decompositions are fundamental tools in applied mathematics, statistical computing, and machine learning. In particular, low-rank matrix decompositions are vital and widely used for data analysis, dimensionality reduction, and data compression. Massive datasets, however, pose a computational challenge for traditional algorithms, placing significant constraints on both memory and processing power. Recently, the powerful concept of randomness has been introduced as a strategy to ease the computational load. The essential idea of probabilistic algorithms is to employ some amount of randomness to derive a smaller matrix from a high-dimensional data matrix; the smaller matrix is then used to compute the desired low-rank approximation. Such algorithms are shown to be computationally efficient for approximating matrices with low-rank structure. We present the R package rsvd and provide a tutorial introduction to randomized matrix decompositions. Specifically, randomized routines for the singular value decomposition, (robust) principal component analysis, interpolative decomposition, and CUR decomposition are discussed. Several examples demonstrate the routines and show the computational advantage over other methods implemented in R.
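    The core randomized idea described above can be sketched in a few lines. The example below uses NumPy rather than the rsvd R package itself, and the oversampling and power-iteration parameters are illustrative defaults, not the package's.

```python
# Minimal sketch of the basic randomized SVD idea (in NumPy; the rsvd package
# provides this and more in R). A random projection captures the range of A,
# and the SVD of a much smaller matrix is then lifted back to the full space.
import numpy as np

def randomized_svd(A, rank, oversample=10, n_iter=2, seed=None):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    k = min(rank + oversample, min(m, n))

    # Sample the range of A with a Gaussian test matrix, then orthonormalize
    Y = A @ rng.normal(size=(n, k))
    Q, _ = np.linalg.qr(Y)
    for _ in range(n_iter):               # power iterations sharpen the basis
        Q, _ = np.linalg.qr(A.T @ Q)
        Q, _ = np.linalg.qr(A @ Q)

    # SVD of the small projected matrix, then map back to the original space
    B = Q.T @ A
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank]

# Demo on a matrix with (approximately) rank-20 structure
rng = np.random.default_rng(1)
A = rng.normal(size=(2000, 20)) @ rng.normal(size=(20, 500))
U, s, Vt = randomized_svd(A, rank=20)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
print(f"relative low-rank approximation error: {err:.2e}")
```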