33 research outputs found

    Covariance Based Clustering for Classification

    Classification involves assigning observations to known classes based on shared features. Commonly used methods for solving classification problems include linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). Both methods assume each class follows a multivariate normal distribution, with a common covariance matrix in LDA and class-specific covariance matrices in QDA. The equal-covariance assumption in LDA is often unrealistic and simplistic in real-life applications, resulting in less flexibility and high bias, while estimating class-specific covariance matrices becomes a problem when there is a large number of classes. This work proposes a method that is a compromise between LDA and QDA: a covariance-based clustering method using mixtures of Wishart distributions identifies classes that share a common covariance matrix. The method is applied to a glass-fragment classification problem as a means of forensic source identification.
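The idea of pooling covariances only within clusters of similar classes can be sketched as follows. This is an illustrative toy example, not the authors' implementation: the mixture-of-Wishart step is replaced by a crude stand-in (k-means on vectorized covariance estimates), and all data and names are invented.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy data: 6 classes; classes 0-2 share one covariance, classes 3-5 another.
cov_a = np.array([[1.0, 0.8], [0.8, 1.0]])
cov_b = np.array([[1.0, -0.5], [-0.5, 2.0]])
X, y = [], []
for k in range(6):
    cov = cov_a if k < 3 else cov_b
    X.append(rng.multivariate_normal(mean=[k, 0.0], cov=cov, size=200))
    y.append(np.full(200, k))
X, y = np.vstack(X), np.concatenate(y)

# Step 1: estimate a covariance matrix per class.
class_covs = np.stack([np.cov(X[y == k].T) for k in range(6)])

# Step 2: cluster the covariance estimates.  The paper fits a mixture of
# Wishart distributions; here k-means on the vectorized matrices stands in.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    class_covs.reshape(6, -1))

# Step 3: classes in the same cluster get one pooled covariance estimate,
# giving a model between LDA (one pool) and QDA (one matrix per class).
pooled = {c: class_covs[labels == c].mean(axis=0) for c in np.unique(labels)}
print(labels)
```

With covariances this distinct, the six class estimates fall into the two intended groups, so the classifier ends up estimating two covariance matrices instead of one (LDA) or six (QDA).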

    Finite Mixture Modeling for Hierarchically Structured Data with Application to Keystroke Dynamics

    Keystroke dynamics has been used both to authenticate users of computer systems and to detect unauthorized users who attempt to access them. Monitoring keystroke dynamics adds another layer of computer security, since passwords are often compromised, and keystrokes can be monitored continuously long after a password has been entered. Many of the methods proposed to date are supervised, in that they assume the true user of each keystroke is known a priori. This is not always the case: for example, businesses and government agencies have internal systems that multiple people can access, so unsupervised methods must be employed in these situations. One might propose finite mixture models for keystroke dynamics, but we show that there is often not a one-to-one relationship between the number of mixture components and the number of users. Moreover, users typically type numerous times during a session or block of time, so the keystroke dynamics from a session can be assumed to arise from the same user. We propose a novel method that accounts both for the lack of a one-to-one relationship between the number of users and the number of components and for the known information about when keystrokes were typed. In simulation studies and the motivating real-data example, the proposed model shows good performance.
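The many-components-per-user point can be demonstrated with a plain Gaussian mixture on toy timing data. This sketch is not the proposed model; the timing values and the bimodal behavior of "user 0" are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Toy hold-time features (seconds) for 2 users.  User 0 types in two
# distinct modes, so a well-fitting mixture needs 3 components for 2 users.
user0 = np.concatenate([rng.normal(0.10, 0.01, 300),
                        rng.normal(0.30, 0.01, 300)])
user1 = rng.normal(0.60, 0.01, 600)
X = np.concatenate([user0, user1]).reshape(-1, 1)

gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
comp = gmm.predict(X)

# All three components are occupied even though there are only two users:
# components map many-to-one onto users, which is exactly the structure the
# proposed method has to account for.
print(np.unique(comp).size, sorted(gmm.means_.ravel().round(2)))
```

A session constraint would then pool the per-keystroke component labels within each session (all keystrokes in a session share one user), rather than classifying keystrokes independently.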

    Principal Component Analysis with Application to Credit Card Data

    Principal Component Analysis (PCA) is a dimension-reduction technique used in data analysis to process data before building a model. In general, dimension reduction allows analysts to draw conclusions about large data sets by reducing the number of variables while retaining as much information as possible. Using the numerical variables from a data set, PCA computes a smaller set of uncorrelated variables, called principal components, that account for a majority of the variability in the data. The purpose of this poster is to explain PCA and to apply it to a large sample of credit card data. PCA will be explained, and an overview of the data will be given. Then we will examine the principal components computed after performing PCA. Finally, our conclusions and future work will be stated.
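The standard PCA workflow described above can be sketched in a few lines. The data here are simulated stand-ins for the credit card variables (the real data set is not reproduced), and the 90% variance threshold is an illustrative choice.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

# Stand-in for the credit card data: 500 accounts, 6 numeric variables,
# two nearly-redundant pairs plus two independent noise variables.
base = rng.normal(size=(500, 2))
X = np.column_stack([
    base[:, 0],
    2 * base[:, 0] + rng.normal(scale=0.1, size=500),
    base[:, 1],
    -base[:, 1] + rng.normal(scale=0.1, size=500),
    rng.normal(size=500),
    rng.normal(size=500),
])

# Standardize first: PCA is sensitive to the scale of each variable.
Xs = StandardScaler().fit_transform(X)
pca = PCA().fit(Xs)

# Cumulative explained variance tells us how many components to keep.
cum = np.cumsum(pca.explained_variance_ratio_)
n_keep = int(np.searchsorted(cum, 0.90) + 1)
print(n_keep, cum.round(2))
```

Because each correlated pair collapses onto a single component, 4 of the 6 components already carry essentially all the variability, which is the kind of reduction the poster aims to demonstrate on the credit card variables.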

    A Characterization of Bias Introduced into Forensic Source Identification when there is a Subpopulation Structure in the Relevant Source Population

    In forensic source identification, the forensic expert is responsible for providing a summary of the evidence that allows a decision maker to make a logical and coherent decision concerning the source of some trace evidence of interest. The academic consensus is that this summary should take the form of a likelihood ratio (LR) summarizing the likelihood of the trace evidence arising under two competing propositions: the prosecution's proposition, that the specified source is the actual source of the trace evidence, and the defense's proposition, that another source in a relevant background population is the actual source. When the relevant background population has a subpopulation structure, the rates of misleading evidence of the LR tend to vary across the subpopulations, sometimes to an alarming degree. Our preliminary work with synthetic and real data indicates that the rates of misleading evidence differ among subpopulations of different sizes, which can lead to a systematic bias when an LR is used to present evidence. In this presentation we summarize our preliminary results for characterizing this bias.
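The effect described above can be reproduced in a minimal simulation. This is an invented one-dimensional toy model, not the authors' study: all distributions, subpopulation weights, and the "misleading evidence" criterion (LR > 1 when the defense proposition is true) are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Specified source: measurements ~ N(0, 1).
mu_s, sd_s = 0.0, 1.0
# Background population with two subpopulations: A (90%, near the
# specified source) and B (10%, far away), each with unit spread.
mu_a, mu_b, sd_bg = 0.5, 4.0, 1.0

def lr(e):
    # Prosecution: trace e came from the specified source.
    num = norm.pdf(e, mu_s, sd_s)
    # Defense: trace e came from some other source in the background
    # population (mixture over the two subpopulations).
    den = 0.9 * norm.pdf(e, mu_a, sd_bg) + 0.1 * norm.pdf(e, mu_b, sd_bg)
    return num / den

# Simulate traces whose true source lies in each subpopulation: the
# defense proposition is true, so LR > 1 counts as misleading evidence.
rate_a = np.mean(lr(rng.normal(mu_a, sd_bg, 100_000)) > 1)
rate_b = np.mean(lr(rng.normal(mu_b, sd_bg, 100_000)) > 1)
print(round(rate_a, 3), round(rate_b, 3))
```

Even in this crude setup the rate of misleading evidence is large for the subpopulation that resembles the specified source and near zero for the distant one, illustrating how subpopulation structure makes the LR's error rates uneven.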

    Application of Gaussian Mixture Models to Simulated Additive Manufacturing

    Additive manufacturing (AM) is the process of building components by iteratively adding material in specific designs. AM has a wide range of process parameters that influence the quality of the component. This work applies Gaussian mixture models to detect clusters of similar stress values within and across components manufactured with varying process parameters. Further, a mixture of regression models is considered to simultaneously find groups and fit a regression within each group. The results are compared with a previous naive approach.
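The clustering step can be sketched with simulated stress readings. The two process settings and all numeric values below are invented for illustration; only the technique (a Gaussian mixture with the number of components chosen by BIC) follows the abstract.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)

# Simulated residual-stress readings (arbitrary units) from components
# built under two hypothetical process-parameter settings.
stress = np.concatenate([rng.normal(200, 10, 400),   # setting A
                         rng.normal(320, 15, 400)])  # setting B
S = stress.reshape(-1, 1)

# Fit mixtures with 1-4 components and let BIC pick the cluster count.
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(S).bic(S)
        for k in (1, 2, 3, 4)}
best_k = min(bics, key=bics.get)
print(best_k)
```

BIC recovers the two stress regimes here; the mixture-of-regressions extension mentioned in the abstract would additionally regress stress on the process parameters within each recovered group.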

    Improving the Stability of Red Blood Cells in Rainbow Trout (Oncorhynchus mykiss) and Herring (Clupea harengus): Potential Solutions for Post-mortem Fish Handling to Minimize Lipid Oxidation

    This study aimed at limiting hemolysis of fish red blood cells (RBCs) as a strategy to limit hemoglobin (Hb)-induced lipid oxidation during post-mortem handling and processing. Effects of varying temperature, salinity, and mechanical impact were studied using washed resuspended RBCs (wr-RBCs) and whole blood (WB) from rainbow trout (Oncorhynchus mykiss) and herring (Clupea harengus). The wr-RBCs were most stable when mechanical stress was avoided, isotonic conditions (0.9–1.3% NaCl) were maintained, and temperature was kept low (0–6 °C), with a predicted minimum at 2.5 °C. When compared at the same salinity, hemolysis was more pronounced in herring than in trout wr-RBCs. Furthermore, WB was more stable than wr-RBCs, showing the protective effects of blood plasma. Among individual plasma components, stabilizing effects were found for glucose, proteins, and ascorbic acid. This study indicates that small adjustments in the early handling and processing of fish, such as changing the salinity of storage and rinsing solutions, could minimize Hb contamination of the fish muscle and thereby improve quality.

    Exploring how plasma- and muscle-related parameters affect trout hemolysis as a route to prevent hemoglobin-mediated lipid oxidation of fish muscle

    Hemoglobin (Hb) is a powerful promoter of lipid oxidation, particularly in the muscle of small pelagic fish species and in fish by-products, both of which have high Hb levels and highly unsaturated lipids. As Hb is located within the red blood cells (RBCs), it is hypothesized here that the perishable polyunsaturated fatty acids (PUFAs) can be protected from oxidation by limiting hemolysis during early fish processing. Using a model system consisting of washed-resuspended trout (Oncorhynchus mykiss) RBCs (wr-RBCs), the aim of this study was to evaluate how RBC lysis under cold storage was affected by selected parameters linked to blood or muscle: bacterial growth, energy status, pH, RBC membrane lipid oxidation, and colloidal osmotic pressure (COP). The results indicated that bacterial growth had a modest effect on hemolysis, while pH values typical of post-mortem fish muscle (6.4–6.8) and the absence of glucose or albumin stimulated hemolysis. The rapid hemolysis observed at pH 6.4–6.8 correlated with lipid oxidation of the RBC membrane, while the lower hemolysis at pH 7.2–8.0 occurred with little or no RBC membrane lipid oxidation. When hemin was added to the RBCs at pH 6.8, hemolysis was induced without parallel RBC membrane oxidation, pointing to Hb autoxidation and hemin release per se as important events triggering lysis in fish muscle. Altogether, the study provides valuable findings that can ultimately aid the development of new tools to combat lipid oxidation in post-mortem fish muscle by limiting hemolysis.

    Rethinking ‘Responsibility’ in Precision Agriculture Innovation: Lessons from an Interdisciplinary Research Team

    We examine the interactions, decisions, and evaluations of an interdisciplinary team of researchers tasked with developing an artificial intelligence-based agricultural decision support system that can provide farmers site-specific information about managing nutrients on their land. We answer the following research questions: (1) How does a relational perspective help an interdisciplinary team conceptualize ‘responsibility’ in a project that develops precision agriculture (PA)? and (2) What are some lessons for a research team embarking on a similar interdisciplinary technology development project? We show that how responsible innovation (RI) is materialized in practice within an interdisciplinary research team can produce different understandings of responsibility, notions of measurement of ‘matter,’ and metrics of success. Future interdisciplinary projects should (1) create mechanisms for project members to see how power and privilege are exercised in the design of new technology and (2) harness the social sciences as a bridge between the natural sciences and engineering for organic and equitable collaborations.

    Augmenting Function for Infarction from Infection: Impella 2.5 for Ischemic Cardiogenic Shock Complicating Sepsis

    Cardiac dysfunction is a common complication of sepsis in individuals with preexisting coronary disease and portends a poor prognosis when it progresses to ischemic cardiogenic shock. In this setting, maximal medical therapy in isolation is often inadequate to maintain cardiac output for patients who are poor candidates for immediate revascularization. Furthermore, the use of vasopressors and inotropes increases myocardial demand and may lead to further injury. Percutaneous ventricular assist devices provide a viable option for management of severe shock with multiorgan failure. The Impella is one of several novel mechanical support systems that can effectively augment cardiac output while reducing myocardial demand and serve as a bridge to recovery from severe hemodynamic compromise. This case report describes the successful utilization of the Impella 2.5 in a patient with baseline profound anemia and coronary artery disease (CAD) presenting in combined distributive and cardiogenic shock associated with a type 2 myocardial infarction complicating sepsis.