
    DEVELOPMENT OF DIAGNOSTIC AND PROGNOSTIC METHODOLOGIES FOR ELECTRONIC SYSTEMS BASED ON MAHALANOBIS DISTANCE

    Diagnostic and prognostic capabilities are one aspect of the many interrelated and complementary functions in the field of Prognostics and Health Management (PHM). These capabilities are sought by industry to provide maximum operational availability of their products, maximum usage life, minimum periodic maintenance inspections, lower inventory cost, accurate tracking of part life, and no false alarms. Several challenges are associated with the development and implementation of these capabilities: the consideration of a system's dynamic behavior under various operating environments; complex system architectures in which the components that form the overall system interact with each other through feed-forward and feedback loops of instructions; the unavailability of failure precursors; unseen events; and the absence of unique mathematical techniques that can address fault and failure events in various multivariate systems. The Mahalanobis distance methodology distinguishes multivariable data groups in a multivariate system by a univariate distance measure calculated from the normalized values of performance parameters and their correlation coefficients. The Mahalanobis distance measure does not suffer from the scaling effect: a situation where the variability of one parameter masks the variability of another, which happens when the measurement ranges or scales of two parameters differ. A literature review showed that the Mahalanobis distance has been used for classification purposes. In this thesis, the Mahalanobis distance measure is utilized for fault detection, fault isolation, degradation identification, and prognostics. For fault detection, a probabilistic approach is developed to establish a threshold Mahalanobis distance, such that the presence of a fault in a product can be identified and the product can be classified as healthy or unhealthy.
A technique is presented to construct a control chart for the Mahalanobis distance for detecting trends and bias in system health or performance. An error function is defined to establish a fault-specific threshold Mahalanobis distance. A fault isolation approach is developed to isolate faults by identifying the parameters associated with each fault. This approach utilizes the design-of-experiment concept for calculating a residual Mahalanobis distance for each parameter (i.e., the contribution of each parameter to a system's health determination). An expected contribution range for each parameter, estimated from the distribution of the residual Mahalanobis distance, is used to isolate the parameters responsible for a system's anomalous behavior. A methodology to detect degradation in a system's health using a health indicator is developed. The health indicator is defined as the weighted sum of a histogram bin's fractional contribution. The histogram's optimal bin width is determined from the number of data points in a moving window. This moving-window approach is utilized for progressive estimation of the health indicator over time. The health indicator is compared with a threshold value defined from the system's healthy data to indicate the system's health or performance degradation. A symbolic time-series-based health assessment approach is developed. Prognostic measures are defined for detecting anomalies in a product and predicting a product's time and probability of approaching a faulty condition. These measures are computed from a hidden Markov model developed from the symbolic representation of product dynamics, which is obtained by representing a Mahalanobis distance time series in symbolic form. Case studies were performed to demonstrate the capability of the proposed methodology for real-time health monitoring.
Notebook computers were exposed to a set of environmental conditions representative of the extremes of their life cycle profiles. The performance parameters were monitored in situ during the experiments, and the resulting data were used as a training dataset. The dataset was also used to identify specific parameter behavior, estimate correlation among parameters, and extract features for defining a healthy baseline. Field-returned computer data and data corresponding to artificially injected faults in computers were used as test data.
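The core fault-detection idea in the abstract above, computing a Mahalanobis distance from a healthy baseline and flagging observations beyond a threshold, can be sketched as follows. This is a minimal illustration, not the thesis's method: the empirical 99th-percentile threshold stands in for the probabilistic threshold the thesis develops, and the function name and toy data are hypothetical.

```python
import numpy as np

def mahalanobis_distance(X, baseline):
    """Mahalanobis distance of each row of X from a healthy baseline dataset."""
    mu = baseline.mean(axis=0)
    cov = np.cov(baseline, rowvar=False)
    cov_inv = np.linalg.inv(cov)
    diff = X - mu
    # Per-row quadratic form: sqrt((x - mu)^T S^{-1} (x - mu))
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))

rng = np.random.default_rng(0)
healthy = rng.normal(0, 1, size=(500, 3))          # healthy training data
md_healthy = mahalanobis_distance(healthy, healthy)
threshold = np.quantile(md_healthy, 0.99)          # empirical 99% threshold

test = np.array([[0.1, -0.2, 0.3],                 # near baseline -> healthy
                 [4.0, 4.0, 4.0]])                 # far from baseline -> faulty
md_test = mahalanobis_distance(test, healthy)
flags = md_test > threshold                        # True marks an unhealthy unit
```

Because the covariance matrix normalizes each parameter by its own variability, the distance is unaffected by the scaling effect the abstract describes.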

    Classifiers With a Reject Option for Early Time-Series Classification

    Early classification of time-series data in a dynamic environment is a challenging problem of great importance in signal processing. This paper proposes a classifier architecture with a reject option capable of online decision making without the need to wait for the entire time-series signal to be present. The main idea is to classify an odor/gas signal with acceptable accuracy as early as possible. Instead of using the posterior probability of a classifier, the proposed method uses the "agreement" of an ensemble to decide whether to accept or reject the candidate label. The introduced algorithm is applied to the bio-chemistry problem of odor classification to build a novel electronic nose called Forefront-Nose. Experimental results on a wind tunnel test-bed facility confirm the robustness of the Forefront-Nose compared to standard classifiers from both earliness and recognition perspectives.
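The agreement-based accept/reject decision described above can be illustrated with a toy sketch. Everything here is a hypothetical stand-in, not the paper's Forefront-Nose implementation: the ensemble members are simple threshold voters, and the class names and agreement threshold are invented for illustration.

```python
from collections import Counter

def classify_early(ensemble, prefix, agreement_threshold=0.8):
    """Accept the ensemble's majority label only if enough members agree on the
    signal prefix seen so far; otherwise reject and keep observing."""
    votes = Counter(member(prefix) for member in ensemble)
    label, count = votes.most_common(1)[0]
    agreement = count / len(ensemble)
    if agreement >= agreement_threshold:
        return label      # confident early decision
    return None           # reject: wait for more of the time series

# Toy ensemble: each member thresholds the mean of the observed prefix.
ensemble = [lambda s, t=t: 'gas_A' if sum(s) / len(s) > t else 'gas_B'
            for t in (0.2, 0.4, 0.6, 0.8, 1.0)]

print(classify_early(ensemble, [1.5, 1.4, 1.6]))  # strong agreement: 'gas_A'
print(classify_early(ensemble, [0.5, 0.6]))       # split vote: None (reject)
```

The reject path is what enables earliness without sacrificing accuracy: a short ambiguous prefix is deferred rather than forced into a label.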

    Federated Embedded Systems – a review of the literature in related fields

    This report is concerned with the vision of smart interconnected objects, a vision that has attracted much attention lately. The focus is on embedded, interconnected, open, and heterogeneous control systems, formally referred to as Federated Embedded Systems (FES). To place FES into context, a review of some related research directions is presented, covering such concepts as systems of systems, cyber-physical systems, ubiquitous computing, the internet of things, and multi-agent systems. Interestingly, the reviewed fields seem to overlap with each other in an increasing number of ways.

    Science : programme of study for Key Stage 4, February 2013 [draft]


    Problematising upstream technology through speculative design: the case of quantified cats and dogs

    There is growing interest in technology that quantifies aspects of our lives. This paper draws on critical practice and speculative design to explore, question and problematise the ultimate consequences of such technology, using the quantification of companion animals (pets) as a case study. We apply the concept of ‘moving upstream’ to study such technology and use a qualitative research approach in which both pet owners and animal behavioural experts were presented with, and asked to discuss, speculative designs for pet quantification applications, the designs of which were extrapolated from contemporary trends. Our findings indicate a strong desire among pet owners for technology that has little scientific justification, whilst our experts caution that the use of technology to augment human-animal communication has the potential to diminish animal welfare, undermine human-animal bonds, and create human-human conflicts. Our discussion informs wider debates regarding quantification technology.

    Vampires, Viruses and Verbalisation: Bram Stoker’s Dracula as a genealogical window into fin-de-siècle science

    This paper considers Bram Stoker’s novel Dracula, published in 1897, as a window into techno-scientific and sociocultural developments of the fin-de-siècle era, ranging from blood transfusion and virology to communication technology and brain research, but focusing on the birth of psychoanalysis in 1897, the year of publication. Stoker’s literary classic heralds a new style of scientific thinking, foreshadowing important aspects of post-1900 culture. Dracula reflects a number of scientific events which surfaced in the 1890s but evolved into major research areas that are still relevant today. Rather than seeing science and literature as separate realms, moreover, Stoker’s masterpiece encourages us to address the ways in which techno-scientific and psycho-cultural developments mutually challenge and mirror one another, so that we may use his novel to deepen our understanding of emerging research practices and vice versa (Zwart 2008, 2010). Psychoanalysis plays a double role in this: it is the research field whose genealogical constellation is being studied, but at the same time (Lacanian) psychoanalysis guides my reading strategy. Dracula, the infectious, undead Vampire, has become an archetypal cinematic icon and has attracted the attention of numerous scholars (Browning & Picart 2009). The vampire complex drew on various folkloristic and literary sources and culminated in two famous nineteenth-century literary publications: the story The Vampyre by John Polidori (published in 1819) and Stoker’s version. Most of the more than 200 vampire movies released since Nosferatu (1922) are based on the latter (Skal 1990; Browning & Picart 2009; Melton 2010; Silver & Ursini 2010). Yet, rather than on the archetypal cinematic image of the Vampire, I will focus on the various scientific ideas and instruments employed by Dracula’s antagonists to overcome the threat to civilisation he represents. Although the basic storyline is well known, I will begin with a plot summary.

    Mathematical skills in the workplace: final report to the Science Technology and Mathematics Council


    Comparing knowledge bases: on the organisation and geography of knowledge flows in the regional innovation system of Scania, southern Sweden

    This paper deals with knowledge flows and collaboration between firms in the regional innovation system of southern Sweden. It focuses on industries which draw on different types of knowledge bases. The aim is to analyse how the functional and spatial organisation of knowledge interdependencies among firms and other actors varies between different types of industries which are part of the same regional innovation system. We argue that knowledge sourcing and exchange in geographical proximity is especially important for industries that rely on a synthetic or symbolic knowledge base, since the interpretation of the knowledge they deal with tends to differ between places. This is less the case for industries drawing on an analytical knowledge base, which rely more on scientific knowledge that is codified, abstract and universal, and therefore less sensitive to geographical distance. Thus, the geographic clustering of firms in analytical industries builds on a rationale other than the need for proximity for knowledge sourcing and exchange. To analyse these assumptions empirically, we draw on data from three case studies of firm clusters in the region of southern Sweden: (1) the life science cluster represents an analytical (science-based) industry, (2) the food cluster includes mainly synthetic (engineering-based) industries, and (3) the moving media cluster is considered symbolic (artistic) based. Knowledge sourcing and knowledge exchange in each of the cases are explored and compared using social network analysis in association with a dataset gathered through interviews with firm representatives.
    Keywords: knowledge bases; life science; food cluster; moving media; Sweden