
    Development of Diagnostic and Prognostic Methodologies for Electronic Systems Based on Mahalanobis Distance

    Diagnostic and prognostic capabilities are one aspect of the many interrelated and complementary functions in the field of Prognostics and Health Management (PHM). Industries seek these capabilities to provide maximum operational availability of their products, maximum usage life, minimum periodic maintenance inspections, lower inventory cost, accurate tracking of part life, and no false alarms. Several challenges complicate the development and implementation of these capabilities: accounting for a system's dynamic behavior under various operating environments; complex system architectures in which components interact through feed-forward and feedback loops of instructions; the unavailability of failure precursors; unforeseen events; and the absence of mathematical techniques that can address fault and failure events across multivariate systems. The Mahalanobis distance methodology distinguishes multivariable data groups in a multivariate system by a univariate distance measure calculated from the normalized values of performance parameters and their correlation coefficients. The Mahalanobis distance does not suffer from the scaling effect, a situation in which the variability of one parameter masks the variability of another because the two parameters are measured on different ranges or scales. A literature review showed that the Mahalanobis distance has previously been used for classification purposes. In this thesis, the Mahalanobis distance measure is utilized for fault detection, fault isolation, degradation identification, and prognostics.

    For fault detection, a probabilistic approach is developed to establish a threshold Mahalanobis distance, such that the presence of a fault in a product can be identified and the product can be classified as healthy or unhealthy. A technique is presented to construct a control chart for the Mahalanobis distance to detect trends and bias in system health or performance. An error function is defined to establish fault-specific threshold Mahalanobis distances. A fault isolation approach is developed that isolates a fault by identifying the parameters associated with it. This approach utilizes the design-of-experiments concept to calculate a residual Mahalanobis distance for each parameter (i.e., the contribution of each parameter to the system's health determination). An expected contribution range for each parameter, estimated from the distribution of residual Mahalanobis distances, is used to isolate the parameters responsible for a system's anomalous behavior. A methodology is developed to detect degradation in a system's health using a health indicator, defined as the weighted sum of the fractional contributions of histogram bins. The histogram's optimal bin width is determined from the number of data points in a moving window, and this moving-window approach is used to estimate the health indicator progressively over time. The health indicator is compared with a threshold value, defined from the system's healthy data, to indicate degradation in the system's health or performance. A symbolic time-series-based health assessment approach is also developed. Prognostic measures are defined for detecting anomalies in a product and for predicting the time at which, and the probability with which, the product will approach a faulty condition. These measures are computed from a hidden Markov model built from a symbolic representation of the product's dynamics, obtained by expressing a Mahalanobis distance time series in symbolic form.

    Case studies were performed to demonstrate the capability of the proposed methodology for real-time health monitoring. Notebook computers were exposed to a set of environmental conditions representative of the extremes of their life cycle profiles. The performance parameters were monitored in situ during the experiments, and the resulting data were used as a training dataset. The dataset was also used to identify specific parameter behavior, estimate correlations among parameters, and extract features for defining a healthy baseline. Field-returned computer data and data corresponding to artificially injected faults in computers were used as test data.
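
    The thresholding step described above can be illustrated compactly. Below is a minimal sketch, not the thesis's procedure: it computes the Mahalanobis distance of observations against a baseline learned from healthy training data and flags a new observation using a percentile-based threshold. The synthetic data, the 99th-percentile cutoff, and all names are illustrative assumptions.

```python
import numpy as np

def mahalanobis_distance(x, mean, inv_cov):
    """Mahalanobis distance of observation x from the healthy baseline."""
    d = x - mean
    return np.sqrt(d @ inv_cov @ d)

# Illustrative healthy baseline: 500 observations of 4 performance parameters.
rng = np.random.default_rng(0)
healthy_data = rng.normal(size=(500, 4))
mean = healthy_data.mean(axis=0)
inv_cov = np.linalg.inv(np.cov(healthy_data, rowvar=False))

# Hypothetical probabilistic threshold: 99th percentile of healthy distances.
md_healthy = np.array([mahalanobis_distance(x, mean, inv_cov) for x in healthy_data])
threshold = np.percentile(md_healthy, 99)

# Classify a simulated faulty observation (mean-shifted parameters).
new_obs = rng.normal(loc=3.0, size=4)
md_new = mahalanobis_distance(new_obs, mean, inv_cov)
print("unhealthy" if md_new > threshold else "healthy")
```

    Computing the distance from z-normalized parameters and their correlation matrix, as the abstract describes, is equivalent to the covariance form used in this sketch.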

    Earth Resources: A continuing bibliography with indexes, issue 15, October 1977

    This bibliography lists 387 reports, articles, and other documents introduced into the NASA scientific and technical information system between July 1 and September 30, 1977. Emphasis is placed on the use of remote sensing and geophysical instrumentation in spacecraft and aircraft to survey and inventory natural resources and urban areas. Subject matter is grouped according to agriculture and forestry, environmental changes and cultural resources, geodesy and cartography, geology and mineral resources, hydrology and water management, data processing and distribution systems, instrumentation and sensors, and economic analysis.

    Analysis of microtopography, vegetation, and active-layer thickness using terrestrial LIDAR and kite photography, Barrow, AK

    Arctic regions underlain by permafrost are among the most vulnerable to impacts from climate change. This study examined changes in the active layer of permafrost near Barrow, Alaska, at very fine scale to capture subtle changes related to microtopography and land cover. In 2010, terrestrial LIDAR was used to collect high-resolution elevation data for four 10 m × 10 m plots where maximum active-layer thickness (ALT) and elevation have been monitored annually since the mid-1990s and had also been monitored in the 1960s. The raw LIDAR point cloud was analyzed and processed into four 10 cm resolution digital elevation models (DEMs). Elevation data, collected annually since 2004 using a differential global positioning system (DGPS) to assess heave and subsidence, were used to assess the accuracy of the DEMs generated for August 2010. The higher-resolution DEMs were not more accurate with respect to the DGPS control points, owing to artifacts inherent in the LIDAR data. The four DEMs were used to classify each plot based on microtopographic variations derived from terrain attributes including elevation, slope, and Melton's Ruggedness Number (MRN). Land cover at each plot was classified using the Visible Vegetation Index (VVI), calculated from a series of high-resolution (~10 cm) kite photographs obtained in August 2012 by researchers from the University of Texas at El Paso. The microtopography and land-cover classifications were then used to analyze ALT and elevation data from a range of years. The analysis revealed little difference in either dataset attributable to microtopography or land cover, and the high interclass and interannual variation made it difficult to draw conclusions about temporal trends. The results suggest that while microtopography and vegetation are important factors within the complex interaction that determines ALT, the scale of analysis made possible by the high-resolution data used in this study did not significantly enhance understanding of the main controlling mechanisms. While terrestrial LIDAR is excellent for many applications, particularly those with substantial vertical variability, airborne LIDAR may be more suitable for future research at this scale on relatively flat topography.
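
    For concreteness, here is a minimal sketch of one terrain attribute named above, Melton's Ruggedness Number, computed from a gridded DEM. The function name, synthetic DEM, and cell size are illustrative assumptions, not the study's code.

```python
import numpy as np

def melton_ruggedness(dem, cell_size):
    """MRN = (max elevation - min elevation) / sqrt(plot area)."""
    relief = np.nanmax(dem) - np.nanmin(dem)
    area = dem.size * cell_size ** 2  # plot area in square meters
    return relief / np.sqrt(area)

# Example: a synthetic 10 m x 10 m plot at 10 cm resolution (100 x 100 cells),
# roughly matching the plot and DEM dimensions described in the abstract.
rng = np.random.default_rng(1)
dem = rng.normal(loc=5.0, scale=0.1, size=(100, 100))  # elevations in meters
print(melton_ruggedness(dem, cell_size=0.1))
```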

    Time series motif discovery

    MAP-i Doctoral Program in Computer Science. Time series data are produced daily in massive quantities in virtually every field, and most are stored in time series databases. Finding previously unknown, repeated patterns in these databases is an important problem. Such patterns, also known as motifs, provide useful insight to the domain expert and summarize the database; they have been used in areas as diverse as finance and medicine. Although many algorithms exist for the task, they typically do not scale and require setting several parameters. We propose a novel algorithm that runs in linear time, is space efficient, and needs only one parameter. It fully exploits the state-of-the-art time series representation technique SAX (Symbolic Aggregate Approximation) to extract motifs at several resolutions. This property allows the algorithm to skip the expensive distance calculations typically employed by other algorithms. We also propose an approach to calculate the statistical significance of time series motifs. Although there are many approaches in the literature for finding time series motifs efficiently, surprisingly none calculates a motif's statistical significance. Our proposal leverages work from the bioinformatics community by using a symbolic definition of time series motifs to derive each motif's p-value. We estimate the expected frequency of a motif using Markov chain models; the p-value is then assessed by comparing the actual frequency to the estimated one using statistical hypothesis tests. Our contribution brings a powerful technique, statistical testing, to the time series setting, giving researchers and practitioners an important tool for automatically evaluating the relevance of each extracted motif. Finally, we propose an approach to automatically derive the parameters of the indexable Symbolic Aggregate Approximation (iSAX) time series representation. This technique is widely used in time series data mining; its popularity arises from the fact that it is symbolic, reduces the dimensionality of the series, allows lower bounding, and is space efficient. However, the need to set the symbolic word length and alphabet size limits the applicability of the representation, since the best parameter setting is highly application dependent. Typically, these parameters are either set to a fixed value (e.g., 8) or experimentally probed for the best configuration. The proposed technique, referred to as AutoiSAX, not only discovers the best parameter setting for each time series in the database but also finds the alphabet size for each iSAX symbol within the same word. It is based on the simple and intuitive ideas of time series complexity and standard deviation, and it can be smoothly embedded in existing data mining tasks as an efficient subroutine. We analyze the impact of using AutoiSAX on visualization interpretability, classification accuracy, and motif mining results. Our contribution aims to make iSAX a more general approach as it evolves towards a parameter-free method. Fundação para a Ciência e Tecnologia (FCT) - SFRH / BD / 33303 / 200
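
    Since both the motif algorithm and AutoiSAX build on SAX, a minimal sketch of the standard SAX symbolization (following Lin et al.'s definition, not the thesis's code) may help: z-normalize the series, reduce it by piecewise aggregate approximation (PAA), and map segment means to letters via equiprobable Gaussian breakpoints. Parameter values and names are illustrative.

```python
import numpy as np
from scipy.stats import norm

def sax(series, word_length, alphabet_size):
    """Return the SAX word for a numeric time series."""
    x = (series - series.mean()) / series.std()        # z-normalize
    segments = np.array_split(x, word_length)          # PAA segments
    paa = np.array([seg.mean() for seg in segments])   # segment means
    # Breakpoints cutting N(0, 1) into equiprobable regions.
    breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
    symbols = np.searchsorted(breakpoints, paa)
    return "".join(chr(ord("a") + int(s)) for s in symbols)

rng = np.random.default_rng(2)
print(sax(rng.normal(size=128), word_length=8, alphabet_size=4))
```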

    Exploratory visualization of temporal geospatial data using animation


    Science-based restoration monitoring of coastal habitats, Volume Two: Tools for monitoring coastal habitats

    Healthy coastal habitats are not only important ecologically; they also support healthy coastal communities and improve the quality of people’s lives. Despite their many benefits and values, coastal habitats have been systematically modified, degraded, and destroyed throughout the United States and its protectorates, beginning with European colonization in the 1600s (Dahl 1990). As a result, many coastal habitats around the United States are in desperate need of restoration. The monitoring of restoration projects, the focus of this document, is necessary to ensure that restoration efforts are successful, to further the science, and to increase the efficiency of future restoration efforts.

    A photogeological study of fracture trace patterns using data processing techniques


    Numerical modeling of thermal bar and stratification pattern in Lake Ontario using the EFDC model

    The thermal bar is an important phenomenon in large temperate lakes such as Lake Ontario. Spring thermal bar formation reduces horizontal mixing, which in turn inhibits the exchange of nutrients. The evolution of the spring thermal bar through Lake Ontario is simulated using the 3D hydrodynamic model Environmental Fluid Dynamics Code (EFDC). The model is forced with hourly meteorological data from weather stations around the lake, flow data for the Niagara and St. Lawrence rivers, and lake bathymetry. The simulation is performed from April to July 2011 on a 2-km grid. The numerical model has been calibrated by specifying appropriate initial temperatures and solar radiation attenuation coefficients. The existing evaporation algorithm in EFDC is updated to a modified mass-transfer approach to ensure correct simulation of the evaporation rate and latent heat flux. Reasonable values for the mixing coefficients are specified based on sensitivity analyses. The model simulates overall surface temperature profiles well (RMSEs between 1 and 2°C). The vertical temperature profiles during the lake's mixed phase are captured well (RMSEs < 0.5°C), indicating that the model adequately replicates the thermal bar evolution process. An update of the vertical mixing coefficients is under investigation to improve the simulated summer thermal stratification pattern.
    Keywords: hydrodynamics, thermal bar, Lake Ontario, GIS
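
    As a rough illustration of the mass-transfer idea mentioned above, the sketch below estimates the latent heat flux from a lake surface with a generic bulk aerodynamic formula. The coefficients and function names are textbook-style assumptions, not the calibrated EFDC formulation.

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Magnus approximation; returns pressure in Pa."""
    return 610.94 * math.exp(17.625 * t_celsius / (t_celsius + 243.04))

def latent_heat_flux(t_water, t_air, rel_humidity, wind_speed):
    """Bulk mass-transfer estimate, W/m^2: LE = rho_a * Lv * Ce * U * (qs - qa)."""
    rho_air = 1.2    # kg/m^3, near-surface air density (assumed)
    lv = 2.45e6      # J/kg, latent heat of vaporization
    ce = 1.3e-3      # bulk transfer coefficient, a typical open-water value
    p = 101325.0     # Pa, surface air pressure (assumed)
    qs = 0.622 * saturation_vapor_pressure(t_water) / p              # saturated at surface
    qa = 0.622 * rel_humidity * saturation_vapor_pressure(t_air) / p  # ambient air
    return rho_air * lv * ce * wind_speed * (qs - qa)

# Example: cool spring lake surface under warmer, moist air.
print(latent_heat_flux(t_water=10.0, t_air=15.0, rel_humidity=0.7, wind_speed=5.0))
```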

    Assessing the Determinants Facilitating Local Vulnerabilities and Adaptive Capacities to Climate Change Impacts in High Mountain Areas: A Case Study of Northern Ladakh, India

    Climate change is increasingly redefining the dialectic exchange between human systems and ecological processes. While the rhetoric of climate change is articulated within broad arenas of governance and policy, the realities of climate change are experienced at the local scale. Effective adaptation measures must therefore be commensurate with local resources, needs, and objectives while remaining aligned with larger decision-making efforts. The impacts of climate change are heterogeneous and vary with geographic context: biophysical parameters interface with socioeconomic and political forces to greatly influence the outcome of climate-related risks at the local level. In the high mountains of the western Himalayas, for example, climate change is tangibly influencing precipitation patterns, glacial movement, and the occurrence of extreme weather events. Rather than acting in isolation, these adverse effects exacerbate ongoing stresses related to chronic development and demographic issues. Assessing the nature of biophysical and social vulnerability to climate change, and the initial conditions that expose some groups of people to climate change impacts more than others, can correspondingly aid in identifying entry points for adaptation and response planning. This research draws from theoretical traditions within geography, political ecology, natural hazards and risk management, and climate ethics to assess the multi-scalar factors that aggregate at the local level to shape climate change outcomes. This conceptual background directly informs a mixed-methods design that integrates surveys, climate trend modeling, and geospatial mapping to evaluate how climate change is unfolding on the ground and influencing local engagement with climate change response. In doing so, the key climatic and non-climatic drivers propelling initial conditions of vulnerability are identified, as are the determinants facilitating opportunities for adaptation. Research findings suggest that the availability of and access to future water resources will work in tandem with transformations in the wider political economy to significantly determine the long-term ability of many impacted mountain communities to live and thrive. Traditional assumptions of vulnerability are challenged, and the need to consider cultural frameworks of social resilience, sense of place, and community cohesion is advanced.