
    Degradation-based reliability in outdoor environments

    Traditionally, the field of reliability has been concerned with failure-time data. As a result, degradation-based reliability methods are not well developed, especially for the analysis of degradation data arising from highly variable environments. This dissertation, comprising three papers, proposes two simulation-based methods to estimate reliability metrics for materials or products that degrade from exposure to outdoor weather. In the first paper, time series modeling is used to estimate the probability distribution of cumulative degradation after x years and the probability distribution of failure time. A procedure to construct approximate confidence intervals for the metrics of interest is also given. The second paper extends the first to the case where there is additional uncertainty due to unit-to-unit variability. It discusses the reliability quantities of interest induced by the presence of two sources of variability and techniques to estimate them. Bayesian methods are used to estimate the distribution of the population of units, and an approximation technique to overcome computational difficulties is described. The third paper uses a model-free block bootstrap scheme to estimate reliability quantities from periodic data; the degradation data have a periodic structure due to the seasonality of the outdoor environment. Because the choice of block size is an important issue in implementing a block bootstrap scheme, the paper also proposes two methods for choosing it. Finally, the results from time series modeling and from the block bootstrap are compared.
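    A block bootstrap of the kind mentioned above resamples contiguous blocks of the series so that within-block dependence is preserved. A minimal sketch of a moving-block bootstrap for the mean, with illustrative names and without the seasonality-aware block handling the dissertation develops:

    ```python
    import numpy as np

    def block_bootstrap(series, block_len, n_boot, rng=None):
        """Moving-block bootstrap replicates of the sample mean.

        Illustrative sketch only: the dissertation's scheme for periodic
        degradation data may align blocks with the seasonal period.
        """
        rng = np.random.default_rng() if rng is None else rng
        n = len(series)
        n_blocks = int(np.ceil(n / block_len))
        reps = np.empty(n_boot)
        for b in range(n_boot):
            # Draw random block start positions, concatenate the blocks,
            # and truncate back to the original series length.
            starts = rng.integers(0, n - block_len + 1, size=n_blocks)
            sample = np.concatenate(
                [series[s:s + block_len] for s in starts])[:n]
            reps[b] = sample.mean()
        return reps
    ```

    The empirical distribution of `reps` then approximates the sampling distribution of the mean; quantiles of the centred replicates give approximate confidence intervals.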

    Consistency of the Frequency Domain Bootstrap for differentiable functionals

    In this paper, consistency of the Frequency Domain Bootstrap (FDB) for differentiable functionals of the spectral density function of a linear stationary time series is discussed. The notion of an influence function on spectral measures in the time domain is introduced. Moreover, Fréchet differentiability of functionals of spectral measures is defined. Sufficient and necessary conditions for consistency of the FDB in the considered problems are provided, and second-order correctness is discussed for some functionals. Finally, validity of the FDB for empirical processes is considered.
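    One common form of the Frequency Domain Bootstrap resamples studentised periodogram ordinates: the periodogram is divided by a smoothed spectral estimate, the approximately i.i.d. ratios are resampled, and bootstrap periodograms are rebuilt. This generic sketch is not taken from the paper; the kernel and bandwidth choices are illustrative:

    ```python
    import numpy as np

    def frequency_domain_bootstrap(x, n_boot, rng=None):
        """Bootstrap replicates of the periodogram via the FDB.

        Generic textbook form with an illustrative moving-average
        spectral estimate; not the paper's exact construction.
        """
        rng = np.random.default_rng() if rng is None else rng
        n = len(x)
        # Periodogram at positive Fourier frequencies.
        dft = np.fft.rfft(x - x.mean())[1:]
        per = (np.abs(dft) ** 2) / (2 * np.pi * n)
        # Crude kernel estimate of the spectral density: a moving
        # average with reflection at the boundaries (bandwidth k).
        k = 5
        pad = np.r_[per[k - 1::-1], per, per[:-k - 1:-1]]
        f_hat = np.convolve(pad, np.ones(2 * k + 1) / (2 * k + 1), "valid")
        # Studentised ordinates are approximately i.i.d.; resample them.
        eps = per / f_hat
        reps = np.empty((n_boot, len(per)))
        for b in range(n_boot):
            reps[b] = f_hat * rng.choice(eps, size=len(per), replace=True)
        return reps
    ```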

    Estimation of the parameters for non-stationary time series with long memory and heavy tails using weak dependence condition

    Statistical inference for the unknown distributions of statistics or estimators can be based on asymptotic distributions. Unfortunately, for dependent data such statistical procedures are often ineffective, for various reasons: too few observations, an unknown form of the asymptotic distribution, or too slow convergence to it. Since the early 1980s, intensive research has been devoted to the development of so-called resampling methods, which make it possible to approximate the unknown distributions of statistics and estimators directly. The idea of resampling is simple: we compute replications of the estimator and from these replications determine an empirical distribution, the so-called resampling distribution. The problem that must be faced when studying resampling procedures is their consistency, i.e., whether the resampling distribution is close to the true distribution. There are many resampling methods. Their consistency in the case of independent observations has been studied in depth. The case of stationary data with a particular dependence structure, namely strong mixing, has also been examined, and resampling for non-stationary time series with a specific form of non-stationarity, namely periodic and almost periodic series, has likewise been the subject of intensive research. Recent work on resampling methods concentrates mainly on time series with weak dependence as defined by Paul Doukhan. This thesis presents a model for time series with very specific properties: long memory, heavy tails (stable or GED), and a periodic structure. Such a model has natural applications in many fields, for example power engineering, vibromechanics, telecommunications, climatology, and economics. The aim of the thesis is to prove consistency theorems for an estimator of one of the resampling methods for the mean function of the time series mentioned above.
It turns out that the only resampling method applicable to data with long memory is subsampling. It consists in selecting from the observations all possible subsequences of a certain length and evaluating the estimator on those subsequences. The thesis formulates and proves the central limit theorems needed to establish the consistency of subsampling. In addition, an overview of existing results on resampling methods for time series is given.
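    Subsampling evaluates the estimator on every contiguous subsequence of a fixed length; the empirical distribution of those values approximates the sampling distribution of the statistic. A minimal sketch (the centring and normalising-rate conventions needed for long-memory data are omitted; names are illustrative):

    ```python
    import numpy as np

    def subsample_distribution(series, b, estimator=np.mean):
        """Evaluate `estimator` on every contiguous subsequence of
        length b; the returned values form the subsampling distribution.

        For long-memory data the values must additionally be centred
        and scaled by an appropriate normalising rate, which depends on
        the memory parameter and is not handled here.
        """
        n = len(series)
        return np.array(
            [estimator(series[i:i + b]) for i in range(n - b + 1)])
    ```

    Quantiles of the (centred, normalised) subsampling distribution then yield approximate confidence intervals for the mean function.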

    Reliability Analysis By Considering Steel Physical Properties

    Most customers today are pursuing engineering materials (e.g., steel) that not only achieve their expected functions but also are highly reliable. As a result, reliability analysis of materials has been receiving increasing attention over the past few decades. Most existing studies in the reliability engineering field focus on developing model-based and data-driven approaches that analyze material reliability from failure data such as lifetime data and degradation data, without considering the effects of material physical properties. Ignoring such effects may result in a biased estimate of material reliability, which in turn could incur higher operation or maintenance costs. Recently, with the advancement of sensor technology, more information and data concerning various physical properties of materials have become accessible to reliability researchers. In this dissertation, considering the significant impacts of steel physical properties on steel failures, we propose systematic methodologies for steel reliability analysis that integrate a set of steel physical properties. Specifically, three steel properties of various scales are considered: 1) a macro-scale property called overload retardation; 2) a local-scale property called dynamic local deformation; and 3) a micro-scale property called microstructure effect. To incorporate property 1), a novel physical-statistical model is proposed based on a modification of the current Paris law. To incorporate property 2), a novel statistical model named the multivariate general path model is proposed, which generalizes an existing univariate general path model. To integrate property 3), a novel statistical model named the distribution-based functional linear model is proposed, which generalizes an existing functional linear model. Theoretical property analyses and statistical inference for these three models are developed in detail.
Various simulation studies are implemented to verify and illustrate the proposed methodologies. Multiple physical experiments are designed and conducted to demonstrate the proposed models. The results show that, through the integration of the aforementioned three steel physical properties, a significant improvement in steel reliability assessment is achieved in terms of failure prediction accuracy compared to traditional reliability studies.
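    For context, the unmodified Paris law referred to above relates the fatigue crack growth rate to the stress intensity factor range: da/dN = C(ΔK)^m with ΔK = YΔσ√(πa). A minimal numerical integration of this textbook form follows; the dissertation's physical-statistical modification for overload retardation is not reproduced, and all parameter values in the usage are illustrative:

    ```python
    import math

    def paris_crack_growth(a0, a_crit, C, m, dsigma, Y=1.0, da=1e-5):
        """Estimate cycles to grow a crack from a0 to a_crit under the
        classical Paris law da/dN = C * (dK)**m, dK = Y*dsigma*sqrt(pi*a).

        Simple forward stepping in crack length; textbook form only,
        without any overload-retardation correction.
        """
        a, N = a0, 0.0
        while a < a_crit:
            dK = Y * dsigma * math.sqrt(math.pi * a)
            dadN = C * dK ** m          # growth per cycle at current a
            N += da / dadN              # cycles spent growing by da
            a += da
        return N
    ```

    For m ≠ 2 this can be checked against the closed-form integral of the law, which the stepping scheme approaches as the step size shrinks.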

    Person re-Identification over distributed spaces and time

    Replicating the human visual system and the cognitive abilities the brain uses to process the information it receives is an area of substantial scientific interest. With the prevalence of video surveillance cameras, part of this scientific drive has gone into providing useful automated counterparts to human operators. A prominent task in visual surveillance is that of matching people between disjoint camera views, or re-identification. This allows operators to locate people of interest and to track people across cameras, and it can serve as a precursory step to multi-camera activity analysis. However, due to the contrasting conditions between camera views and their effects on the appearance of people, re-identification is a non-trivial task. This thesis proposes solutions for reducing the visual ambiguity in observations of people between camera views. The thesis first looks at a method for mitigating the effects of differing lighting conditions between camera views on the appearance of people, building on work that models inter-camera illumination from known pairs of images. A Cumulative Brightness Transfer Function (CBTF) is proposed to estimate the mapping of colour brightness values from limited training samples. Unlike previous methods that use a mean-based representation of a set of training samples, the cumulative nature of the CBTF retains colour information from underrepresented samples in the training set. Additionally, the bi-directionality of the mapping function is explored to try to maximise re-identification accuracy by ensuring samples are accurately mapped between cameras. Secondly, an extension to the CBTF framework is proposed that addresses the issue of changing lighting conditions within a single camera. As the CBTF requires manually labelled training samples, it is limited to static lighting conditions and is less effective if the lighting changes.
    This Adaptive CBTF (A-CBTF) differs from previous approaches that either do not consider lighting change over time or rely on camera transition time information to update. By utilising contextual information drawn from the background in each camera view, an estimate of the lighting change within a single camera can be made. This background lighting model allows colour information to be mapped back to the original training conditions, removing the need for retraining. Thirdly, a novel reformulation of re-identification as a ranking problem is proposed. Previous methods use a score based on a direct distance measure of set features to form a correct/incorrect match result. Rather than offering an operator a single outcome, the ranking paradigm gives the operator a ranked list of possible matches and allows them to make the final decision. By utilising a Support Vector Machine (SVM) ranking method, a weighting on the appearance features can be learned that capitalises on the fact that not all image features are equally important to re-identification. Additionally, an Ensemble-RankSVM is proposed to address scalability issues by separating the training samples into smaller subsets and boosting the trained models. Finally, the thesis looks at a practical application of the ranking paradigm in a real-world setting. The system encompasses both the re-identification stage and the precursory extraction and tracking stages to form an aid for CCTV operators. Segmentation and detection are combined to extract relevant information from the video, while several matching techniques are combined with temporal priors to form a more comprehensive overall matching criterion. The effectiveness of the proposed approaches is tested on datasets obtained from a variety of challenging environments, including offices, apartment buildings, airports and outdoor public spaces.
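    A brightness transfer function of the cumulative kind described above can be estimated by matching cumulative histograms accumulated over all training samples, so that underrepresented samples still contribute mass rather than being averaged away. A simplified single-channel sketch; per-channel handling and normalisation details are illustrative, not taken from the thesis:

    ```python
    import numpy as np

    def cbtf(samples_a, samples_b, levels=256):
        """Map brightness values from camera A to camera B by matching
        cumulative histograms accumulated over all training samples.

        Returns an array `mapping` where mapping[v] is the camera-B
        brightness corresponding to camera-A brightness v.
        """
        hist_a = np.zeros(levels)
        hist_b = np.zeros(levels)
        # Accumulate (rather than average) histograms over all samples,
        # preserving the contribution of underrepresented samples.
        for img in samples_a:
            hist_a += np.bincount(np.ravel(img), minlength=levels)[:levels]
        for img in samples_b:
            hist_b += np.bincount(np.ravel(img), minlength=levels)[:levels]
        cum_a = np.cumsum(hist_a) / hist_a.sum()
        cum_b = np.cumsum(hist_b) / hist_b.sum()
        # For each A value, pick the B value of equal cumulative frequency.
        return np.searchsorted(cum_b, cum_a, side="left").clip(0, levels - 1)
    ```

    Swapping the two argument sets gives the reverse mapping, which is what makes exploring the bi-directionality of the function straightforward.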

    Reception performance studies for the evaluation and improvement of the new generation terrestrial television systems

    The TV industry has undergone major changes in recent decades. Ever-increasing viewer expectations and the reduction of the spectrum available for TV services have created the need for more robust Digital Terrestrial Television (DTT) systems. The first attempt to meet these requirements is the European standard DVB-T2 (2009). The publication of a new standard marks the start of a performance evaluation process, for example through coverage studies or the determination of signal-to-noise ratio (SNR) thresholds. At the start of this thesis, that process was almost complete for fixed and mobile reception; indoor reception, however, had not been studied in detail. For this reason, this thesis completes the indoor evaluation of DVB-T2 and defines a new evaluation methodology optimized for that scenario. Although DVB-T2 employs very advanced technologies, the system was defined almost ten years ago, and new advanced techniques have appeared since then, such as new error-correction codes and Layered Division Multiplexing (LDM). These techniques had not been evaluated in indoor environments either, so this thesis analyses them and assesses their suitability for improving DVB-T2 performance. Moreover, it has been found that traditional DTT receiver algorithms are not optimized for the new scenarios involving multilayer signals and mobile reception, so new algorithms are proposed to improve reception in such situations. The latest attempt to meet today's demanding DTT requirements is the American standard ATSC 3.0 (2016). As with DVB-T2, a complete system evaluation process is needed.
To that end, this thesis presents simulations and laboratory tests that complete the performance study of ATSC 3.0 in different scenarios.
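    Layered Division Multiplexing, one of the techniques mentioned above, superimposes an upper-layer (core) and a lower-layer (enhanced) signal at a fixed power offset known as the injection level. A minimal sketch of the combining step in its textbook form; parameter names are illustrative, not from the thesis:

    ```python
    import numpy as np

    def ldm_combine(core, enhanced, injection_db):
        """Superimpose two unit-power baseband signals LDM-style.

        The lower (enhanced) layer is injected `injection_db` dB below
        the upper (core) layer, and the sum is renormalised to unit
        total power, as an LDM transmitter does before modulation.
        """
        g = 10 ** (-injection_db / 20)         # lower-layer amplitude gain
        combined = core + g * enhanced
        return combined / np.sqrt(1 + g ** 2)  # normalise total power
    ```

    A receiver first decodes the robust core layer, then cancels it to recover the enhanced layer, which is why the injection level directly trades off the SNR thresholds of the two services.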

    Environmental Pollution and Chronic Disease Management – A Prognostics Approach

    No abstract available