
    Assessing Software Reliability Using Modified Genetic Algorithm: Inflection S-Shaped Model

    In order to assess software reliability, many software reliability growth models (SRGMs) have been proposed over the past four decades. In principle, the two most widely used methods for parameter estimation of SRGMs are maximum likelihood estimation (MLE) and least squares estimation (LSE). However, both estimation approaches may impose restrictions on SRGMs, such as requiring derivatives of the formulated models or involving complex calculations. In this paper, we propose a modified genetic algorithm (MGA) to assess software reliability from time-domain software failure data using the Inflection S-shaped model, which is based on a Nonhomogeneous Poisson Process (NHPP). Experiments based on real software failure data are performed, and the results show that the proposed genetic algorithm is more effective and faster than traditional algorithms.
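
    To make the estimation step concrete, the sketch below fits the Inflection S-shaped mean value function m(t) = a(1 - e^(-bt)) / (1 + beta*e^(-bt)) to cumulative failure counts with a plain genetic algorithm. This is not the paper's MGA; the failure data, population size, bounds, and mutation settings are all invented for illustration.

```python
# Hedged sketch: fitting the Inflection S-shaped NHPP model with a simple GA.
# The cumulative failure counts below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)

t = np.arange(1, 21, dtype=float)                    # test weeks (synthetic)
y = np.array([ 2,  5, 10, 17, 26, 37, 49, 60, 70, 78,
              84, 89, 92, 95, 97, 98, 99, 100, 100, 101], dtype=float)

def m(t, a, b, beta):
    """Inflection S-shaped mean value function."""
    e = np.exp(-b * t)
    return a * (1.0 - e) / (1.0 + beta * e)

def sse(params):
    a, b, beta = params
    return np.sum((y - m(t, a, b, beta)) ** 2)

# Parameter bounds: a (total faults), b (detection rate), beta (inflection factor)
lo = np.array([y[-1], 1e-3, 1e-3])
hi = np.array([5 * y[-1], 2.0, 50.0])

pop_size, n_gen, mut_rate = 60, 300, 0.2
pop = rng.uniform(lo, hi, size=(pop_size, 3))

for _ in range(n_gen):
    fit = np.array([sse(p) for p in pop])
    elite = pop[np.argsort(fit)[: pop_size // 2]]    # keep the better half
    # Blend crossover between random elite parents
    pa = elite[rng.integers(len(elite), size=pop_size // 2)]
    pb = elite[rng.integers(len(elite), size=pop_size // 2)]
    w = rng.uniform(size=(pop_size // 2, 1))
    children = w * pa + (1 - w) * pb
    # Gaussian mutation, clipped to the bounds
    mask = rng.uniform(size=children.shape) < mut_rate
    children = np.clip(children + mask * rng.normal(0, 0.05, children.shape) * (hi - lo), lo, hi)
    pop = np.vstack([elite, children])

best = pop[np.argmin([sse(p) for p in pop])]
print("a=%.1f  b=%.3f  beta=%.2f  SSE=%.2f" % (*best, sse(best)))
```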

    A Comparative Analysis of Software Reliability Growth Models using defects data of Closed and Open Source Software

    The purpose of this study is to compare the fitting (goodness of fit) and prediction capability of eight Software Reliability Growth Models (SRGMs) using fifty different failure data sets. These data sets contain defect data collected from the system test phase, the operational phase (field defects), and Open Source Software (OSS) projects. The failure data are modelled by eight SRGMs (Musa Okumoto, Inflection S-Shaped, Goel Okumoto, Delayed S-Shaped, Logistic, Gompertz, Yamada Exponential, and Generalized Goel). These models are chosen because of their prevalence among software reliability models. The results can be summarized as follows:
    - Fitting capability: Musa Okumoto fits all data sets, while all models fit all the OSS data sets.
    - Prediction capability: Musa Okumoto, Inflection S-Shaped, and Goel Okumoto are the best predictors for industrial data sets; Gompertz and Yamada Exponential are the best predictors for OSS data sets.
    - Fitting and prediction capability: Musa Okumoto and Inflection S-Shaped are the best performers on industrial data sets, although only on slightly more than 50% of them; Gompertz and Inflection S-Shaped are the best performers on all OSS data sets.
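
    As a minimal illustration of the kind of goodness-of-fit comparison described above, the sketch below fits two of the listed models (Musa Okumoto, m(t) = a*ln(1 + b*t), and Goel Okumoto, m(t) = a*(1 - e^(-bt))) to one cumulative-defect series by nonlinear least squares and reports R^2. The defect counts and starting values are synthetic placeholders, not one of the study's fifty data sets.

```python
# Hedged sketch: comparing the fit of two SRGMs on synthetic defect data.
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(1, 16, dtype=float)                        # weeks of testing
y = np.array([4, 9, 15, 22, 28, 33, 37, 41, 44, 46,
              48, 49, 50, 51, 51], dtype=float)           # cumulative defects

def musa_okumoto(t, a, b):
    return a * np.log(1.0 + b * t)

def goel_okumoto(t, a, b):
    return a * (1.0 - np.exp(-b * t))

def r_squared(y_obs, y_fit):
    ss_res = np.sum((y_obs - y_fit) ** 2)
    ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

for name, model, p0 in [("Musa Okumoto", musa_okumoto, (20.0, 0.5)),
                        ("Goel Okumoto", goel_okumoto, (60.0, 0.1))]:
    params, _ = curve_fit(model, t, y, p0=p0, maxfev=10000)
    print(f"{name}: params={np.round(params, 3)}, R^2={r_squared(y, model(t, *params)):.4f}")
```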

    Application of model-based LPV actuator fault estimation for an industrial benchmark

    To bridge the gap between model-based fault diagnosis theory and industrial practice, a linear parameter-varying (LPV) H_/H∞ fault estimation approach is applied to a high-fidelity nonlinear aircraft benchmark to deal with various actuator fault scenarios that can result in an abnormal aircraft configuration. To allow industry to assess the computational load of the fault estimation approach, the design is fully coded using the flight control computer software library. Furthermore, the robustness of the fault estimation approach is evaluated, in the presence of aerodynamic database uncertainties and measurement errors over a wide range of the flight envelope, using parametric simulation and a Monte Carlo campaign supported by a functioning engineering simulator.
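
    The LPV H_/H∞ design itself is not reproduced here; the sketch below only illustrates the underlying idea of model-based actuator fault estimation on a toy linear model, by augmenting the state with an (assumed slowly varying) actuator fault and estimating it with a Kalman filter. All matrices, noise levels, and the fault profile are invented for the example.

```python
# Hedged sketch: augmented-state observer estimating a constant actuator fault.
import numpy as np

rng = np.random.default_rng(1)

# Toy second-order actuator/plant model, x = [position, rate]
A = np.array([[1.0, 0.1], [0.0, 0.95]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])

# Augmented model: z = [x; f], with f_{k+1} = f_k (random-walk fault)
Aa = np.block([[A, B], [np.zeros((1, 2)), np.eye(1)]])
Ba = np.vstack([B, np.zeros((1, 1))])
Ca = np.hstack([C, np.zeros((1, 1))])

Q = np.diag([1e-6, 1e-6, 1e-4])      # process noise (fault state allowed to drift)
R = np.array([[1e-3]])               # measurement noise

z_hat = np.zeros((3, 1))
P = np.eye(3)
x_true = np.zeros((2, 1))

for k in range(300):
    u = np.array([[1.0]])
    f_true = 0.5 if k > 100 else 0.0                 # fault appears at k=100
    x_true = A @ x_true + B @ (u + f_true)
    y = C @ x_true + rng.normal(0, np.sqrt(R[0, 0]), (1, 1))

    # Kalman predict / update on the augmented state
    z_hat = Aa @ z_hat + Ba @ u
    P = Aa @ P @ Aa.T + Q
    S = Ca @ P @ Ca.T + R
    K = P @ Ca.T @ np.linalg.inv(S)
    z_hat = z_hat + K @ (y - Ca @ z_hat)
    P = (np.eye(3) - K @ Ca) @ P

print("estimated actuator fault ~ %.3f (true 0.5)" % z_hat[2, 0])
```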

    Earthquake Early Warning System (EEWS) for the New Madrid Seismic Zone

    Part 1: Research on Earthquake Early Warning Systems (EEWSs) in the last decade has undergone rapid development in terms of theoretical and methodological advances in real-time data analysis, improved telemetry, and computer technology, and is becoming a useful tool for practical real-time seismic hazard mitigation. The main focus of this study is a feasibility study of an EEWS for the New Madrid Seismic Zone (NMSZ) from the standpoint of source location; magnitude determination is addressed in a separate paper. The NMSZ covers a wide area, with several heavily populated cities, vital infrastructure, and facilities located within a radius of less than 70 km from the epicenters of the 1811-1812 earthquakes. One of the challenges associated with the NMSZ is that, while low to moderate levels of seismic activity are common, larger earthquakes are rare (i.e., there are no instrumentally recorded data for earthquakes with magnitudes greater than M5.5 in the NMSZ). We also recognize that it may not be realistic to provide early warning for all possible sources, as is done on the U.S. west coast, and we therefore focus on a specific source zone. We examine the stations within the NMSZ in order to answer the question: what changes should be applied to the NMSZ network to make it suitable for earthquake early warning (EEW)? We also explore the changes needed to the Advanced National Seismic System (ANSS) Earthquake Monitoring System Real Time (AQMS RT) data acquisition system to make it useful for EEW. Our results show that EEW is feasible, though several technical challenges remain in incorporating its use with the present network.
    Part 2: The increasing vulnerability to earthquakes of metropolitan areas within stable continental regions (SCR), such as Memphis, TN and St. Louis, MO near the New Madrid Seismic Zone (NMSZ), and the very low probability level at which short-term earthquake forecasting is possible, make an earthquake early warning system (EEWS) a viable alternative for effective real-time risk reduction in these cities. In this study, we explore practical approaches to earthquake early warning (EEW) and test the adaptability and potential of the real-time monitoring system in the NMSZ. We determine empirical relations based on amplitude and frequency magnitude proxies from the initial four seconds of the P-waveform records available from the Cooperative New Madrid Seismic Network (CNMSN) database for magnitudes M > 2.1. The amplitude-based proxies include the low-pass filtered peak displacement (Pd), peak velocity (Pv), and integral of the velocity squared (IV2), whereas the frequency-based proxies include the predominant period (τp), characteristic period (τc), and log-average period (τlog). Very few studies have considered areas with lower-magnitude events. With an active EEW system in the NMSZ, damage resulting from a catastrophic event, such as witnessed in 1811-1812, may be mitigated in real time.
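
    For readers unfamiliar with these proxies, the sketch below computes Pd, Pv, IV2, and the characteristic period τc from the first 4 s of a P-wave velocity record. The waveform is synthetic and the high-pass corner used after integration (0.075 Hz, common in the EEW literature) is an assumption, not necessarily the processing used in this study.

```python
# Illustrative sketch of EEW amplitude and frequency proxies on a synthetic record.
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.integrate import cumulative_trapezoid, trapezoid

fs = 100.0                                   # samples per second
t = np.arange(0, 4.0, 1.0 / fs)              # first 4 s after the P pick
vel = 1e-4 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-t)   # synthetic velocity (m/s)

# Displacement by numerical integration, then a high-pass filter to suppress
# the long-period drift introduced by integration.
disp = cumulative_trapezoid(vel, t, initial=0.0)
b, a = butter(2, 0.075 / (fs / 2.0), btype="highpass")
disp = filtfilt(b, a, disp)

Pd = np.max(np.abs(disp))                    # peak displacement
Pv = np.max(np.abs(vel))                     # peak velocity
IV2 = trapezoid(vel ** 2, t)                 # integral of squared velocity

# Characteristic period tau_c: ratio of displacement to velocity "energy"
# in the P-wave window.
tau_c = 2.0 * np.pi * np.sqrt(trapezoid(disp ** 2, t) / trapezoid(vel ** 2, t))

print(f"Pd={Pd:.2e} m  Pv={Pv:.2e} m/s  IV2={IV2:.2e}  tau_c={tau_c:.2f} s")
```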

    Validating a neural network-based online adaptive system

    Neural networks are popular models used for online adaptation to accommodate system faults and recover from environmental changes in real-time automation and control applications. However, this adaptivity limits the applicability of conventional verification and validation (V&V) techniques to such systems. We investigated the V&V of neural network-based online adaptive systems and developed a novel validation approach consisting of two important methods. (1) An independent novelty detector at the system input layer detects failure conditions and tracks abnormal events/data that may cause unstable learning behavior. (2) At the system output layer, we perform a validity check on the network predictions to validate the network's accommodation performance.
    Our research focuses on the Intelligent Flight Control System (IFCS) for the NASA F-15 aircraft as an example of an online adaptive control application. We utilized Support Vector Data Description (SVDD), a one-class classifier, to examine the data entering the adaptive component and detect potential failures. We developed a decompose-and-combine strategy that drastically reduces its computational cost, from O(n^3) down to O(n^(3/2) log n), so that the novelty detector becomes feasible in real time.
    We define a confidence measure, the validity index, to validate the predictions of the Dynamic Cell Structure (DCS) network in the IFCS. Statistical information is collected during adaptation, and the validity index is computed to reflect the trustworthiness associated with each neural network output. The computation of the validity index in DCS is straightforward and efficient.
    Through experimentation with the IFCS, we demonstrate that: (1) the SVDD tool detects system failures accurately and provides validation inferences in real time; (2) the validity index effectively indicates poor fitting within regions characterized by sparse data and/or inadequate learning. The developed methods can be integrated with available online monitoring tools and further generalized into a promising validation framework for neural network-based online adaptive systems.
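
    SVDD itself is not available in common Python libraries, but a one-class SVM with an RBF kernel solves a closely related boundary-description problem; the sketch below uses scikit-learn's OneClassSVM as a stand-in to illustrate the input-layer novelty check described above. The "flight-condition" feature vectors are random placeholders.

```python
# Hedged sketch: one-class novelty detection standing in for SVDD.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)

# "Nominal" training data: feature vectors seen during normal operation
X_train = rng.normal(0.0, 1.0, size=(500, 4))

# New online samples: mostly nominal, plus a few abnormal (failure-like) points
X_new = np.vstack([rng.normal(0.0, 1.0, size=(10, 4)),
                   rng.normal(6.0, 1.0, size=(3, 4))])

detector = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)
detector.fit(X_train)

labels = detector.predict(X_new)             # +1 = nominal, -1 = novelty
scores = detector.decision_function(X_new)   # signed distance to the boundary
for i, (lab, s) in enumerate(zip(labels, scores)):
    flag = "NOVELTY" if lab == -1 else "ok"
    print(f"sample {i:2d}: score={s:+.3f}  {flag}")
```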

    Analysis and interpretation of volcano deformation in Alaska: Studies from Okmok and Mt. Veniaminof volcanoes

    Thesis (Ph.D.) University of Alaska Fairbanks, 2008.
    Four studies focus on deformation at Okmok Volcano, the Alaska Peninsula, and Mt. Veniaminof. The main focus of the thesis is volcano deformation at Okmok Volcano and Mt. Veniaminof, but it also includes an investigation of tectonic compression of the Alaska Peninsula. The complete set of GPS observations at Okmok Volcano is investigated with the Unscented Kalman Filter time-series analysis method. The technique is shown to be useful for inverting geodetic data for time-dependent, non-linear model parameters. The GPS record at Okmok from 2000 to mid-2007 shows distinct inflation pulses lasting several months. The inflation is interpreted as magma accumulation in a shallow reservoir under the caldera center, approximately 2.5 km below sea level. The location determined for the magma reservoir agrees with estimates from other geodetic techniques. Smaller deflation signals in the Okmok record appear following the inflation pulses, and a degassing model is proposed to explain the deflation. Petrologic observations from lava erupted in 1997 provide an estimate of the volatile content of the magma, and the solution model VolatileCalc is used to determine the amount of volatiles in the gas phase. Degassing can explain the deflation, but only under certain circumstances: the magma chamber must have a radius between ~1 and 2 km, and the intruding magma must contain less than approximately 500 ppm CO2. At Mt. Veniaminof, the deformation signal is dominated by compression caused by the convergence of the Pacific and North American Plates. A subduction model is created to account for the site velocities. A network of GPS benchmarks along the Alaska Peninsula is used to infer the amount of coupling along the megathrust. A transition from high to low coupling near the Shumagin Islands has important implications for the seismogenic potential of this section of the fault; the Shumagin segment likely ruptures in more frequent, smaller-magnitude earthquakes. The tectonic study provides a useful backdrop for examining volcano deformation at Mt. Veniaminof. After correction for tectonic motion, the site velocities indicate inflation at the volcano. The deformation is interpreted as pressurization beneath the volcano associated with eruptive activity in 2005.
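
    As a hypothetical illustration of the kind of geodetic source inversion described above, the sketch below fits a Mogi point source (depth and volume change) to synthetic GPS vertical displacements with nonlinear least squares. The thesis uses an Unscented Kalman Filter for time-dependent parameters; this simpler snapshot inversion, its station geometry, and its "observed" data are only meant to illustrate the forward model commonly used in volcano geodesy.

```python
# Hedged sketch: least-squares inversion of a Mogi point source on synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def mogi_uz(r, depth, dV):
    """Vertical surface displacement of a Mogi source (Poisson ratio 0.25)."""
    return 3.0 * dV * depth / (4.0 * np.pi * (r**2 + depth**2) ** 1.5)

# Synthetic "GPS" sites: radial distances from the caldera center (m)
r = np.array([500., 1500., 3000., 5000., 8000., 12000.])
true_depth, true_dV = 2500.0, 2.0e6          # 2.5 km depth, 0.002 km^3 volume change
uz_obs = mogi_uz(r, true_depth, true_dV) + np.random.default_rng(3).normal(0, 1e-3, r.size)

(depth_est, dV_est), _ = curve_fit(mogi_uz, r, uz_obs, p0=(1000.0, 1e6))
print(f"depth ~ {depth_est/1e3:.2f} km, volume change ~ {dV_est:.2e} m^3")
```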

    Methods for high-precision subsurface imaging using spatially dense seismic data

    Current state-of-the-art depth migration techniques are regularly applied in marine seismic exploration, where they deliver accurate and reliable pictures of the Earth's interior. The question is how these algorithms perform in different environments not related to oil and gas exploration: for example, how can those techniques be utilised in the elusive environment of hard rocks? The main challenge there is to image highly complex, subvertical, piece-wise geology, often characterised by low reflectivity, in a noisy environment.

    Digital Filters for Maintenance Management

    Faults in mechanisms must be detected quickly and reliably in order to avoid significant losses. Detection systems should be developed to minimize maintenance costs and are generally based on models that are consistent yet as simple as possible. The models for detecting faults must also adapt to conditions both external and internal to the mechanism. The present chapter deals with three maintenance algorithms for turnouts in railway infrastructure, based on discrete filters that comply with these general objectives. All of them have the virtue of being developed within a well-known, common framework, namely the state space, with the help of the Kalman Filter (KF) and/or the complementary Fixed-Interval Smoother (FIS) algorithms. The algorithms are tested on real applications and thorough results are shown.
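
    The sketch below gives a minimal version of this general idea: a monitored signal is tracked with a simple state-space Kalman Filter, and a possible fault is flagged when the normalized innovation (measurement minus prediction) grows unusually large. The "turnout" signal model, the step fault, and the threshold are invented for illustration and are not the chapter's actual algorithms.

```python
# Hedged sketch: Kalman Filter innovation monitoring for fault detection.
import numpy as np

rng = np.random.default_rng(4)

# Level-plus-drift model for the monitored quantity (e.g. a motor current)
A = np.array([[1.0, 1.0], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = np.diag([1e-5, 1e-7])
R = np.array([[0.01]])

x = np.array([[10.0], [0.0]])        # filter state: level and slope
P = np.eye(2)
threshold = 9.0                      # ~3-sigma on the squared normalized innovation

for k in range(200):
    true_level = 10.0 + (0.8 if k > 150 else 0.0)     # step fault at k=150
    z = true_level + rng.normal(0, 0.1)

    # Predict
    x = A @ x
    P = A @ P @ A.T + Q
    # Innovation and its variance
    nu = z - (H @ x)[0, 0]
    S = (H @ P @ H.T + R)[0, 0]
    if nu**2 / S > threshold:
        print(f"k={k}: possible fault, normalized innovation^2 = {nu**2 / S:.1f}")
    # Update
    K = P @ H.T / S
    x = x + K * nu
    P = (np.eye(2) - K @ H) @ P
```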

    The impact of post-depositional structures on carbonate reservoirs: examples of igneous intrusions, fractures and karst

    Advisor: Alexandre Campane Vidal. Thesis (Ph.D.) - Universidade Estadual de Campinas, Instituto de Geociências.
    Abstract: The geological characterization of reservoirs is becoming increasingly challenging. The fast-paced evolution of processes and methods for reservoir characterization has given rise to different solutions and proposals for a wide variety of issues in geological modeling. These issues and challenges include the interpretation of post-depositional structures and their representation in three-dimensional geological models. Examples of such structures are igneous intrusions and their related internal and external structures, sub-seismic fractures, and karst-related features.
    In the post-salt section of the Campos Basin and the pre-salt section of the Santos Basin, we applied techniques to identify and characterize post-depositional structures. The main objective of this work is to contribute to the knowledge of reservoir characterization and of the impact such structures may have on petroleum systems, particularly on reservoirs. The data set used in this work includes two three-dimensional seismic surveys and two sets of borehole data from the Campos and Santos Basins. The characterization of intrusive igneous rocks in the Campos Basin was based on a morphometric analysis, which allowed us to characterize the intrusions and to understand their relation to the structures occurring within and around them. Based on the quantitative analysis of sills, we were able to infer the ranges over which the sill geometries occur, for example in terms of area and length, as well as which external structures are more likely to be generated by the different sill geometries. Examples of these structures are forced folds (external structures) and steps, bridges, and fingers (internal structures). In particular, we found that other factors probably also control the formation of forced folds, since a forced fold does not always form when the indicator ratio, above which the country rock is flexed or folded, falls within the interval of interest. Regarding the characterization of fracture networks and karst-related features in the Santos Basin, we built a dual-porosity, dual-permeability model and used a seismic multi-attribute approach to detect and characterize high-amplitude anomalies possibly associated with karst-related features. In the modeling stage, we distributed the key fracture geometric attributes in the geological grid to quantify the equivalent fracture porosity and permeability and to enhance the prediction of the static reservoir behavior in fractured regions. The modeling stage also helped to estimate the local tectonic stress field, which is key to predicting and locating potentially fractured regions.
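
    To illustrate one step mentioned above, the sketch below turns per-cell fracture geometric attributes (aperture and spacing) into equivalent fracture porosity and permeability using the classical parallel-plate (cubic-law) relations phi_f = b/s and k_f = b^3/(12 s). This is a textbook approximation, not necessarily the exact workflow used in the thesis, and the grid values are synthetic.

```python
# Hedged sketch: equivalent fracture porosity/permeability from geometric attributes.
import numpy as np

rng = np.random.default_rng(5)

# Per-cell fracture attributes on a small 3D grid (meters)
aperture = rng.uniform(5e-5, 5e-4, size=(20, 20, 5))    # 50-500 micron apertures
spacing  = rng.uniform(0.5, 5.0, size=(20, 20, 5))      # 0.5-5 m fracture spacing

phi_f = aperture / spacing                               # equivalent fracture porosity
k_f_m2 = aperture**3 / (12.0 * spacing)                  # equivalent permeability, m^2
k_f_mD = k_f_m2 / 9.869e-16                              # 1 mD = 9.869e-16 m^2

print(f"mean fracture porosity: {phi_f.mean():.2e}")
print(f"mean fracture permeability: {k_f_mD.mean():.1f} mD")
```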