
    Modelling Open-Source Software Reliability Incorporating Swarm Intelligence-Based Techniques

    In the software industry, two software engineering development practices coexist: open-source and closed-source software. The former has shared code to which anyone can contribute, whereas the latter has proprietary code that only the owner can access. Software reliability is crucial in the industry when a new product or update is released. Applying meta-heuristic optimization algorithms to closed-source software reliability prediction has produced significant and accurate results. Open-source software now dominates the landscape of cloud-based systems. Therefore, providing results on open-source software reliability - as a quality indicator - would greatly help solve the open-source software reliability growth-modelling problem. Reliability is predicted by estimating the parameters of the software reliability models. As software reliability models are inherently nonlinear, traditional approaches make estimating the appropriate parameters difficult and ineffective. Consequently, software reliability models necessitate a high-quality parameter estimation technique. These objectives dictate the exploration of meta-heuristic swarm intelligence optimization algorithms for optimizing the parameter estimation of nonhomogeneous Poisson process-based open-source software reliability models. The optimization algorithms are firefly, social spider, artificial bee colony, grey wolf, particle swarm, moth flame, and whale. The applicability and performance of the optimization modelling approach are demonstrated on two real open-source software reliability datasets. The results are promising.
    Comment: 14 pages, 11 figures, 7 tables
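The parameter-estimation task the abstract describes can be illustrated with a minimal sketch: a hand-rolled particle swarm optimizer fitting the two parameters of the Goel-Okumoto NHPP mean value function m(t) = a(1 - e^(-bt)) to cumulative failure counts by minimizing the sum of squared errors. The dataset and all constants below are illustrative assumptions, not values from the paper.

```python
import math
import random

# Hypothetical failure dataset: (test time, cumulative failures) pairs.
# These numbers are invented for illustration only.
data = [(10, 27), (20, 45), (30, 58), (40, 67), (50, 73), (60, 77)]

def mean_value(t, a, b):
    """Goel-Okumoto NHPP mean value function m(t) = a * (1 - exp(-b * t))."""
    return a * (1.0 - math.exp(-b * t))

def sse(params):
    """Sum of squared errors between the model and the observed counts."""
    a, b = params
    return sum((mean_value(t, a, b) - n) ** 2 for t, n in data)

def pso(objective, bounds, n_particles=30, iters=200, seed=0):
    """Minimal particle swarm optimizer with a global-best topology."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                # Clamp each coordinate back into its search bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Estimate a (total expected defects) and b (detection rate).
(a_hat, b_hat), err = pso(sse, bounds=[(1.0, 200.0), (1e-4, 1.0)])
```

The same objective function could be handed to any of the swarm algorithms the abstract lists; only the update rule inside the loop changes.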

    Hybrid Software Reliability Model for Big Fault Data and Selection of Best Optimizer Using an Estimation Accuracy Function

    Software reliability analysis has come to the forefront of academia as software applications have grown in size and complexity. Traditionally, methods have focused on minimizing coding errors to guarantee analytic tractability, which causes the estimations produced by these models to be overly optimistic. However, it is important to take into account non-software factors, such as human error and hardware failure, in addition to software faults to obtain reliable estimations. In this research, we examine how the peculiarities of big data systems and the need for specialized hardware led to the creation of a hybrid model. We used statistical and soft computing approaches to determine values for the model's parameters, and we explored five criteria values in an effort to identify the most useful method of parameter evaluation for big data systems. For this purpose, we conduct a case study analysis of software failure data from four actual projects. For comparison, we evaluated the results using an estimation accuracy function. Particle swarm optimization was shown to be the most effective optimization method for the hybrid model constructed with large-scale fault data.
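The abstract compares optimizers via an estimation accuracy function over several criteria. The paper's exact five criteria are not reproduced here; the sketch below computes standard goodness-of-fit measures commonly used for this purpose, as an assumed stand-in.

```python
import math

def estimation_criteria(actual, predicted):
    """Standard goodness-of-fit criteria for comparing optimizer outputs.
    (Illustrative choices; the paper's exact five criteria are not given here.)"""
    n = len(actual)
    residuals = [a - p for a, p in zip(actual, predicted)]
    mse = sum(r * r for r in residuals) / n
    mean_a = sum(actual) / n
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return {
        "MSE": mse,                                     # mean squared error
        "RMSE": math.sqrt(mse),                         # root mean squared error
        "MAE": sum(abs(r) for r in residuals) / n,      # mean absolute error
        "Bias": sum(residuals) / n,                     # mean signed error
        "R2": 1.0 - (n * mse) / ss_tot if ss_tot else float("nan"),
    }
```

Each candidate optimizer's fitted model is scored against the held-out fault data with these criteria, and the optimizer with the best scores is selected.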

    Communication Subsystems for Emerging Wireless Technologies

    The paper describes a multi-disciplinary design of modern communication systems. The design starts with the analysis of a system in order to define requirements on its individual components. The design exploits proper models of communication channels to adapt the systems to expected transmission conditions. Input filtering of signals, both in the frequency domain and in the spatial domain, is ensured by a properly designed antenna. Further signal processing (amplification and further filtering) is done by electronic circuits. Finally, signal processing techniques are applied to yield information about the current properties of the frequency spectrum and to distribute the transmission over free subcarrier channels.

    Three-Stage Adjusted Regression Forecasting (TSARF) for Software Defect Prediction

    Software reliability growth models (SRGM) enable the analysis of failure data collected during testing. Specifically, nonhomogeneous Poisson process (NHPP) SRGM are the most commonly employed models. While software reliability growth models are important, efficient modeling of complex software systems increases the complexity of the models. Increased model complexity presents a challenge in identifying robust and computationally efficient algorithms to estimate model parameters, and it reduces the generalizability of the models. Existing studies on traditional software reliability growth models suggest that NHPP models characterize defect data as a smooth continuous curve and fail to capture changes in the defect discovery process. Therefore, such a model fits well under ideal conditions, but it is not adaptable and will only fit appropriately shaped data. Neural networks and other machine learning methods have been applied to greater effect [5]; however, their use is limited by the lack of large samples of defect data, especially at earlier stages of testing.

    Comparison of Metaheuristic Optimization Algorithms for Electromechanical Actuator Fault Detection

    Model-based Fault Detection and Identification (FDI) for prognostics relies on the comparison between the response of the monitored system and that of a digital twin. Discrepancies between the behavior of the two systems are analyzed to filter out the effect of model uncertainties and identify failure precursors. A possible solution to identify faults is to leverage a model able to simulate them: an optimization algorithm varies the fault magnitude parameters within the model to achieve a match between the responses of the model and the actual system. When the algorithm converges, we can assume that the fault parameters producing the best match between the system and its digital twin approximate the actual faults affecting the equipment. The choice of an optimization algorithm appropriate for the task is highly problem dependent. Algorithms for FDI are required to deal with multimodal objective functions characterized by poor regularity and a relatively high computational cost. Additionally, the derivatives of the objective function are not usually available and must be obtained numerically if needed. We therefore restrict our search for a suitable optimization algorithm to metaheuristic gradient-free ones, testing Genetic Algorithm, Particle Swarm Optimization, Differential Evolution, Grey Wolf Optimization, Dragonfly Algorithm, and Whale Optimization Algorithm. Their performances on the considered problem were assessed and compared in terms of accuracy and computational time.
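The fault-matching loop the abstract describes - vary fault parameters in a model until its response matches the measured one - can be sketched with a toy first-order actuator model and a minimal Differential Evolution optimizer, one of the gradient-free algorithms the paper tests. The actuator model, fault parameterization, and all constants are illustrative assumptions, not the paper's actual electromechanical model.

```python
import math
import random

def actuator_response(t, gain_fault, lag_fault):
    """Toy first-order step response; the fault parameters scale the
    nominal gain and time constant (illustrative, not from the paper)."""
    K = 1.0 * gain_fault
    tau = 0.5 * lag_fault
    return K * (1.0 - math.exp(-t / tau))

# Simulated "measured" response of the degraded actuator.
times = [0.1 * i for i in range(1, 31)]
true_faults = (0.8, 1.4)  # reduced gain, increased lag
measured = [actuator_response(t, *true_faults) for t in times]

def mismatch(params):
    """Squared-error discrepancy between digital twin and measurements."""
    g, l = params
    return sum((actuator_response(t, g, l) - m) ** 2 for t, m in zip(times, measured))

def differential_evolution(obj, bounds, pop_size=20, iters=150, F=0.6, CR=0.9, seed=1):
    """Minimal DE/rand/1/bin optimizer with bound clamping."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [obj(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)
            trial = []
            for d in range(dim):
                if rng.random() < CR or d == jrand:
                    v = pop[a][d] + F * (pop[b][d] - pop[c][d])
                else:
                    v = pop[i][d]
                trial.append(min(max(v, bounds[d][0]), bounds[d][1]))
            tv = obj(trial)
            if tv < fit[i]:  # greedy one-to-one replacement
                pop[i], fit[i] = trial, tv
    best = min(range(pop_size), key=lambda j: fit[j])
    return pop[best], fit[best]

# Recover the fault magnitudes from the response mismatch.
(g_hat, l_hat), err = differential_evolution(mismatch, [(0.5, 1.5), (0.5, 2.0)])
```

At convergence the recovered parameters approximate the injected fault, which is exactly the FDI assumption the abstract states.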

    Optimal overcurrent relay coordination in wind farm using genetic algorithm

    Wind farms are one of the most indispensable types of sustainable energy and are progressively being integrated into smart grids for electrical power generation, predominantly as distributed generation systems. Rigorous protection of wind power plants is therefore a critically important aspect of electrical power protection engineering, which must be considered thoroughly when designing wind plants so that power components are properly protected in case of fault occurrence. The most widely used protection apparatus are overcurrent relays (OCRs), which are responsible for protecting power systems from impending faults. To provide proper protection for wind farms, these relays must be set precisely and well-coordinated with each other to clear faults in the system in the shortest possible time. These relays are set and coordinated with each other by applying IEEE or IEC standard methods; however, their operation times are relatively long and the coordination between the relays is not optimal. Another common problem in these power systems is that, when a fault occurs in a plant, several OCRs operate instead of the relay designated to that particular fault location. This undesired operation can result in unnecessary power loss and the disconnection of healthy feeders from the plant. It is necessary to address the problems related to inefficient coordination of OCRs. Many suggestions have been made and approaches implemented; one of the most prominent is the use of a Genetic Algorithm (GA) to improve the function and coordination of OCRs. The GA optimization technique was implemented in this project due to its ample advantages over other AI techniques, including high accuracy, fast response, and, most importantly, its ability to obtain optimal solutions for the nonlinear characteristics of OCRs.
In addressing the mentioned problems, the main objective of this research is to improve the protection of wind farms by optimizing the relay settings, reducing the operation time and Time Setting Multiplier (TSM) of each relay, and improving the coordination between relays after implementation of the IEC 60255-151:2009 standard. A recent and successful objective function (OF) for the GA technique was used, and GA parameters were selected specifically for this research to significantly improve wind farm protection beyond previous work on this problem. The GA was used to obtain improved values for each relay's settings based on their coordination criteria. Each relay's operation time and TSM are optimized, contributing to better protection for the wind farm. Thus, the objective of this work - improving the protection of wind farms by optimizing the relay settings, reducing the operation time and TSM of each relay, and improving the coordination between relays - has been successfully fulfilled, solving the problems associated with wind farm relay protection settings. The new approach has shown significant improvement in the operation of OCRs at the wind farm, reducing the cumulative operation time of the relays by 26.8735% (3.7623 seconds).
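The coordination problem the abstract describes can be sketched for a two-relay primary/backup pair: a small elitist GA minimizes the total operating time given by the IEC 60255 standard inverse curve, with a penalty enforcing the coordination time interval (CTI) between backup and primary. The fault-current multiples, CTI, and GA settings below are illustrative assumptions, not the paper's actual wind farm data.

```python
import random

def op_time(tms, M):
    """IEC 60255 standard inverse curve: t = TMS * 0.14 / (M**0.02 - 1),
    with M the fault-current multiple of the pickup setting."""
    return tms * 0.14 / (M ** 0.02 - 1.0)

M_PRIMARY, M_BACKUP = 5.0, 3.0   # assumed fault-current multiples at the fault point
CTI = 0.3                        # required coordination time interval, seconds
LO, HI = 0.05, 1.0               # allowable TMS range

def fitness(ind):
    """Total operating time plus a heavy penalty for CTI violations."""
    t1 = op_time(ind[0], M_PRIMARY)
    t2 = op_time(ind[1], M_BACKUP)
    penalty = 1000.0 * max(0.0, CTI - (t2 - t1))
    return t1 + t2 + penalty

def ga(pop_size=40, gens=150, seed=2):
    """Minimal elitist GA with blend crossover and Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(LO, HI), rng.uniform(LO, HI)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]       # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = rng.sample(elite, 2)
            alpha = rng.random()           # blend crossover of two parents
            child = [alpha * a + (1 - alpha) * b for a, b in zip(p1, p2)]
            if rng.random() < 0.3:         # occasional Gaussian mutation
                d = rng.randrange(2)
                child[d] = min(max(child[d] + rng.gauss(0, 0.05), LO), HI)
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = ga()
t_primary = op_time(best[0], M_PRIMARY)
t_backup = op_time(best[1], M_BACKUP)
```

The optimized TMS pair clears the fault quickly at the primary relay while keeping the backup delayed by at least the CTI, which is the coordination behavior the research targets.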