
    Assessing Software Reliability Using Modified Genetic Algorithm: Inflection S-Shaped Model

    In order to assess software reliability, many software reliability growth models (SRGMs) have been proposed over the past four decades. The two most widely used methods for estimating SRGM parameters are maximum likelihood estimation (MLE) and least squares estimation (LSE). However, both approaches can impose restrictions on SRGMs, such as requiring that derivatives of the formulated models exist or involving complex calculations. In this paper, we propose a modified genetic algorithm (MGA) to assess software reliability from time-domain software failure data using the inflection S-shaped model, which is based on a non-homogeneous Poisson process (NHPP). Experiments on real software failure data show that the proposed genetic algorithm is more effective and faster than traditional algorithms.
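    As a rough illustration of how an evolutionary search can stand in for MLE/LSE when fitting an SRGM, the sketch below fits the inflection S-shaped mean value function m(t) = a(1 - e^(-b t)) / (1 + beta e^(-b t)) to cumulative failure data by minimizing the sum of squared errors with a plain genetic algorithm. The failure data, parameter bounds, and GA settings are illustrative assumptions, not the paper's MGA or its dataset.

```python
import numpy as np

# Inflection S-shaped NHPP mean value function:
#   m(t) = a * (1 - exp(-b*t)) / (1 + beta * exp(-b*t))
# a: expected total faults, b: fault detection rate, beta: inflection factor.
def mean_value(t, a, b, beta):
    e = np.exp(-b * t)
    return a * (1.0 - e) / (1.0 + beta * e)

def sse(params, t, y):
    a, b, beta = params
    return np.sum((y - mean_value(t, a, b, beta)) ** 2)

def fit_ga(t, y, pop_size=60, generations=200,
           bounds=((1, 500), (1e-4, 2.0), (1e-3, 50.0)),
           mutation_rate=0.2, seed=0):
    """Plain GA: tournament selection, blend crossover, Gaussian mutation, elitism."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = rng.uniform(lo, hi, size=(pop_size, 3))
    for _ in range(generations):
        fitness = np.array([sse(ind, t, y) for ind in pop])
        new_pop = [pop[np.argmin(fitness)]]              # elitism: keep best individual
        while len(new_pop) < pop_size:
            # tournament selection of two parents
            i, j = rng.integers(pop_size, size=2), rng.integers(pop_size, size=2)
            p1 = pop[i[np.argmin(fitness[i])]]
            p2 = pop[j[np.argmin(fitness[j])]]
            alpha = rng.random(3)                        # blend crossover
            child = alpha * p1 + (1 - alpha) * p2
            mask = rng.random(3) < mutation_rate         # Gaussian mutation
            child = np.where(mask, child + rng.normal(0, 0.1 * (hi - lo)), child)
            new_pop.append(np.clip(child, lo, hi))
        pop = np.array(new_pop)
    fitness = np.array([sse(ind, t, y) for ind in pop])
    return pop[np.argmin(fitness)]

# Illustrative time-domain failure data (weeks vs. cumulative faults), not from the paper.
t = np.arange(1, 21, dtype=float)
y = np.array([2, 5, 9, 15, 23, 33, 45, 57, 68, 78, 86, 92,
              97, 101, 104, 106, 108, 109, 110, 110], dtype=float)
a, b, beta = fit_ga(t, y)
print(f"a={a:.1f}, b={b:.3f}, beta={beta:.2f}, SSE={sse((a, b, beta), t, y):.2f}")
```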

    Optimization of habitat suitability models for freshwater species distribution using evolutionary algorithms


    International Colloquium on Applications of Computer Science and Mathematics in Architecture and Civil Engineering: 20 to 22 July 2015, Bauhaus-Universität Weimar

    The 20th International Conference on the Applications of Computer Science and Mathematics in Architecture and Civil Engineering will be held at the Bauhaus-Universität Weimar from 20 to 22 July 2015. Architects, computer scientists, mathematicians, and engineers from all over the world will meet in Weimar for an interdisciplinary exchange of experiences, to report on their results in research, development, and practice, and to discuss them. The conference covers a broad range of research areas: numerical analysis, function-theoretic methods, partial differential equations, continuum mechanics, engineering applications, coupled problems, computer sciences, and related topics. Several plenary lectures in the aforementioned areas will take place during the conference. We invite architects, engineers, designers, computer scientists, mathematicians, planners, project managers, and software developers from business, science, and research to participate in the conference.

    Search based software engineering: Trends, techniques and applications

    In the past five years there has been a dramatic increase in work on Search-Based Software Engineering (SBSE), an approach to Software Engineering (SE) in which Search-Based Optimization (SBO) algorithms are used to address problems in SE. SBSE has been applied to problems throughout the SE lifecycle, from requirements and project planning to maintenance and reengineering. The approach is attractive because it offers a suite of adaptive, automated and semi-automated solutions in situations typified by large, complex problem spaces with multiple competing and conflicting objectives. This article provides a review and classification of the literature on SBSE. The work identifies research trends and relationships between the techniques applied and the applications to which they have been applied, and highlights gaps in the literature and avenues for further research.
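    As a concrete, simplified illustration of the search-based approach (not an example taken from the survey itself), the sketch below applies a basic hill-climbing search to a toy test-suite minimization problem: find a small set of test cases that still covers every requirement. The coverage data, fitness weights, and search settings are all illustrative assumptions.

```python
import random

# Illustrative SBSE instance: select a small subset of test cases that still
# covers all requirements. The coverage table below is made up.
coverage = {
    "t1": {"r1", "r2"}, "t2": {"r2", "r3"}, "t3": {"r4"},
    "t4": {"r1", "r3", "r4"}, "t5": {"r5"}, "t6": {"r4", "r5"},
}
all_reqs = set().union(*coverage.values())
tests = sorted(coverage)

def fitness(subset):
    covered = set().union(*(coverage[t] for t in subset)) if subset else set()
    # Penalize missed requirements heavily, then prefer smaller suites.
    return 100 * len(all_reqs - covered) + len(subset)

def hill_climb(iterations=1000, seed=1):
    rng = random.Random(seed)
    current = set(rng.sample(tests, 3))
    best = fitness(current)
    for _ in range(iterations):
        neighbour = set(current)
        t = rng.choice(tests)                    # flip one test in/out of the suite
        neighbour.symmetric_difference_update({t})
        f = fitness(neighbour)
        if f <= best:                            # accept equal moves to escape plateaus
            current, best = neighbour, f
    return current, best

subset, score = hill_climb()
print(sorted(subset), score)
```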

    Hybrid Software Reliability Model for Big Fault Data and Selection of Best Optimizer Using an Estimation Accuracy Function

    Software reliability analysis has come to the forefront of academia as software applications have grown in size and complexity. Traditional methods have focused only on coding errors in order to keep the models analytically tractable, which makes their estimates overly optimistic. To obtain reliable estimates, it is important to account for non-software factors, such as human error and hardware failure, in addition to software faults. In this research, we examine how the peculiarities of big data systems and their need for specialized hardware led to the creation of a hybrid model. We used statistical and soft computing approaches to determine values for the model's parameters, and we explored five criteria values in an effort to identify the most useful method of parameter evaluation for big data systems. For this purpose, we conduct a case study analysis of software failure data from four actual projects and compare the results using an estimation accuracy function. Particle swarm optimization was shown to be the most effective optimization method for the hybrid model constructed from the large-scale fault data.
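    To make the parameter-search step concrete, the sketch below fits a stand-in NHPP model (the Goel-Okumoto mean value function m(t) = a(1 - e^(-b t))) to cumulative fault data with a basic particle swarm optimizer, using the sum of squared errors as a simple accuracy criterion. The model form, data, and PSO settings are illustrative assumptions; the paper's hybrid model and its five criteria are not reproduced here.

```python
import numpy as np

# Stand-in NHPP mean value function m(t) = a * (1 - exp(-b*t)) (Goel-Okumoto).
def m(t, a, b):
    return a * (1.0 - np.exp(-b * t))

def sse(p, t, y):
    return np.sum((y - m(t, p[0], p[1])) ** 2)

def pso(t, y, n_particles=40, iters=300, bounds=((1, 5000), (1e-5, 1.0)),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Basic global-best PSO with clamped positions."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    x = rng.uniform(lo, hi, (n_particles, 2))          # positions
    v = np.zeros_like(x)                               # velocities
    pbest, pbest_f = x.copy(), np.array([sse(p, t, y) for p in x])
    gbest = pbest[np.argmin(pbest_f)]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([sse(p, t, y) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)]
    return gbest, np.min(pbest_f)

# Hypothetical cumulative-fault data, shown only to make the sketch runnable.
t = np.arange(1, 16, dtype=float)
y = np.array([10, 25, 44, 60, 76, 90, 101, 110, 118, 124,
              129, 133, 136, 138, 140], dtype=float)
(a, b), err = pso(t, y)
print(f"a={a:.1f}, b={b:.4f}, SSE={err:.1f}")
```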

    Design of a comprehensive modeling, characterization, rupture risk assessment and visualization pipeline for Abdominal Aortic Aneurysms

    An abdominal aortic aneurysm (AAA) is a dilation of the abdominal aorta, typically within the infrarenal segment of the vessel, that causes an expansion of at least 1.5 times the normal vessel diameter. It is becoming a leading cause of death in the United States and around the world, and consequently, in 2009, the Society for Vascular Surgery (SVS) practice guidelines expressed the critical need to further investigate the factors associated with the risk of AAA rupture, along with potential treatment methods. For decades, the maximum diameter (Dmax) has been the main parameter used to assess AAA behavior and its rupture risk. However, it has been shown that three main categories of parameters affect AAA and its rupture risk: geometrical indices, such as the maximum transverse diameter; biomechanical parameters, such as material properties; and historical clinical parameters, such as age, gender, hereditary history, and lifestyle. Therefore, despite all efforts that have been undertaken to study the relationship among the different parameters affecting AAA and its rupture, there are still limitations that require further investigation and modeling; the challenges associated with traditional, clinical-quality images represent one class of these limitations. Another limitation is the use of a homogeneous hyper-elastic material property model for the entire AAA, when, in fact, there is evidence that different degrees of degradation of the elastin and collagen network of the AAA wall lead to different regions of the AAA exhibiting different material properties, which, in turn, affect its biomechanical behavior and rupture. Moreover, the effects of all three main categories of parameters need to be considered simultaneously and collectively when studying AAAs and their rupture, so the field can further benefit from studies of this kind. Therefore, in this work, we describe a comprehensive pipeline consisting of three main components to overcome some of these existing limitations. The first component focuses on the reconstruction and analysis of both synthetic and human subject-specific 3D models of AAA, accompanied by a full analysis of geometric parameters and their effects on wall stress and peak wall stress (PWS). The second component investigates the effect of various biomechanical parameters, specifically the use of various homogeneous and heterogeneous material properties, to model the behavior of the AAA wall. To this end, we introduce two different patient-specific regional material property models to better mimic the physiological behavior of the AAA wall. Finally, the third component utilizes machine learning methods to develop a comprehensive predictive model that incorporates geometrical, biomechanical, and historical clinical data to predict the rupture severity of AAA in a patient-specific manner. This is the first comprehensive semi-automated method developed for the assessment of AAA. Our findings illustrate that using a regional material property model that mimics the realistic heterogeneity of the vessel's wall leads to more reliable and accurate predictions of AAA severity and associated rupture risk. Additionally, our results indicate that using Dmax alone as an indicator of rupture risk is insufficient, while a combination of parameters from different sources, along with PWS, could serve as a more reliable rupture assessment.
These methods can help better characterize the severity of AAAs, better predict their associated rupture risk, and, in turn, help clinicians with earlier, patient-customized diagnosis and treatment planning approaches, such as stent grafting.
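    As a schematic illustration of the third, machine-learning component, the sketch below trains a random forest classifier on a small synthetic table that mixes geometric, biomechanical, and clinical features to predict a binary rupture-risk label. The feature names, synthetic data, and labeling rule are assumptions made only to keep the example self-contained; they are not the study's dataset or final model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical feature set spanning the three parameter categories described above;
# all values are synthetic and illustrative only.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.normal(55, 10, n),    # geometric: maximum transverse diameter Dmax (mm)
    rng.normal(0.9, 0.2, n),  # geometric: sac asymmetry index
    rng.normal(450, 120, n),  # biomechanical: peak wall stress PWS (kPa)
    rng.normal(70, 8, n),     # clinical: age (years)
    rng.integers(0, 2, n),    # clinical: smoking history (0/1)
])
# Synthetic label: "high risk" when Dmax and PWS are jointly elevated.
y = ((X[:, 0] > 55) & (X[:, 2] > 450)).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", round(scores.mean(), 3))
```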

    Evaluation of machine learning algorithms as predictive tools in road safety analysis

    The Highway Safety Manual (HSM) road safety management process (RSMP) represents the state-of-the-practice procedure that transportation professionals employ to monitor and improve safety on existing roadway sites. The RSMP requires the development of safety performance functions (SPFs), the key regression tools used to predict crash frequency given a set of roadway and traffic factors. Although SPFs developed with traditional regression modeling have proven to be reliable tools for road safety predictive analytics, some limitations and constraints have been highlighted in the literature, such as the assumption of a probability distribution, the selection of a pre-defined functional form, possible correlation between independent variables, and possible transferability issues. An alternative to traditional regression models as predictive tools is the use of Machine Learning (ML) algorithms. Although ML provides a new modeling technique, it still has built-in assumptions, and its performance in collision frequency modeling needs to be studied. This research 1) compares the prediction performance of three well-known ML algorithms, i.e., Support Vector Machine (SVM), Decision Tree (DT), and Random Forest (RF), against traditional SPFs, 2) conducts a sensitivity analysis and compares ML with the functional form of the negative binomial (NB) model, the default traditional regression modeling technique, and 3) applies and validates ML algorithms in network screening (hotspot identification), which is the first step in the RSMP. To achieve these objectives, a dataset of urban signalized and unsignalized intersections from two major municipalities in Saskatchewan (Canada) was considered as a case study. The results showed that the ML prediction accuracies are comparable with that of the NB model. Moreover, the sensitivity analysis showed that ML algorithm predictions are mostly affected by changes in traffic volume rather than by other roadway factors. Lastly, the consistency of the ML-based measures in identifying hotspots appeared to be comparable to that of SPF-based measures, e.g., the excess (predicted and expected) average crash frequency. Overall, the results of this research support the use of ML as a predictive tool in network screening, providing transportation practitioners with an alternative modeling approach to identify collision-prone locations where countermeasures aimed at reducing collision frequency at urban intersections can be installed.
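    To illustrate the kind of comparison this abstract describes, the sketch below fits a conventional SPF (a negative binomial GLM with the usual log-linear functional form) and a random forest to the same synthetic intersection data and compares their out-of-sample error. The traffic volumes, crash counts, and model settings are illustrative assumptions, not the Saskatchewan dataset or the study's calibrated models.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

# Synthetic intersection data: AADT on the major/minor approaches and observed
# crash counts. Names and values are illustrative only.
rng = np.random.default_rng(42)
n = 300
aadt_major = rng.uniform(2000, 30000, n)
aadt_minor = rng.uniform(500, 8000, n)
mu = np.exp(-7.0 + 0.6 * np.log(aadt_major) + 0.4 * np.log(aadt_minor))
crashes = rng.poisson(mu)   # simplification: Poisson counts instead of NB-dispersed ones

X = np.column_stack([np.log(aadt_major), np.log(aadt_minor)])
train, test = slice(0, 240), slice(240, n)

# SPF: negative binomial regression with the log-linear form
#   mu = exp(b0 + b1*ln(AADT_major) + b2*ln(AADT_minor)).
nb = sm.GLM(crashes[train], sm.add_constant(X[train]),
            family=sm.families.NegativeBinomial(alpha=1.0)).fit()
nb_pred = nb.predict(sm.add_constant(X[test]))

# ML alternative: random forest on the same predictors.
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X[train], crashes[train])
rf_pred = rf.predict(X[test])

print("NB MAE:", round(mean_absolute_error(crashes[test], nb_pred), 3))
print("RF MAE:", round(mean_absolute_error(crashes[test], rf_pred), 3))
```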