    Developing an agent-based simulation model of software evolution

    Context: In an attempt to simulate, and possibly predict, the factors that affect software evolution behaviour, several simulation models have been developed recently. The existing system dynamics (SD) simulation model of the software evolution process was built on the actor-network theory (ANT) of software evolution using a system dynamics environment, which is not well suited to reflecting the complexity of ANT. In addition, the SD model has not been investigated for its ability to represent the real-world process of software evolution. Objectives: This paper re-implements the current SD model in the agent-based simulation environment ‘Repast’ and compares the behaviour of the new model with that of the existing SD model. It also investigates the ability of the new Repast model to represent the real-world process of software evolution. Methods: A new agent-based simulation model is developed from the current SD model's specifications, and tests similar to those run on the previous model are conducted to enable a comparative evaluation of the two sets of results. In addition, an interview with an expert in software development is carried out to investigate the model's ability to represent the real-world process of software evolution. Results: The Repast model shows more stable behaviour than the SD model. The results also show that the evolution health of the software can be calibrated quantitatively and that the new Repast model is able to represent real-world processes of software evolution. Conclusion: By applying a simulation environment (agent-based) better suited to representing the ANT theory of software evolution, the new simulation model shows more stable behaviour than the previous SD model and demonstrates the ability to represent, at least quantitatively, the real-world aspects of software evolution.
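
    To make the agent-based framing above concrete, the sketch below shows a minimal stepping loop in plain Python in which a few actors (in the ANT sense) repeatedly nudge a single ‘evolution health’ value. The actor roles, influence weights and update rule are illustrative assumptions only and are not taken from the paper's Repast model.

        # Minimal, hypothetical agent-based stepping loop (plain Python, not the
        # authors' Repast model). Actor roles and the health update rule are
        # illustrative assumptions only.
        import random

        class Actor:
            def __init__(self, name, influence):
                self.name = name            # e.g. "developer", "user", "tool"
                self.influence = influence  # how strongly this actor pushes evolution

            def act(self):
                # Each step the actor nudges software health up or down at random,
                # scaled by its influence.
                return self.influence * random.uniform(-1.0, 1.0)

        def simulate(actors, steps=100, health=1.0):
            history = [health]
            for _ in range(steps):
                health = max(0.0, health + sum(a.act() for a in actors) / len(actors))
                history.append(health)
            return history

        if __name__ == "__main__":
            actors = [Actor("developer", 0.05), Actor("user", 0.02), Actor("tool", 0.01)]
            trace = simulate(actors, steps=50)
            print(f"final health: {trace[-1]:.3f}")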

    Complex low volume electronics simulation tool to improve yield and reliability

    Assembly of printed circuit boards (PCBs) in low volumes and a high mix requires a level of manual intervention during product manufacture, which leads to poor first-time yield and increased production costs. Failures at the component level and failures that stem from non-component causes (i.e. system level), such as defects in design and manufacturing, can account for this poor yield. These factors have not been incorporated in prediction models because system-level failure causes are not driven by well-characterised deterministic processes. This paper presents a simulation and analysis support tool, under development, that is based on a suite of interacting modular components with well-defined functionalities and interfaces. The CLOVES (Complex Low Volume Electronics Simulation) tool enables the characterisation and dynamic simulation of complete design, manufacturing and business processes (throughout the entire product life cycle) in terms of their propensity to create defects that could cause product failure. Details of this system and how it is being developed to fulfill changing business needs are presented. Using historical data and knowledge of previous printed circuit assembly (PCA) design specifications and manufacturing experiences, defect and yield results can be effectively stored and re-applied for future problem solving. For example, past PCA design specifications can be used at the design stage to amend designs or define process options to optimise product yield and service reliability.
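
    As a rough illustration of how defect data can translate into a first-time-yield estimate, the snippet below applies a simple Poisson defect model (yield = exp(-defects per unit)). The defect rate and joint count are invented placeholders, and this model is not claimed to be the one used inside CLOVES.

        # Illustrative first-pass-yield arithmetic only; the CLOVES tool's internal
        # models are not described in this abstract. Assumes a simple Poisson defect
        # model: yield = exp(-defects per unit).
        import math

        def first_pass_yield(defects_per_opportunity, opportunities_per_board):
            dpu = defects_per_opportunity * opportunities_per_board
            return math.exp(-dpu)

        # Example: a 200 ppm defect rate and 1 500 solder joints per board.
        print(f"predicted first-time yield: {first_pass_yield(200e-6, 1500):.1%}")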

    Where do they go when they die?

    Food webs and matrices are vital to understanding feeding relationships and ecology. Adjacency matrices can be employed to present the direct relationships between predators and prey; these binary matrices use 0 to denote no direct link and 1 to denote a direct link. We analyzed a variety of published food webs, ranging from pine forests in the United States to tussock grasslands in New Zealand. The food webs varied in the number of distinguishable taxa present, functional diversity, climates and habitats; consequently, we expect that our results are not specific to a given system. The published food webs lack flows from organisms to detritus despite the fact that organisms in these webs consume detritus. This discrepancy leads us to question how the inclusion of flows to detritus influences indirect connectance within large food webs. By including the flows to detritus, the number of indirect paths of length n, as well as indirect relationships throughout the systems, increased. Null-model simulations were compared to detrital models using power-series and eigenvalue analysis. Pathway proliferation was found in all simulations, with detrital models exhibiting a greater number of potential indirect paths and detritus contributing greatly to energetic cycling by serving as energy storage for dead and decaying organic matter in ecosystems.
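
    The path-counting argument can be reproduced on a toy web with a few lines of NumPy: entry (i, j) of the n-th power of the adjacency matrix counts the directed paths of length n from compartment i to compartment j, and the dominant eigenvalue governs how quickly those counts grow (pathway proliferation). The four-node web below, with a detritus compartment added, is a hypothetical example rather than one of the published webs analyzed in the study.

        # A[i, j] = 1 means a direct energy flow from compartment i to compartment j.
        import numpy as np

        # Base web: plant -> herbivore -> predator.
        base = np.array([[0, 1, 0],
                         [0, 0, 1],
                         [0, 0, 0]])

        # Same web with a detritus compartment (index 3): every organism also flows
        # to detritus when it dies, and the herbivore feeds on detritus.
        detrital = np.array([[0, 1, 0, 1],
                             [0, 0, 1, 1],
                             [0, 0, 0, 1],
                             [0, 1, 0, 0]])

        for name, A in [("base", base), ("with detritus", detrital)]:
            paths_len3 = np.linalg.matrix_power(A, 3).sum()   # number of length-3 paths
            lam = max(abs(np.linalg.eigvals(A)))              # dominant eigenvalue
            print(f"{name}: length-3 paths = {paths_len3}, dominant |eigenvalue| = {lam:.2f}")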

    A comparison of statistical emulation methodologies for multi-wave calibration of environmental models

    Expensive computer codes, particularly those used to simulate environmental or geological processes such as climate models, require calibration (sometimes called tuning). When calibrating expensive simulators using uncertainty quantification methods, it is usually necessary to use a statistical model called an emulator in place of the computer code when running the calibration algorithm. Though emulators based on Gaussian processes are typically many orders of magnitude faster to evaluate than the simulator they mimic, many applications have sought to speed up the computations by using regression-only emulators within the calculations instead, arguing that the extra sophistication brought by the Gaussian process is not worth the extra computational cost. This was the case for the analysis that produced the UK climate projections in 2009. In this paper we compare the effectiveness of both emulation approaches within a multi-wave calibration framework, known as ‘history matching’, that is becoming popular in the climate modelling community. We find that Gaussian processes offer significant benefits over regression-only approaches for the reduction of parametric uncertainty. We also find that in a multi-wave experiment, a combination of regression-only emulators initially, followed by Gaussian process emulators for refocussing experiments, can be nearly as effective as using Gaussian processes throughout, at a fraction of the computational cost. We also identify a number of design- and emulator-dependent features of the multi-wave history matching approach that can cause apparent, yet premature, convergence of our estimates of parametric uncertainty. We compare these approaches to calibration in idealised examples and apply them to a well-known geological reservoir model.
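
    The sketch below illustrates one history-matching wave with a Gaussian-process emulator built in scikit-learn: the emulator stands in for a toy ‘simulator’, and an implausibility measure rules out candidate inputs whose emulated output sits too far from the observation. The simulator, observation and variance terms are invented for illustration and are not those used in the paper.

        # Toy sketch of one history-matching wave with a Gaussian-process emulator.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def simulator(x):                       # stand-in for an expensive code
            return np.sin(3 * x) + 0.5 * x

        X_train = np.linspace(0, 2, 8).reshape(-1, 1)          # small initial design
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
        gp.fit(X_train, simulator(X_train).ravel())

        z, obs_var, disc_var = 1.2, 0.01, 0.01                 # observation and variances
        X_cand = np.linspace(0, 2, 200).reshape(-1, 1)
        mean, sd = gp.predict(X_cand, return_std=True)

        # Implausibility: standardized distance between emulator mean and observation.
        impl = np.abs(mean - z) / np.sqrt(sd**2 + obs_var + disc_var)
        not_ruled_out = X_cand[impl < 3.0]                     # conventional cut-off I < 3
        print(f"{len(not_ruled_out)} of {len(X_cand)} candidate inputs retained")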

    Regionalized environmental impacts of construction machinery

    Purpose: This study aims to establish a regionalized environmental impact assessment of construction machinery equipped with diesel engines certified to the European emission standard Stage V and operated in cold climatic zones in Europe. Method: The study quantifies potential environmental impacts associated with construction machinery over the entire life cycle, from the extraction of materials to end-of-life. For the operation phase, a meso-level emission accounting method is applied to quantify tailpipe emissions for certain subcategories of construction machinery. This is achieved by determining the operational efficiency of each machine in terms of effective hours. The quantified emission data are then adjusted based on engine deterioration models to estimate the rate of increase in emissions throughout the lifetime of each machine. Finally, the CML impact assessment method is applied to the inventory data to quantify potential environmental impacts. Results: The study shows that tailpipe emissions, which largely depend on an engine's fuel consumption, made the largest contribution to environmental impacts in most impact categories. At the same time, there was a positive correlation between operating weight and the impacts of the machinery. Machinery with similar operating weight also had relatively similar impact patterns owing to similar driving factors and dependencies. In addition, network, sensitivity and uncertainty analyses were performed to quantify the sources of impacts and validate the robustness of the study. The sensitivity analysis showed that the studied systems are highly sensitive to changes in the amount of fuel consumed. The uncertainty results showed that the domain of uncertainty increased as the operating-weight subcategory of the machinery increased. Conclusion: This study extends previous work on the life cycle assessment (LCA) of construction machinery, and the methodology developed provides a basis for future extension and improvement in this field. The use of effective hours as the unit of operational efficiency helps to resolve uncertainties linked to lifetime and annual operating hours. The results can also be of use for decision support and for assessing the impacts of the transition from fossil fuels to alternative fuel types.
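
    A back-of-the-envelope version of the operation-phase accounting described above is sketched below: per-hour tailpipe emissions are scaled by a linearly increasing deterioration factor and summed over the machine's effective hours. The fuel rate, emission factor and deterioration slope are made-up placeholders, not values from the study.

        # Illustrative tailpipe-emission arithmetic only; all numbers are placeholders.
        def lifetime_emissions(fuel_per_hour_kg, ef_kg_per_kg_fuel,
                               effective_hours, deterioration_rate=0.02):
            """Sum emissions over the machine's effective operating hours, letting the
            per-hour emission grow linearly with accumulated use (engine deterioration)."""
            total = 0.0
            for h in range(int(effective_hours)):
                deterioration = 1.0 + deterioration_rate * (h / effective_hours)
                total += fuel_per_hour_kg * ef_kg_per_kg_fuel * deterioration
            return total

        # Example: 15 kg fuel/h, roughly 3.17 kg CO2 per kg diesel, 8 000 effective hours.
        print(f"lifetime CO2: {lifetime_emissions(15, 3.17, 8000) / 1000:.1f} t")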

    Rethinking the Value of Simulation Methods in the Information Systems Research Field: A Call for Reconstructing Contribution for a Broader Audience

    The impact of simulation methods for social research in the Information Systems (IS) research field remains low. A concern is that our field is inadequately leveraging the unique strengths of simulation methods. Although this low impact is frequently attributed to methodological complexity, we offer an alternative explanation: the poor construction of research value. We argue that more intuitive value construction, better connected to the knowledge base, will facilitate increased value and broader appreciation. A meta-analysis of studies published in IS journals over the last decade evidences the low impact. To facilitate value construction, we synthesize four common types of simulation research contribution: Analyzer, Tester, Descriptor, and Theorizer. To illustrate, we employ the proposed typology to describe how each type of value is structured in simulation research and connect each type to instances from the IS literature, thereby making these value types and their construction visible and readily accessible to the general IS community.

    Development of a customised design flood estimation tool to estimate floods in gauged and ungauged catchments

    The estimation of design flood events, i.e. floods characterised by a specific magnitude-frequency relationship, at a particular site in a specific region is necessary for the planning, design and operation of hydraulic structures. Both the occurrence and frequency of flood events, along with the uncertainty involved in their estimation, contribute to the practising engineer's dilemma of making a single, justifiable decision based on the results obtained from the plethora of ‘outdated’ design flood estimation methods available in South Africa. The objectives of this study were: (i) to review the methods currently used for design flood estimation in South Africa for single-site analysis, (ii) to develop a customised, user-friendly Design Flood Estimation Tool (DFET) containing the latest design rainfall information and recognised estimation methods used in South African flood hydrology, and (iii) to demonstrate the use and functionality of the developed DFET by comparing and assessing the performance of the various design flood estimation methods in gauged catchments with areas ranging from 100 km² to 10 000 km² in the C5 secondary drainage region, South Africa. The results showed that the developed DFET provides designers with an easy-to-use software tool for the rapid estimation and evaluation of alternative design flood estimation methods currently available in South Africa, applicable at a site-specific scale in both gauged/ungauged and small/large catchments. In applying the developed DFET to gauged catchments, the simplified ‘small catchment’ (A ≤ 15 km²) deterministic flood estimation methods provided acceptable results when compared to the probabilistic analyses applicable to all of the catchment sizes and return periods, except for the 2-year return period. Less acceptable results were obtained with the ‘medium catchment’ (15 km² to 5 000 km²) empirical flood estimation methods. It can be concluded that no single design flood estimation method is superior to all other methods for the wide variety of flood magnitude-frequency problems encountered in practice. Practising engineers still have to apply their own experience and knowledge to these particular problems until the gap between flood research and practice in South Africa is narrowed by improving existing (outdated) design flood estimation methods and/or evaluating methods used internationally and developing new methods for application in South Africa.
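
    For readers unfamiliar with the two families of methods compared above, the sketch below contrasts a deterministic ‘small catchment’ estimate (the rational method, Q = C·i·A/3.6) with a simple probabilistic estimate obtained by fitting a log-normal distribution to annual maximum flows. The coefficients and flow series are invented, and the snippet is not a reproduction of the DFET.

        # Hypothetical deterministic vs. probabilistic design-flood estimates.
        import numpy as np
        from scipy.stats import norm

        def rational_method(runoff_coeff, rainfall_intensity_mm_h, area_km2):
            """Deterministic 'small catchment' peak flow: Q = C * i * A / 3.6 (m^3/s)."""
            return runoff_coeff * rainfall_intensity_mm_h * area_km2 / 3.6

        def lognormal_quantile(annual_maxima, return_period):
            """Illustrative probabilistic estimate: log-normal fit to annual maxima."""
            logs = np.log(annual_maxima)
            z = norm.ppf(1 - 1 / return_period)   # standard normal quantile for 1 - 1/T
            return float(np.exp(logs.mean() + z * logs.std(ddof=1)))

        peaks = np.array([42, 55, 31, 77, 60, 48, 93, 38, 66, 52])  # m^3/s, invented
        print(f"rational method: {rational_method(0.3, 45, 12):.1f} m^3/s")
        print(f"log-normal 50-year flood: {lognormal_quantile(peaks, 50):.1f} m^3/s")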

    Building a finite state automaton for physical processes using queries and counterexamples on long short-term memory models

    Neural networks (NNs) are commonly used as black-box functions: a network takes an input and produces an output without the user knowing what rules and system dynamics have produced that specific output. In some situations, such as safety-critical applications, being able to understand and validate models before applying them can be crucial. In this regard, some approaches for representing NNs in more understandable ways attempt to extract symbolic knowledge from the networks in the form of interpretable, simple systems consisting of a finite set of states and transitions, known as deterministic finite-state automata (DFA). In this thesis, we consider a rule extraction approach developed by Weiss et al. that employs the exact learning method L* to extract DFA from recurrent neural networks (RNNs) trained to classify symbolic data sequences. Our aim has been to study the practicality of applying their rule extraction approach to more complex data based on physical processes consisting of continuous values. Specifically, we experimented with datasets of varying complexity, considering both the inherent complexity of the dataset itself and the complexity introduced by the different discretization intervals used to represent the continuous data values. The datasets used in this thesis encompass sine-wave prediction datasets, sequence-value prediction datasets, and a safety-critical well-drilling pressure scenario generated using the well-drilling simulator OpenLab and the sparse identification of nonlinear dynamical systems (SINDy) algorithm. We observe that the rule extraction algorithm is able to extract simple and small DFA representations of LSTM models. On the considered datasets, the extracted DFA generally perform worse than the LSTM models they were extracted from. Overall, the performance of the extracted DFA decreases both with increasing problem complexity and with more discretization intervals. However, DFA extracted from datasets discretized using few intervals yield better results, and the algorithm can in some cases extract DFA that outperform their respective LSTM models.
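
    The snippet below illustrates, in plain Python, the two ingredients the thesis combines: discretizing a continuous signal into a symbol string and running a small DFA over that string. The DFA here is hand-written for illustration; it is not an automaton extracted by the L* procedure, and the bin edges and signal are arbitrary.

        # Discretize a continuous series into symbols, then classify with a toy DFA.
        import numpy as np

        def discretize(values, n_intervals, lo, hi):
            """Map continuous values to symbols '0'..'n-1' using equal-width bins."""
            edges = np.linspace(lo, hi, n_intervals + 1)
            return "".join(str(min(np.searchsorted(edges, v, side="right") - 1,
                                   n_intervals - 1)) for v in values)

        class DFA:
            def __init__(self, transitions, start, accepting):
                self.transitions = transitions   # (state, symbol) -> next state
                self.start, self.accepting = start, accepting

            def accepts(self, word):
                state = self.start
                for symbol in word:
                    state = self.transitions[(state, symbol)]
                return state in self.accepting

        # Toy DFA over symbols {'0','1'}: reject any string where symbol '1' occurs
        # twice in a row (e.g. "pressure stays in the high bin for two steps").
        dfa = DFA({("ok", "0"): "ok", ("ok", "1"): "warn",
                   ("warn", "0"): "ok", ("warn", "1"): "fail",
                   ("fail", "0"): "fail", ("fail", "1"): "fail"},
                  start="ok", accepting={"ok", "warn"})

        signal = np.sin(np.linspace(0, 6, 20))                 # stand-in for a series
        word = discretize(signal, n_intervals=2, lo=-1.0, hi=1.0)
        print(word, "->", "accepted" if dfa.accepts(word) else "rejected")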