971 research outputs found

    An Empirical Study of Cohesion and Coupling: Balancing Optimisation and Disruption

    Search based software engineering has been extensively applied to the problem of finding improved modular structures that maximise cohesion and minimise coupling. However, there has, hitherto, been no longitudinal study of developers’ implementations, over a series of sequential releases. Moreover, results validating whether developers respect the fitness functions are scarce, and the potentially disruptive effect of search-based remodularisation is usually overlooked. We present an empirical study of 233 sequential releases of 10 different systems, the largest empirical study reported in the literature so far and the first longitudinal study. Our results provide evidence that developers do, indeed, respect the fitness functions used to optimise cohesion/coupling (they are statistically significantly better than arbitrary choices with p << 0.01), yet they also leave considerable room for further improvement (cohesion/coupling can be improved by 25% on average). However, we also report that optimising the structure is highly disruptive (on average more than 57% of the structure must change), while our results reveal that developers tend to avoid such disruption. Therefore, we introduce and evaluate a multi-objective evolutionary approach that minimises disruption while maximising cohesion/coupling improvement. This allows developers to balance reticence to disrupt existing modular structure, against their competing need to improve cohesion and coupling. The multi-objective approach is able to find modular structures that improve the cohesion of developers’ implementations by 22.52%, while causing an acceptably low level of disruption (within that already tolerated by developers).
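    To make the trade-off concrete, here is a minimal sketch (not the paper's implementation) of the two competing objectives: a crude cohesion proxy over a class dependency graph, and disruption measured as the fraction of classes moved relative to the developers' baseline, explored with a simple Pareto-style random search. The toy graph, baseline structure, and mutation scheme are illustrative assumptions.

```python
import random

def objectives(assign, baseline, edges):
    """Return (negated cohesion, disruption); both are minimised."""
    intra = sum(1 for a, b in edges if assign[a] == assign[b])
    cohesion = intra / len(edges)                    # crude cohesion proxy
    moved = sum(x != y for x, y in zip(assign, baseline))
    return (-cohesion, moved / len(assign))          # disruption = fraction moved

def dominates(p, q):
    return all(a <= b for a, b in zip(p, q)) and p != q

def pareto_search(baseline, edges, n_modules, iters=5000):
    front = [(objectives(baseline, baseline, edges), baseline)]
    for _ in range(iters):
        child = random.choice(front)[1][:]           # mutate a front member
        child[random.randrange(len(child))] = random.randrange(n_modules)
        score = objectives(child, baseline, edges)
        if any(dominates(f, score) or f == score for f, _ in front):
            continue
        front = [(f, s) for f, s in front if not dominates(score, f)]
        front.append((score, child))
    return front

# toy dependency graph over six classes, two modules
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
baseline = [0, 0, 1, 1, 1, 0]                        # developers' structure
for (neg_coh, disr), sol in sorted(pareto_search(baseline, edges, 2)):
    print(f"cohesion={-neg_coh:.2f} disruption={disr:.2f} modules={sol}")
```

    Each point on the resulting front trades a little more cohesion for a little more disruption, which is exactly the choice the paper argues developers should be allowed to make.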

    Open source environment to define constraints in route planning for GIS-T

    Route planning for transportation systems is strongly related to shortest path algorithms, an optimization problem extensively studied in the literature. To find the shortest path in a network, one usually assigns weights to each branch to represent the difficulty of taking that branch. The weights construct a linear preference function ordering the variety of alternatives from the most to the least attractive.
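    As a concrete illustration, the sketch below runs Dijkstra's algorithm over a network whose arcs carry several criteria, combined into a single cost by a linear preference function (a weighted sum). The toy network, criteria, and weights are hypothetical, not taken from the paper.

```python
import heapq

def plan_route(graph, source, target, weights):
    """Dijkstra over arcs carrying multiple criteria; arc cost is a
    linear preference function (weighted sum) over those criteria."""
    dist = {source: 0.0}
    prev = {}
    queue = [(0.0, source)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == target:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nxt, criteria in graph.get(node, []):
            nd = d + sum(w * c for w, c in zip(weights, criteria))
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(queue, (nd, nxt))
    path = [target]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return list(reversed(path)), dist[target]

# toy network: each arc carries (length_km, slope_penalty)
graph = {"A": [("B", (2.0, 0.1)), ("C", (1.0, 0.9))],
         "B": [("D", (1.5, 0.0))],
         "C": [("D", (1.0, 0.5))]}
print(plan_route(graph, "A", "D", weights=(1.0, 3.0)))
```

    Raising a weight penalises the corresponding criterion more strongly, which is how user-defined constraints reorder the alternatives from most to least attractive.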

    Modeling and optimization of spinning conditions for polyethersulfone hollow fiber membrane fabrication using non-dominated sorting genetic algorithm-II

    Optimization of spinning conditions plays a key role in the development of high-performance asymmetric hollow fiber membranes. However, previous studies mostly handled these spinning-condition optimization problems experimentally, varying one independent spinning condition while fixing the others. The common problem is that the preparation of hollow fiber membranes cannot be performed effectively due to inappropriate settings of the spinning conditions. Moreover, complexities in the spinning process have increased, as interaction effects between the spinning conditions, in the presence of multiple objectives, also affect the optimal spinning conditions. This is one of the main reasons why very little work has been carried out to vary spinning conditions simultaneously. Hence, to address these issues, this study applied a non-dominated sorting genetic algorithm-II (NSGA-II) methodology to optimize the spinning conditions during the fabrication of polyethersulfone (PES) ultrafiltration hollow fiber membranes for oily wastewater treatment, maximizing flux and rejection. The spinning conditions investigated were dope extrusion rate (DER), air gap length (AGL), coagulation bath temperature (CBT), bore fluid ratio (BFR), and post-treatment time (PT). First, the work focused on predicting the performance of hollow fiber membranes using design of experiments (DOE) and statistical regression as the modeling approach for flux and rejection. For the experiments, a response surface methodology (RSM) with a central composite design (CCD) was used, in which the factorial part was a fractional factorial design with resolution V, consisting overall of a combination of high and low levels, center points, and axial points. The regression models were generated with the Design Expert 6.0.5 software and found to be significant and valid. The regression models obtained were then used as the objective functions of NSGA-II to determine the optimal spinning conditions. MATLAB was used to code and execute the NSGA-II, and a non-dominated solution set was obtained and reported. The optimal spinning conditions occurred at a DER of 2.20 cm3/min, AGL of 0 cm, CBT of 30 °C, BFR (NMP/H2O) of 0/100 wt.%, and PT of 6 hours. In addition, the membrane morphology under the influence of different spinning conditions was investigated via scanning electron microscopy (SEM). The proposed NSGA-II-based optimization method offered an effective way to attain simple but robust solutions, enabling efficient production of PES ultrafiltration hollow fiber membranes for oily wastewater treatment. The optimization results contributed by NSGA-II can therefore assist engineers and researchers in making better spinning-optimization decisions for the membrane fabrication process.
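    A hedged sketch of the optimization step: two RSM-style quadratic surrogate models plugged into NSGA-II as objectives, here using the pymoo library rather than the MATLAB code used in the study. The variable bounds and polynomial coefficients are placeholders, not the fitted models reported above.

```python
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.core.problem import Problem
from pymoo.optimize import minimize

class SpinningProblem(Problem):
    """Two RSM-style surrogates (flux, rejection) over five spinning
    conditions: DER, AGL, CBT, BFR, PT. Coefficients and bounds are
    illustrative placeholders, not the study's fitted models."""
    def __init__(self):
        super().__init__(n_var=5, n_obj=2,
                         xl=np.array([1.0, 0.0, 20.0, 0.0, 1.0]),
                         xu=np.array([3.0, 10.0, 40.0, 100.0, 8.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        der, agl, cbt, bfr, pt = x.T
        flux = 50 + 10*der - 0.5*agl + 0.3*cbt - 0.1*bfr + der*pt
        rejection = 80 - 2*der + 0.8*agl + 0.05*bfr + 0.5*pt - 0.01*pt**2
        # NSGA-II minimises, so negate both objectives to maximise them
        out["F"] = np.column_stack([-flux, -rejection])

res = minimize(SpinningProblem(), NSGA2(pop_size=50),
               ("n_gen", 100), seed=1, verbose=False)
print(res.F[:5])   # a slice of the non-dominated (negated) objective values
```

    In the study the surrogates came from the DOE/CCD regression; swapping the placeholder polynomials for the fitted models is the only change needed.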

    Software restructuring: understanding longitudinal architectural changes and refactoring

    The complexity of software systems increases as the systems evolve. As degradation of the system's structure accumulates, maintenance effort and defect-proneness tend to increase. In addition, developers often opt for sub-optimal solutions in order to achieve short-term goals, a phenomenon that has recently been called technical debt. In this context, software restructuring serves as a way to alleviate and/or prevent structural degradation. Restructuring of software is usually performed at either higher or lower levels of granularity, where the former indicates broader changes to the system's structural architecture and the latter indicates refactorings performed on fewer, more localised code elements. Although tools to assist architectural changes and refactoring are available, there is still no evidence that these approaches are widely adopted by practitioners. Hence, an understanding is needed of how developers perform architectural changes and refactoring on a daily basis and in the context of the software development processes they adopt. Current software development is iterative and incremental, with short cycles of development and release; thus, tools and processes that enable this development model, such as continuous integration and code review, are widespread among software engineering practitioners. Hence, this thesis investigates how developers perform longitudinal and incremental architectural changes and refactoring during code review, through a wide range of empirical studies that consider different moments of the development lifecycle, different approaches, different automated tools, and different analysis mechanisms. Finally, the observations and conclusions drawn from these empirical investigations extend the existing knowledge of how developers restructure software systems, so that future studies can leverage this knowledge to propose new tools and approaches that better fit developers' working routines and development processes.

    A Comprehensive Survey on Particle Swarm Optimization Algorithm and Its Applications

    Particle swarm optimization (PSO) is a heuristic global optimization method, originally proposed by Kennedy and Eberhart in 1995. It is now one of the most commonly used optimization techniques. This survey presents a comprehensive investigation of PSO. On the one hand, we review advances in PSO, including its modifications (quantum-behaved PSO, bare-bones PSO, chaotic PSO, and fuzzy PSO), population topologies (fully connected, von Neumann, ring, star, random, etc.), hybridizations (with genetic algorithms, simulated annealing, Tabu search, artificial immune systems, ant colony algorithms, artificial bee colony, differential evolution, harmonic search, and biogeography-based optimization), extensions (to multiobjective, constrained, discrete, and binary optimization), theoretical analysis (parameter selection and tuning, and convergence analysis), and parallel implementations (multicore, multiprocessor, GPU, and cloud computing). On the other hand, we survey applications of PSO in the following fields: electrical and electronic engineering, automation control systems, communication theory, operations research, mechanical engineering, fuel and energy, medicine, chemistry, and biology. We hope this survey will be beneficial for researchers studying PSO algorithms.
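    For reference, a minimal sketch of the canonical global-best PSO with an inertia weight, applied to the sphere function. The parameter values (w, c1, c2) are commonly cited defaults rather than those of any specific variant from the survey.

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=200, w=0.72, c1=1.49, c2=1.49):
    """Canonical global-best PSO (Kennedy & Eberhart) with inertia weight."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda x: sum(v * v for v in x)
print(pso(sphere, dim=5, bounds=(-5.0, 5.0)))
```

    Most of the modifications the survey covers change exactly one ingredient of this loop: the velocity update (quantum-behaved, bare-bones), the neighbourhood that defines gbest (topologies), or the host algorithm around it (hybridizations).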

    Synthesis of Probabilistic Models for Quality-of-Service Software Engineering

    An increasingly used method for the engineering of software systems with strict quality-of-service (QoS) requirements involves the synthesis and verification of probabilistic models for many alternative architectures and instantiations of system parameters. Using manual trial-and-error or simple heuristics for this task often produces suboptimal models, while the exhaustive synthesis of all possible models is typically intractable. The EvoChecker search-based software engineering approach presented in this paper addresses these limitations by employing evolutionary algorithms to automate the model synthesis process and to significantly improve its outcome. EvoChecker can be used to synthesise the Pareto-optimal set of probabilistic models associated with the QoS requirements of a system under design, and to support the selection of a suitable system architecture and configuration. EvoChecker can also be used at runtime, to drive the efficient reconfiguration of a self-adaptive software system. We evaluate EvoChecker on several variants of three systems from different application domains, and show its effectiveness and applicability.
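    A simplified illustration of the idea: for a toy parametric model whose QoS attributes have closed forms (EvoChecker instead obtains them by invoking a probabilistic model checker on each candidate), enumerate candidate configurations and keep the Pareto-optimal set over reliability and cost. The providers, probabilities, and costs below are hypothetical.

```python
from itertools import product

# Toy parametric model: a service retries a request up to n times; each
# attempt succeeds with probability p and costs c. For this simple DTMC
# the QoS attributes have closed forms.
providers = {"fast": (0.7, 3.0), "cheap": (0.5, 1.0)}   # hypothetical (p, c)

def qos(provider, n):
    p, c = providers[provider]
    reliability = 1 - (1 - p) ** n                # P(some attempt succeeds)
    expected_cost = c * (1 - (1 - p) ** n) / p    # mean attempts * unit cost
    return reliability, expected_cost

def dominates(a, b):   # maximise reliability, minimise cost
    return a[0] >= b[0] and a[1] <= b[1] and a != b

candidates = [(prov, n) for prov, n in product(providers, range(1, 6))]
scored = {cand: qos(*cand) for cand in candidates}
front = [c for c in candidates
         if not any(dominates(scored[o], scored[c]) for o in candidates)]
for c in sorted(front, key=lambda c: scored[c][1]):
    print(c, "reliability=%.3f cost=%.2f" % scored[c])
```

    EvoChecker replaces the exhaustive enumeration with an evolutionary search, which is what keeps the approach tractable when the configuration space is far too large to enumerate.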

    Evolutionary Search Techniques with Strong Heuristics for Multi-Objective Feature Selection in Software Product Lines

    Software design is a process of trading off competing objectives. If the user objective space is rich, then we should use optimizers that can fully exploit that richness. For example, this study configures software product lines (expressed as feature models) using various search-based software engineering methods. Our main result is that as we increase the number of optimization objectives, the methods in widespread use (e.g. NSGA-II, SPEA2) perform much worse than IBEA (Indicator-Based Evolutionary Algorithm). IBEA works best because it makes the most use of user preference knowledge. Hence it does better on the standard measures (hypervolume and spread), but it also generates far more products with zero violations of domain constraints. We also present significant improvements to IBEA's performance by employing three strong heuristic techniques that we call PUSH, PULL, and seeding. The PUSH technique forces the evolutionary search to respect certain rules and dependencies defined by the feature models, while the PULL technique gives higher weight to constraint satisfaction as an optimization objective and thus achieves a higher percentage of fully-compliant configurations within shorter runtimes. The seeding technique helps guide very large feature models to correct configurations very early in the optimization process. Our conclusion is that the methods applied in search-based software engineering need to be chosen carefully, particularly when studying complex decision spaces with many optimization objectives. We also conclude that search methods must be customized to fit the problem at hand; specifically, the evolutionary search must respect domain constraints.
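    The PUSH/PULL ideas can be sketched compactly: PUSH repairs configurations to respect feature-model rules, while PULL weights constraint violations heavily in the score. The sketch below scalarises the objectives for brevity (IBEA itself uses indicator-based multi-objective selection), and the feature model, costs, and values are invented for illustration.

```python
import random

# Hypothetical feature model: 8 optional features with cross-tree
# constraints expressed as implications and mutual exclusions.
REQUIRES = [(0, 1), (2, 3)]      # feature 0 requires 1; 2 requires 3
EXCLUDES = [(4, 5)]              # features 4 and 5 are mutually exclusive
COST = [3, 1, 4, 1, 5, 9, 2, 6]
VALUE = [5, 3, 8, 2, 7, 4, 6, 1]

def violations(cfg):
    v = sum(1 for a, b in REQUIRES if cfg[a] and not cfg[b])
    return v + sum(1 for a, b in EXCLUDES if cfg[a] and cfg[b])

def fitness(cfg, pull_weight=10.0):
    # PULL: constraint satisfaction dominates the scalarised score
    cost = sum(c for c, s in zip(COST, cfg) if s)
    value = sum(v for v, s in zip(VALUE, cfg) if s)
    return pull_weight * violations(cfg) + cost - value

def repair(cfg):
    # PUSH-style: switch on required features, resolve exclusions
    for a, b in REQUIRES:
        if cfg[a]:
            cfg[b] = 1
    for a, b in EXCLUDES:
        if cfg[a] and cfg[b]:
            cfg[b] = 0
    return cfg

seed = repair([1] * 8)           # seeding: begin from a repaired configuration
candidates = [seed] + [repair([random.randint(0, 1) for _ in range(8)])
                       for _ in range(500)]
best = min(candidates, key=fitness)
print(best, "violations:", violations(best), "fitness:", fitness(best))
```

    Even in this toy form, the repair step keeps the search inside (or near) the valid region, which is why the paper finds that respecting domain constraints matters more than the choice of raw objective values.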

    Assessing the effectiveness of managed lane strategies for the rapid deployment of cooperative adaptive cruise control technology

    Connected and Automated Vehicle (C/AV) technologies are expanding fast in the transportation and automotive markets. One highly researched example of C/AV technology is the Cooperative Adaptive Cruise Control (CACC) system, which exploits various vehicular sensors and vehicle-to-vehicle (V2V) communication to automate longitudinal vehicle control. The operational strategies and network-level impacts of CACC have not been thoroughly discussed, especially in near-term deployment scenarios where the Market Penetration Rate (MPR) is relatively low. Therefore, this study aims to assess CACC's impacts in combination with managed lane strategies to provide insights for CACC deployment. The proposed simulation framework incorporates 1) the Enhanced Intelligent Driver Model; 2) a Nakagami-based radio propagation model; and 3) a multi-objective optimization (MOOP)-based CACC control algorithm. The operational impacts of CACC are assessed under four managed lane strategies: mixed traffic (UML), an HOV (High Occupancy Vehicle)-CACC lane (MML), a CACC dedicated lane (DL), and a CACC dedicated lane with access control (DLA). Simulation results show that the introduction of CACC, even at 10% MPR, is able to improve network throughput by 7% in the absence of any managed lane strategy. Segment travel times for both CACC and non-CACC vehicles are reduced. The break-even point for implementing a dedicated CACC lane is 30% MPR, below which priority usage of the current HOV lane for CACC traffic is found to be more appropriate. It is also observed that the DLA strategy is able to consistently increase the percentage of platooned CACC vehicles as MPR grows. The percentage of CACC vehicles within a platoon reaches 52% and 46% for DL and DLA, respectively. Regarding the impact of V2V communication, the DLA strategy provides more consistent transmission density, in terms of median and variance, when MPR reaches 20% or above. Moreover, the performance of the MOOP-based cooperative driving is examined. With an average 75% likelihood of obtaining a feasible solution, the MOOP outperforms its counterpart that minimizes the headway objective alone. Under the UML, MML, and DL strategies, the proposed control algorithm achieves a balanced spread among the four objectives for each CACC vehicle. Under the DLA strategy, however, the probability of obtaining a feasible solution falls to 60%, due to the larger platoons formed under DLA, which constrain the feasible region by introducing more dimensions into the search space. In summary, UML or MML is the preferred managed lane strategy for improving traffic performance when MPR is less than 30%. When MPR reaches 30% or above, DL and DLA can improve CACC performance by facilitating platoon formation. If available, priority access to an existing HOV lane can be adopted to encourage adoption of CACC when the technology becomes publicly available.
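    For reference, the standard Intelligent Driver Model car-following equation used as the basis of such frameworks, as a worked example. The study uses an Enhanced IDM variant tuned for CACC platooning, and the parameter values below are common defaults, not those calibrated in the paper.

```python
import math

def idm_acceleration(v, v_lead, gap, v0=33.3, T=1.0, a_max=1.4, b=2.0,
                     delta=4, s0=2.0):
    """Standard IDM acceleration (m/s^2): a_max * [1 - (v/v0)^delta
    - (s*/gap)^2], with desired gap s* = s0 + max(0, v*T + v*dv /
    (2*sqrt(a_max*b))). Defaults are common textbook values."""
    dv = v - v_lead                                  # approach rate
    s_star = s0 + max(0.0, v * T + v * dv / (2 * math.sqrt(a_max * b)))
    return a_max * (1 - (v / v0) ** delta - (s_star / gap) ** 2)

# follower at 25 m/s closing on a leader at 22 m/s with a 30 m gap
print(f"{idm_acceleration(25.0, 22.0, 30.0):.2f} m/s^2")
```

    The Enhanced IDM used in the study adjusts this response for V2V-informed platooning; the MOOP controller then selects accelerations that balance the four driving objectives rather than following the car-following law alone.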