Mining Frequent Itemsets Using Genetic Algorithm
In general, frequent itemsets are generated from large data sets by applying
association rule mining algorithms such as Apriori, Partition, Pincer-Search,
Incremental, and the Border algorithm, all of which can take considerable
computation time to enumerate every frequent itemset. Using a Genetic Algorithm
(GA) can improve this scenario. The major advantage of using a GA in the
discovery of frequent itemsets is that it performs a global search, with lower
time complexity than the other algorithms. The main aim of this paper is to
find all the frequent itemsets in a given data set using a genetic algorithm.
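The approach described above can be sketched in miniature. This is a hypothetical illustration, not the paper's implementation: the transaction data, the fitness weighting, and the GA parameters are all assumptions. Each chromosome is a bit vector over the item universe, and fitness rewards candidates whose support meets the minimum threshold.

```python
import random

# Toy transaction database (an assumption for illustration); items are 0..4.
TRANSACTIONS = [{0, 1, 2}, {0, 2}, {1, 2, 3}, {0, 1, 2, 4}, {2, 3}]
N_ITEMS = 5
MIN_SUPPORT = 3  # an itemset is "frequent" if it occurs in >= 3 transactions

def support(itemset):
    """Number of transactions containing every item of the candidate."""
    return sum(itemset <= t for t in TRANSACTIONS)

def fitness(chromosome):
    """Reward frequent candidates; larger itemsets that stay frequent score higher."""
    itemset = {i for i, bit in enumerate(chromosome) if bit}
    if not itemset:
        return 0
    s = support(itemset)
    return s + len(itemset) if s >= MIN_SUPPORT else s

def evolve(pop_size=20, generations=30, p_mut=0.1, seed=0):
    """Tournament selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_ITEMS)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            # Pick two parents via 3-way tournaments, recombine, then mutate.
            a, b = (max(rng.sample(pop, 3), key=fitness) for _ in range(2))
            cut = rng.randrange(1, N_ITEMS)
            child = [bit ^ (rng.random() < p_mut) for bit in a[:cut] + b[cut:]]
            nxt.append(child)
        pop = nxt
    # Report the frequent itemsets present in the final population.
    found = set()
    for chrom in pop:
        itemset = frozenset(i for i, bit in enumerate(chrom) if bit)
        if itemset and support(itemset) >= MIN_SUPPORT:
            found.add(itemset)
    return found
```

Every itemset the sketch returns is frequent by construction; as with any GA, completeness over all frequent itemsets is not guaranteed in a single run.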
Optimisation of Mobile Communication Networks - OMCO NET
The mini-conference “Optimisation of Mobile Communication Networks” focuses on advanced methods for search and optimisation applied to wireless communication networks. It is sponsored by the Research & Enterprise Fund of Southampton Solent University.
The conference strives to widen knowledge of advanced search methods capable of optimising wireless communication networks. The aim is to provide a forum for the exchange of recent knowledge, new ideas and trends in this progressive and challenging area. The conference will popularise new, successful approaches to resolving hard tasks such as minimisation of transmit power, and cooperative and optimal routing.
A Multi-Level Framework for the Detection, Prioritization and Testing of Software Design Defects
Large-scale software systems exhibit high complexity and become difficult to maintain. In fact, it has been reported that software cost dedicated to maintenance and evolution activities is more
than 80% of the total software costs. In particular, object-oriented software systems need to
follow some traditional design principles such as data abstraction, encapsulation, and modularity.
However, some of these non-functional requirements can be violated by developers for many
reasons, such as inexperience with object-oriented design principles or deadline pressure. The high
cost of maintenance activities could potentially be greatly reduced by providing automatic or
semi-automatic solutions that increase a system's comprehensibility, adaptability and extensibility
and help avoid bad practices.
The detection of refactoring opportunities focuses on the detection of bad smells, also called
antipatterns, which have been recognized as the design situations that may cause software
failures indirectly. The correction of one bad smell may influence other bad smells. Thus, the
order of fixing bad smells is important to reduce the effort and maximize the refactoring benefits.
However, very few studies have addressed the problem of finding the optimal sequence in which
refactoring opportunities, such as bad smells, should be ordered. A few other studies have tried to
prioritize refactoring opportunities based on the types of bad smells, to determine their severity.
However, the correction of severe bad smells may require a high effort which should be
optimized, and the relationships between the different bad smells are not considered during the
prioritization process.
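As a toy illustration of the prioritization idea discussed above (the smell names, severity and effort figures, and the benefit-per-effort scoring are all hypothetical, not taken from the dissertation), corrections could be ranked by severity per unit of fix effort, crediting smells whose correction also influences other smells:

```python
# Hypothetical smell records: (name, severity, estimated fix effort,
# set of other smells this one aggravates).
SMELLS = [
    ("GodClass",    9, 5, {"FeatureEnvy"}),
    ("FeatureEnvy", 4, 2, set()),
    ("DataClass",   3, 1, set()),
    ("LongMethod",  6, 3, {"DataClass"}),
]

def priority(smell):
    name, severity, effort, influences = smell
    # Fixing a smell that aggravates others yields extra benefit; the
    # weight of 1 per influenced smell is an illustrative assumption.
    return (severity + len(influences)) / effort

def fix_order(smells):
    """Order corrections by descending benefit per unit of effort."""
    return [s[0] for s in sorted(smells, key=priority, reverse=True)]
```

This greedy ranking is only a stand-in: it captures the effort/severity trade-off but, unlike the search-based formulation studied in the dissertation, it does not explore orderings globally.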
The main goal of this research is to help software engineers to refactor large-scale systems with a
minimum effort and few interactions including the detection, management and testing of
refactoring opportunities. We report the results of an empirical study with an implementation of
our bi-level approach. The obtained results provide evidence to support the claim that our
proposal is more efficient, on average, than existing techniques based on a benchmark of 9 open
source systems and 1 industrial project. We have also evaluated the relevance and usefulness of
the proposed bi-level framework for software engineers to improve the quality of their systems
and support the detection of transformation errors by generating efficient test cases.
Ph.D. dissertation, Information Systems Engineering, College of Engineering and Computer Science, University of Michigan-Dearborn. http://deepblue.lib.umich.edu/bitstream/2027.42/136075/1/Dilan_Sahin_Final Dissertation.pdf
A multi-objective routing strategy for QoS and energy awareness in software-defined networks
© 2018 IEEE. Personal use of this material is permitted; permission from IEEE must be obtained for all other uses. http://ieeexplore.ieee.org/document/8013750/
Energy consumption is a key concern in the deployment and operation of current data networks, for which Software-Defined Networks (SDN) have become a promising alternative. Although several works have been proposed to improve energy efficiency, these techniques may lead to performance degradation when QoS requirements are neglected. Inspired by this problem, this letter introduces a new routing strategy that jointly considers QoS requirements and energy awareness in SDN with in-band control traffic. To that end, we present a complete formulation of the optimization problem and implement a Multi-Objective Evolutionary Algorithm. Simulation results validate the performance improvement on critical network parameters. Peer reviewed. Postprint (published version).
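A core ingredient of any such multi-objective evolutionary algorithm is Pareto dominance over the competing objectives. A minimal sketch, assuming two minimised objectives per route named (delay, energy) purely for illustration (the letter's actual objective definitions are not reproduced here):

```python
def dominates(a, b):
    """True if route a is no worse than b in every objective and strictly
    better in at least one. Each route is an objective tuple, e.g.
    (delay_ms, energy_mw), with both objectives minimised."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(routes):
    """Keep the non-dominated routes, i.e. the QoS/energy trade-off frontier."""
    return [r for r in routes if not any(dominates(o, r) for o in routes if o != r)]
```

An evolutionary algorithm would evolve a population of candidate routes and retain this non-dominated set, from which an operator can pick the preferred QoS/energy trade-off.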
Genetic Land - Modeling land use change using evolutionary algorithms
Future land use configurations provide valuable knowledge for policy makers and economic agents, especially under expected environmental changes such as decreasing rainfall or increasing temperatures, or scenarios of policy guidance such as carbon sequestration enforcement. In this paper, modelling land use change is designed as an optimization problem in which landscapes (land uses) are generated through the use of genetic algorithms (GAs), according to an objective function (e.g. minimization of soil erosion, or maximization of carbon sequestration) and a set of local restrictions (e.g. soil depth, water availability, or landscape structure). GAs are search and optimization procedures based on the mechanics of natural selection and genetics. The GA starts with a population of random individuals, each corresponding to a particular candidate solution to the problem. The best solutions are propagated: they are mated with each other and originate “offspring solutions” which randomly combine the characteristics of each “parent”. The repeated application of these operations leads to a dynamic system that emulates the evolutionary mechanisms that occur in nature. The fittest individuals survive and propagate their traits to future generations, while unfit individuals have a tendency to die and become extinct (Goldberg, 1989). Applications of GAs to land use planning have been reported (Brookes, 2001; Ducheyne et al., 2001). However, long-term planning with a time-span component has not yet been addressed. GeneticLand, the GA for land use generation, works on a region represented by a bi-dimensional array of cells. For each cell, there are a number of possible land uses (U1, U2, ..., Un). The task of the GA is to search for an optimal assignment of these land uses to the cells, evolving the landscape patterns that are most suitable for satisfying the objective function over a certain time period (e.g. 50 years in the future).
GeneticLand develops under a multi-objective function: (i) Minimization of soil erosion – each solution is validated by applying the USLE, with the best solution being the one that minimizes the landscape soil erosion value; (ii) Maximization of carbon sequestration – each solution is validated by applying atmospheric CO2 carbon uptake estimates, with the best solution being the one that maximizes the landscape carbon uptake; and (iii) Maximization of the landscape economic value – each solution is validated by applying an economic value (derived from expert judgment), with the best solution being the one that maximizes the landscape economic value. As an optimization problem, not all possible land use assignments are feasible. GeneticLand considers two sets of restrictions that must be met: (i) physical constraints (soil type suitability, slope, rainfall-evapotranspiration ratio, and a soil wetness index) and (ii) landscape ecology restrictions at several levels (minimum patch area, land use adjacency index and landscape contagion index). The former assures physical feasibility and the latter the spatial coherence of the landscape. The physical and landscape restrictions were derived from the analysis of past events based on a time series of Landsat images (1985-2003), in order to identify the drivers of land use change and structure. Since the problem has multiple objectives, the GA integrates multi-objective extensions allowing it to evolve a set of non-dominated solutions. A (1+1) evolution strategy is used, due to the need to accommodate the very large solution space. Current applications have about 1,000 decision variables, while the problem analysed by GeneticLand has almost 111,000, generated by a landscape with 333 × 333 discrete pixels. GeneticLand is developed and validated for a Mediterranean type landscape located in southern Portugal.
Future climate triggers, such as an increase in intense rainfall episodes, are accommodated to simulate climate change. This paper presents: (1) the formulation of land use modelling as an optimization problem; (2) the formulation of the GA for the explicit spatial domain; (3) the land use constraints derived for a Mediterranean landscape; (4) the results illustrating conflicting objectives; and (5) the limitations encountered.
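The (1+1) evolution strategy mentioned above can be sketched on a toy grid. The land-use labels, per-use scores, and single aggregated objective below are illustrative stand-ins for the USLE, carbon-uptake, and economic models described in the abstract, not GeneticLand's actual formulation:

```python
import random

LAND_USES = ["forest", "pasture", "cereal", "shrub"]
# Illustrative per-use scores standing in for the erosion/carbon/economic models.
SCORE = {"forest": 3.0, "shrub": 2.0, "pasture": 1.0, "cereal": 0.5}

def fitness(grid):
    """Stand-in single objective: total landscape score (higher is better)."""
    return sum(SCORE[cell] for row in grid for cell in row)

def evolve_1plus1(size=8, steps=500, seed=1):
    """(1+1) evolution strategy: mutate one parent, keep the child only if it
    is at least as fit. The single-parent scheme keeps memory and evaluation
    costs low, which suits very large grids such as 333 x 333 cells."""
    rng = random.Random(seed)
    parent = [[rng.choice(LAND_USES) for _ in range(size)] for _ in range(size)]
    best = fitness(parent)
    for _ in range(steps):
        child = [row[:] for row in parent]
        r, c = rng.randrange(size), rng.randrange(size)
        child[r][c] = rng.choice(LAND_USES)  # mutate one cell's land use
        f = fitness(child)
        if f >= best:
            parent, best = child, f
    return parent, best
```

The real problem additionally enforces physical and landscape-ecology constraints, which would enter here as a feasibility check on each mutated child before it is accepted.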
Handling High-Level Model Changes Using Search Based Software Engineering
Model-Driven Engineering (MDE) considers models as first-class artifacts during the software
lifecycle. The number of available tools, techniques, and approaches for MDE is increasing as its
use gains traction in driving quality and controlling cost in the evolution of large software systems.
Software models, defined as code abstractions, are iteratively refined, restructured, and evolved.
This is due to many reasons such as fixing defects in design, reflecting changes in requirements,
and modifying a design to enhance existing features.
In this work, we focus on four main problems related to the evolution of software models: 1) the
detection of applied model changes, 2) the merging of parallel evolved models, 3) the detection of design
defects in the merged model, and 4) the recommendation of new changes to fix defects in software
models.
Regarding the first contribution, an a-posteriori multi-objective change detection approach has been
proposed for evolved models. The changes are expressed in terms of atomic and composite
refactoring operations. The majority of existing approaches detect atomic changes but do not
adequately address composite changes, which mask atomic operations in intermediate models.
For the second contribution, several approaches exist to construct a merged model by
incorporating all non-conflicting operations of evolved models. Conflicts arise when the
application of one operation disables the applicability of another one. The essence of the problem
is to identify and prioritize conflicting operations based on importance and context – a gap in
existing approaches. This work proposes a multi-objective formulation of model merging that
aims to maximize the number of successfully applied merged operations.
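The merging formulation can be illustrated with a toy greedy baseline. The operation names, conflict pairs, and priorities below are hypothetical, and the dissertation's approach is a multi-objective search rather than this greedy stand-in:

```python
# Hypothetical edit operations from two evolved model versions, with the
# pairs that conflict (applying one disables the other) and priorities
# reflecting each operation's importance in its context.
OPS = ["renameA", "moveB", "extractC", "deleteB", "inlineC"]
CONFLICTS = {("moveB", "deleteB"), ("extractC", "inlineC")}
PRIORITY = {"renameA": 1, "moveB": 3, "extractC": 2, "deleteB": 1, "inlineC": 5}

def in_conflict(a, b):
    return (a, b) in CONFLICTS or (b, a) in CONFLICTS

def merge(ops):
    """Greedily admit operations in descending priority, skipping any that
    conflict with an operation already admitted."""
    applied = []
    for op in sorted(ops, key=PRIORITY.get, reverse=True):
        if not any(in_conflict(op, done) for done in applied):
            applied.append(op)
    return applied
```

A search-based formulation would instead explore admission orders globally, since a greedy pass can discard a low-priority operation whose inclusion would have enabled a larger overall set.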
For the third and fourth contributions, the majority of existing works focuses on refactoring at
source code level, and does not exploit the benefits of software design optimization at model
level. However, refactoring at model level is inherently more challenging due to difficulty in
assessing the potential impact on structural and behavioral features of the software system. This requires analysis of class and activity diagrams to appraise the overall system quality, feasibility,
and inter-diagram consistency. This work focuses on designing, implementing, and evaluating a
multi-objective refactoring framework for detection and fixing of design defects in software
models.
Ph.D. dissertation, Information Systems Engineering, College of Engineering and Computer Science, University of Michigan-Dearborn. http://deepblue.lib.umich.edu/bitstream/2027.42/136077/1/Usman Mansoor Final.pdf
From Social Simulation to Integrative System Design
As the recent financial crisis showed, today there is a strong need to gain
an "ecological perspective" of all relevant interactions in
socio-economic-techno-environmental systems. For this, we suggest setting up a
network of Centers for integrative systems design, which shall be able to run
all potentially relevant scenarios, identify causality chains, explore feedback
and cascading effects for a number of model variants, and determine the
reliability of their implications (given the validity of the underlying
models). They will be able to detect possible negative side effects of policy
decisions before they occur. The Centers belonging to this network of
Integrative Systems Design Centers would be focused on a particular field, but
they would be part of an attempt to eventually cover all relevant areas of
society and economy and integrate them within a "Living Earth Simulator". The
results of all research activities of such Centers would be turned into
informative input for political Decision Arenas. For example, Crisis
Observatories (for financial instabilities, shortages of resources,
environmental change, conflict, spreading of diseases, etc.) would be connected
with such Decision Arenas for the purpose of visualization, in order to make
complex interdependencies understandable to scientists, decision-makers, and
the general public.
Comment: 34 pages, Visioneer White Paper, see http://www.visioneer.ethz.c