24,893 research outputs found
A reusable iterative optimization software library to solve combinatorial problems with approximate reasoning
Real world combinatorial optimization problems such as scheduling are
typically too complex to solve with exact methods. Additionally, the problems
often have to observe vaguely specified constraints of different importance,
the available data may be uncertain, and compromises between antagonistic
criteria may be necessary. We present a combination of approximate reasoning
based constraints and iterative optimization based heuristics that help to
model and solve such problems in a framework of C++ software libraries called
StarFLIP++. While initially developed to schedule continuous caster units in
steel plants, we present in this paper results from reusing the library
components in a shift scheduling system for the workforce of an industrial
production plant.
Comment: 33 pages, 9 figures; for a project overview see http://www.dbai.tuwien.ac.at/proj/StarFLIP
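The combination the abstract describes, vaguely specified soft constraints plus an iterative improvement heuristic, can be sketched in a few lines. This is a hypothetical toy, not the StarFLIP++ API: the job data, the two constraint functions, and the weights are invented for illustration.

```python
import random

random.seed(0)

# Invented toy data: jobs as (duration, deadline); a schedule is an ordering.
JOBS = [(3, 4), (2, 7), (4, 9), (1, 5)]

def deadline_degree(order):
    """Soft constraint in [0, 1]: fraction of jobs finishing by their deadline."""
    t, met = 0, 0
    for j in order:
        duration, deadline = JOBS[j]
        t += duration
        met += t <= deadline
    return met / len(order)

def flowtime_degree(order):
    """Soft constraint in [0, 1]: lower total completion time scores higher."""
    t, total = 0, 0
    for j in order:
        t += JOBS[j][0]
        total += t
    worst = sum(d for d, _ in JOBS) * len(order)
    return 1 - total / worst

def aggregate(order, weights=(0.7, 0.3)):
    """Weighted compromise between the two antagonistic criteria."""
    return weights[0] * deadline_degree(order) + weights[1] * flowtime_degree(order)

def hill_climb(order, iters=200):
    """Iterative improvement: try random swaps, keep those that do not hurt."""
    best = list(order)
    for _ in range(iters):
        i, k = random.sample(range(len(best)), 2)
        cand = list(best)
        cand[i], cand[k] = cand[k], cand[i]
        if aggregate(cand) >= aggregate(best):
            best = cand
    return best

schedule = hill_climb(list(range(len(JOBS))))
```

The degrees-of-satisfaction in [0, 1] stand in for the approximate-reasoning side; any local-search metaheuristic can replace the hill climber.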
Family of 2-simplex cognitive tools and their application for decision-making and its justifications
The relevance of developing and applying cognitive graphic tools for use in
intelligent systems for data analysis, decision making, and the justification
of decisions is outlined. The cognitive graphic tool "2-simplex prism" and
examples of its usage are presented. The specifics of implementing cognitive
graphic tools that are invariant to the problem domain are described. The most
significant results are presented and discussed. Future work concerns a new
approach to rendering, cross-platform implementation, improved cognitive
features, and expansion of the n-simplex family.
Comment: 14 pages, 6 figures, conference
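A 2-simplex is a triangle, and the usual way such tools place a decision alternative inside it is via barycentric coordinates over three normalized criteria. A minimal sketch of that mapping (my illustration, not the paper's rendering code), assuming an equilateral triangle with vertices A=(0, 0), B=(1, 0), C=(0.5, sqrt(3)/2):

```python
import math

def to_2simplex(a, b, c):
    """Map three nonnegative criterion values to a point in an equilateral
    triangle with vertices A=(0, 0), B=(1, 0), C=(0.5, sqrt(3)/2)."""
    total = a + b + c
    if total == 0:
        raise ValueError("at least one criterion must be positive")
    wa, wb, wc = a / total, b / total, c / total  # barycentric coordinates
    x = wa * 0.0 + wb * 1.0 + wc * 0.5            # weighted sum of vertex x
    y = wc * math.sqrt(3) / 2                     # weighted sum of vertex y
    return x, y
```

Equal criteria land at the centroid; a dominant criterion pulls the point toward its vertex, which is what makes the rendering readable at a glance.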
Metamodel Instance Generation: A systematic literature review
Modelling and thus metamodelling have become increasingly important in
Software Engineering through the use of Model Driven Engineering. In this paper
we present a systematic literature review of instance generation techniques for
metamodels, i.e. the process of automatically generating models from a given
metamodel. We start by presenting a set of research questions that our review
is intended to answer. We then identify the main topics that are related to
metamodel instance generation techniques, and use these to initiate our
literature search. This search resulted in the identification of 34 key papers
in the area, and each of these is reviewed here and discussed in detail. The
outcome is that we are able to identify a knowledge gap in this field, and we
offer suggestions as to some potential directions for future research.
Comment: 25 pages
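The core idea being reviewed, walking a metamodel and emitting a conforming model with generated values, can be illustrated with a toy dictionary-based metamodel. This is hypothetical: real tools operate on EMF/MOF metamodels and must also handle references, multiplicities, and constraints.

```python
import random

random.seed(1)

# Invented toy metamodel (not a real EMF/MOF API): class -> attribute -> type.
METAMODEL = {
    "Library": {"name": "str", "books": "int"},
    "Book": {"title": "str", "pages": "int"},
}

def random_value(type_name):
    """Produce a random value of the declared primitive type."""
    if type_name == "int":
        return random.randint(0, 100)
    if type_name == "str":
        return "".join(random.choice("abcdef") for _ in range(5))
    raise ValueError(f"unsupported type: {type_name}")

def generate_instance(metamodel):
    """Generate a conforming model: one object per class, each attribute
    filled with a random value of its declared type (no references yet)."""
    return {cls: {attr: random_value(t) for attr, t in attrs.items()}
            for cls, attrs in metamodel.items()}

model = generate_instance(METAMODEL)
```

The reviewed techniques differ mainly in what replaces `random_value`: constraint solving, grammars, or search-based generation.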
Proving soundness of combinatorial Vickrey auctions and generating verified executable code
Using mechanised reasoning we prove that combinatorial Vickrey auctions are
soundly specified in that they associate a unique outcome (allocation and
transfers) to any valid input (bids). Having done so, we auto-generate verified
executable code from the formally defined auction. This removes a source of
error in implementing the auction design. We intend to use formal methods to
verify new auction designs. Here, our contribution is to introduce and
demonstrate the use of formal methods for auction verification in the familiar
setting of a well-known auction.
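The soundness property, a unique outcome (allocation and transfers) for every valid bid input, can be made concrete with a brute-force toy combinatorial Vickrey (VCG) auction. The items, bidders, and values below are invented for illustration and are unrelated to the paper's mechanised formalization.

```python
from itertools import product

# Invented toy combinatorial Vickrey (VCG) auction: two items, three
# bidders, each bidding a value for exactly one bundle.
ITEMS = "xy"
BIDS = {
    "alice": {frozenset("xy"): 10},
    "bob":   {frozenset("x"): 6},
    "carol": {frozenset("y"): 5},
}

def best_allocation(bidders):
    """Brute force over every item-to-bidder assignment; a bidder's bundle
    counts only if it exactly matches a bid. Returns (welfare, allocation)."""
    best, best_alloc = 0, {}
    for assign in product(list(bidders) + [None], repeat=len(ITEMS)):
        bundles = {}
        for item, b in zip(ITEMS, assign):
            if b is not None:
                bundles.setdefault(b, set()).add(item)
        value = sum(bidders[b].get(frozenset(s), 0) for b, s in bundles.items())
        if value > best:
            best, best_alloc = value, bundles
    return best, best_alloc

WELFARE, ALLOC = best_allocation(BIDS)

def vcg_payment(winner):
    """Vickrey transfer: what the others could get without the winner,
    minus what the others actually get in the chosen allocation."""
    without, _ = best_allocation({b: v for b, v in BIDS.items() if b != winner})
    winners_value = sum(BIDS[winner].get(frozenset(s), 0)
                        for b, s in ALLOC.items() if b == winner)
    return without - (WELFARE - winners_value)
```

Here bob and carol jointly outbid alice (6 + 5 > 10), and each pays the externality imposed on the others, which is exactly the quantity the formal proof pins down for arbitrary valid inputs.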
Big Data Testing Techniques: Taxonomy, Challenges and Future Trends
Big Data is reforming many industrial domains by providing decision support
through analyzing large data volumes. Big Data testing aims to ensure that Big
Data systems run smoothly and error-free while maintaining the performance and
quality of data. However, because of the diversity and complexity of data,
testing Big Data is challenging. Though numerous research efforts deal with Big
Data testing, a comprehensive review to address testing techniques and
challenges of Big Data is not available as yet. Therefore, we have
systematically reviewed the Big Data testing techniques evidence occurring in
the period 2010-2021. This paper discusses testing data processing by
highlighting the techniques used in every processing phase. Furthermore, we
discuss the challenges and future directions. Our findings show that diverse
functional, non-functional and combined (functional and non-functional) testing
techniques have been used to solve specific problems related to Big Data. At
the same time, most of the testing challenges have been faced during the
MapReduce validation phase. In addition, the combinatorial testing technique is
one of the most applied techniques in combination with other techniques (i.e.,
random testing, mutation testing, input space partitioning and equivalence
testing) to find various functional faults through Big Data testing.
Comment: 32 pages
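The combinatorial testing technique mentioned above is most often applied in its pairwise form: instead of running the full Cartesian product of configuration values, a greedy algorithm selects tests until every pair of values across parameters is covered. A toy sketch with invented Big Data job parameters:

```python
from itertools import combinations, product

# Invented configuration parameters for a hypothetical data-processing job.
PARAMS = {
    "format":     ["csv", "parquet", "avro"],
    "partitions": [1, 8],
    "compress":   ["none", "snappy"],
}

def uncovered_pairs(tests, params):
    """All cross-parameter value pairs not yet covered by any test."""
    names = list(params)
    need = set()
    for a, b in combinations(names, 2):
        for va, vb in product(params[a], params[b]):
            need.add(((a, va), (b, vb)))
    for t in tests:
        for a, b in combinations(names, 2):
            need.discard(((a, t[a]), (b, t[b])))
    return need

def greedy_pairwise(params):
    """Greedily pick, from the full Cartesian product, the test covering the
    most still-uncovered pairs, until every pair is covered."""
    all_tests = [dict(zip(params, vs)) for vs in product(*params.values())]
    chosen = []
    while True:
        need = uncovered_pairs(chosen, params)
        if not need:
            return chosen
        chosen.append(max(all_tests, key=lambda t: sum(
            1 for p in need if all(t[n] == v for n, v in p))))

suite = greedy_pairwise(PARAMS)
```

For these parameters the full product has 12 configurations, while pairwise coverage needs roughly half as many, a gap that grows quickly with more parameters.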
Disentangling agglomeration and network externalities : a conceptual typology
Agglomeration and network externalities are fuzzy concepts. When different meanings are (un)intentionally juxtaposed in analyses of the agglomeration/network-externalities menagerie, researchers may reach inaccurate conclusions about how the two interlock. Both externality types can be analytically combined, but only when one adopts a coherent approach to their conceptualization and operationalization, to which end we provide a combinatorial typology. We illustrate the typology by applying a state-of-the-art bipartite network projection detailing the presence of globalized producer-services firms in cities in 2012. This leads to two one-mode graphs that can be validly interpreted as topological renderings of agglomeration and network externalities.
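The bipartite projection step can be shown concretely: from a two-mode firm-city presence relation, derive a one-mode city-city graph whose edge weights count shared firms. The firms and cities below are invented, not the 2012 producer-services data.

```python
# Invented toy presence relation (not the 2012 producer-services data):
# which globalized firms maintain an office in which cities.
PRESENCE = {
    "firmA": {"London", "Tokyo", "Lagos"},
    "firmB": {"London", "Tokyo"},
    "firmC": {"Tokyo", "Lagos"},
}

def project_cities(presence):
    """One-mode city graph: edge weight = number of firms present in both
    cities, readable as a rendering of inter-city network externalities."""
    weights = {}
    for cities in presence.values():
        ordered = sorted(cities)
        for i in range(len(ordered)):
            for j in range(i + 1, len(ordered)):
                edge = (ordered[i], ordered[j])
                weights[edge] = weights.get(edge, 0) + 1
    return weights

city_graph = project_cities(PRESENCE)
```

The symmetric firm-firm projection (firms weighted by shared cities) yields the second one-mode graph; presumably the agglomeration side of the typology, per the abstract's pairing.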
Genetic algorithms with guided and local search strategies for university course timetabling
This article is posted here with permission from the IEEE - Copyright @ 2011 IEEE.
The university course timetabling problem (UCTP) is a combinatorial optimization problem, in which a set of events has to be scheduled into time slots and located into suitable rooms. The design of course timetables for academic institutions is a very difficult task because it is an NP-hard problem. This paper investigates genetic algorithms (GAs) with a guided search strategy and local search (LS) techniques for the UCTP. The guided search strategy is used to create offspring into the population based on a data structure that stores information extracted from good individuals of previous generations. The LS techniques use their exploitive search ability to improve the search efficiency of the proposed GAs and the quality of individuals. The proposed GAs are tested on two sets of benchmark problems in comparison with a set of state-of-the-art methods from the literature. The experimental results show that the proposed GAs are able to produce promising results for the UCTP.
This work was supported by the Engineering and Physical Sciences Research Council of the U.K. under Grant EP/E060722/1.
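A memetic scheme of this kind, offspring guided by a data structure distilled from good individuals and then polished by local search, can be sketched on a toy assignment problem. The fitness function and parameters are invented; this is not the paper's UCTP benchmarks or operators.

```python
import random

random.seed(2)

# Invented toy stand-in for the UCTP: assign N events to N time slots
# (a permutation); fitness counts events placed in their preferred slot.
N = 6

def fitness(perm):
    return sum(1 for event, slot in enumerate(perm) if slot == event)

def elite_table(pop, top=3):
    """Guided-search data structure: per event, how often each slot is used
    by the best individuals of the current population."""
    table = [[0] * N for _ in range(N)]
    for perm in sorted(pop, key=fitness, reverse=True)[:top]:
        for event, slot in enumerate(perm):
            table[event][slot] += 1
    return table

def guided_offspring(table):
    """Build a child event by event, usually taking the free slot most
    frequent in the elite table, occasionally a random free slot."""
    child, free = [], set(range(N))
    for event in range(N):
        ranked = sorted(free, key=lambda s: table[event][s], reverse=True)
        slot = ranked[0] if random.random() < 0.8 else random.choice(ranked)
        child.append(slot)
        free.remove(slot)
    return child

def local_search(perm):
    """Exploitative improvement: keep applying fitness-improving pairwise
    swaps until no swap helps any more."""
    perm, improved = list(perm), True
    while improved:
        improved = False
        for i in range(N):
            for j in range(i + 1, N):
                cand = list(perm)
                cand[i], cand[j] = cand[j], cand[i]
                if fitness(cand) > fitness(perm):
                    perm, improved = cand, True
    return perm

pop = [random.sample(range(N), N) for _ in range(10)]
for _ in range(5):
    children = [local_search(guided_offspring(elite_table(pop))) for _ in range(10)]
    pop = sorted(pop + children, key=fitness, reverse=True)[:10]
best = pop[0]
```

On this toy fitness the local search alone reaches the optimum; in the real UCTP the interplay matters because room and precedence constraints leave local search stuck in local optima that the guided offspring help escape.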