The robust single machine scheduling problem with uncertain release and processing times
In this work, we study the single machine scheduling problem with uncertain
release times and processing times of jobs. We adopt a robust scheduling
approach, in which the measure of robustness to be minimized for a given
sequence of jobs is the worst-case objective function value from the set of all
possible realizations of release and processing times. The objective function
value is the total flow time of all jobs. We discuss some important properties
of robust schedules for zero and non-zero release times, and illustrate the
added complexity in robust scheduling given non-zero release times. We propose
heuristics based on variable neighborhood search and iterated local search to
solve the problem and generate robust schedules. The algorithms are tested and
their solution performance is compared with optimal solutions or lower bounds
through numerical experiments based on synthetic data.
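For a fixed job sequence, the total flow time of a single scenario is easy to evaluate. A minimal sketch under interval uncertainty follows; purely for illustration, it approximates the worst case by enumerating extreme release-time scenarios with processing times at their upper bounds, which is a brute-force stand-in rather than the paper's exact worst-case characterization:

```python
from itertools import product

def total_flow_time(seq, release, proc):
    """Total flow time of a job sequence for one scenario."""
    t, flow = 0, 0
    for j in seq:
        t = max(t, release[j]) + proc[j]   # completion time of job j
        flow += t - release[j]             # flow time = completion - release
    return flow

def worst_case_flow(seq, r_lo, r_hi, p_hi):
    """Illustrative worst case: enumerate extreme release scenarios,
    processing times fixed at their upper bounds (an assumption made
    for the sketch, not the paper's characterization)."""
    n = len(seq)
    worst = 0
    for choice in product((0, 1), repeat=n):
        r = [r_hi[j] if c else r_lo[j] for j, c in enumerate(choice)]
        worst = max(worst, total_flow_time(seq, r, p_hi))
    return worst
```

A local-search heuristic such as VNS or ILS would use an evaluation like `worst_case_flow` as its objective when comparing neighboring sequences.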
Addressing the Crisis in Fundamental Physics
I present the case for fundamental physics experiments in space playing an
important role in addressing the current "dark energy" crisis. If cosmological
observations continue to favor a value of the dark energy equation of state
parameter w=-1, with no change over cosmic time, then we will have difficulty
understanding this new fundamental physics. We will then face a very real risk
of stagnation unless we detect some other experimental anomaly. The advantages
of space-based experiments could prove invaluable in the search for a more
complete understanding of dark energy. This talk was delivered at the start of
the Fundamental Physics Research in Space Workshop in May 2006.
Comment: 11 pages. Opening talk presented at the 2006 Workshop on Fundamental Physics in Space. Submitted to Int'l Journal of Modern Physics.
DiffNodesets: An Efficient Structure for Fast Mining Frequent Itemsets
Mining frequent itemsets is an essential problem in data mining and plays an
important role in many data mining applications. In recent years, some itemset
representations based on node sets have been proposed, which have shown to be
very efficient for mining frequent itemsets. In this paper, we propose
DiffNodeset, a novel and more efficient itemset representation, for mining
frequent itemsets. Based on the DiffNodeset structure, we present an efficient
algorithm, named dFIN, to mine frequent itemsets. To achieve high efficiency,
dFIN finds frequent itemsets using a set-enumeration tree with a hybrid search
strategy and directly enumerates frequent itemsets without candidate generation
in some cases. To evaluate the performance of dFIN, we have conducted
extensive experiments to compare it against existing leading algorithms on
a variety of real and synthetic datasets. The experimental results show that
dFIN is significantly faster than these leading algorithms.
Comment: 22 pages, 13 figures
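For context, the task dFIN solves can be illustrated with a minimal level-wise (Apriori-style) miner. This is a simple baseline for the frequent-itemset problem, not the DiffNodeset or dFIN algorithm itself:

```python
from collections import Counter

def frequent_itemsets(transactions, min_support):
    """Baseline level-wise miner: return every itemset contained in at
    least min_support transactions, with its support count."""
    tsets = [frozenset(t) for t in transactions]
    # frequent 1-itemsets
    counts = Counter(item for t in tsets for item in t)
    level = {frozenset([i]) for i, c in counts.items() if c >= min_support}
    result = {}
    k = 1
    while level:
        for s in level:
            result[s] = sum(1 for t in tsets if s <= t)
        # candidate (k+1)-itemsets: unions of frequent k-itemsets
        cands = {a | b for a in level for b in level if len(a | b) == k + 1}
        level = {c for c in cands
                 if sum(1 for t in tsets if c <= t) >= min_support}
        k += 1
    return result
```

Structures like DiffNodesets exist precisely because the repeated transaction scans in this baseline become the bottleneck on large datasets.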
A Unified Approach to the Classical Statistical Analysis of Small Signals
We give a classical confidence belt construction which unifies the treatment
of upper confidence limits for null results and two-sided confidence intervals
for non-null results. The unified treatment solves a problem (apparently not
previously recognized) that the choice of upper limit or two-sided intervals
leads to intervals which are not confidence intervals if the choice is based on
the data. We apply the construction to two related problems which have recently
been a battle-ground between classical and Bayesian statistics: Poisson
processes with background, and Gaussian errors with a bounded physical region.
In contrast with the usual classical construction for upper limits, our
construction avoids unphysical confidence intervals. In contrast with some
popular Bayesian intervals, our intervals eliminate conservatism (frequentist
coverage greater than the stated confidence) in the Gaussian case and reduce it
to a level dictated by discreteness in the Poisson case. We generalize the
method in order to apply it to analysis of experiments searching for neutrino
oscillations. We show that this technique both gives correct coverage and is
powerful, while other classical techniques that have been used by neutrino
oscillation search experiments fail one or both of these criteria.
Comment: 40 pages, 15 figures. Changes 15-Dec-99 to agree more closely with the published version. A few small changes, plus the two substantive changes made in proof in 1998: 1) the definition of "sensitivity" in Sec. V(C), which was inconsistent with our actual definition in Sec. VI; 2) the "Note added in proof" at the end of the Conclusion.
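The unified construction can be sketched for the Poisson-with-background case: for each signal mean mu, accept the counts n with the largest likelihood ratio P(n|mu+b)/P(n|mu_best+b) until the desired coverage is reached, then invert by scanning mu. The grid step and the cutoff on n below are illustrative choices, not part of the method:

```python
from math import exp, factorial

def pois(n, mean):
    """Poisson probability of observing n counts with the given mean."""
    return mean ** n * exp(-mean) / factorial(n)

def acceptance_interval(mu, b, cl=0.90, n_max=50):
    """Acceptance region for signal mu over background b, ordering counts
    by the likelihood ratio P(n|mu+b)/P(n|mu_best+b), mu_best = max(0, n-b)."""
    def rank(n):
        mu_best = max(0.0, n - b)
        return pois(n, mu + b) / pois(n, mu_best + b)
    order = sorted(range(n_max + 1), key=rank, reverse=True)
    accepted, cov = set(), 0.0
    for n in order:
        accepted.add(n)
        cov += pois(n, mu + b)
        if cov >= cl:
            break
    return accepted

def confidence_interval(n_obs, b, cl=0.90, mu_max=20.0, step=0.01):
    """Invert the construction: the interval is all mu whose acceptance
    region contains the observed count."""
    mus = [i * step for i in range(int(mu_max / step) + 1)]
    inside = [mu for mu in mus if n_obs in acceptance_interval(mu, b, cl)]
    return (min(inside), max(inside))
```

Because mu_best is clipped at zero, the ordering automatically yields intervals that respect the physical boundary, which is the property the abstract highlights.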
Computing Maximum Agreement Forests without Cluster Partitioning is Folly
Computing a maximum (acyclic) agreement forest (M(A)AF) of a pair of phylogenetic trees is known to be fixed-parameter tractable; the two main techniques are kernelization and depth-bounded search. In theory, kernelization-based algorithms for this problem are not competitive, but they perform remarkably well in practice. We shed light on why this is the case. Our results show that, probably unsurprisingly, the kernel is often much smaller in practice than the theoretical worst case, but not small enough to fully explain the good performance of these algorithms. The key to performance is cluster partitioning, a technique used in almost all fast M(A)AF algorithms. In theory, cluster partitioning does not help: some instances are highly clusterable, others not at all. However, our experiments show that cluster partitioning leads to substantial performance improvements for kernelization-based M(A)AF algorithms. In contrast, kernelizing the individual clusters before solving them using exponential search yields only very modest performance improvements or even hurts performance; for the vast majority of inputs, kernelization leads to no reduction in the maximal cluster size at all. The choice of the algorithm applied to solve individual clusters also significantly impacts performance, even though our limited experiment to evaluate this produced no clear winner; depth-bounded search, exponential search interleaved with kernelization, and an ILP-based algorithm all achieved competitive performance.
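The notion of a cluster can be made concrete: a cluster is a set of leaves that forms a clade in both input trees, so the instance can be split and each cluster solved independently. A minimal sketch on trees encoded as nested tuples of leaf labels, illustrating only the cluster-detection step, not a full M(A)AF cluster-partitioning implementation:

```python
def leaf_set(tree):
    """All leaf labels of a nested-tuple tree."""
    if not isinstance(tree, tuple):
        return frozenset([tree])
    return frozenset().union(*(leaf_set(c) for c in tree))

def clades(tree):
    """Leaf sets of every subtree (clade) of the tree."""
    if not isinstance(tree, tuple):
        return {frozenset([tree])}
    out = {leaf_set(tree)}
    for child in tree:
        out |= clades(child)
    return out

def common_clusters(t1, t2):
    """Leaf sets that are clades in both trees: candidates for
    splitting the instance into independent subproblems."""
    shared = clades(t1) & clades(t2)
    full = leaf_set(t1)
    # keep only non-trivial clusters (more than one leaf, not all leaves)
    return {c for c in shared if 1 < len(c) < len(full)}
```

An instance with many common clusters decomposes into small independent subproblems; an instance with none must be solved whole, which is why clusterability varies so much across inputs.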
Search-Based Mutant Selection for Efficient Test Suite Improvement: Evaluation and Results
Context: Search-based techniques have been applied to almost all areas in software engineering, especially to software testing, seeking to solve hard optimization problems. However, the problem of selecting mutants to improve the test suite at a lower cost has not been explored to the same extent as other problems, such as mutant selection for test suite evaluation or test data generation.
Objective: In this paper, we apply search-based mutant selection to enhance the quality of test suites efficiently. Namely, we use the technique known as Evolutionary Mutation Testing (EMT), which reduces the number of mutants while preserving the power to refine the test suite. Despite the reported benefits of its application, existing empirical results were derived from a limited number of case studies, a particular set of mutation operators, and a vague measure, which makes it difficult to determine the real performance of this technique.
Method: This paper addresses the shortcomings of previous studies, providing a new methodology to evaluate EMT on the basis of the actual improvement of the test suite achieved by using the evolutionary strategy. We make use of that methodology in new experiments with a carefully selected set of real-world C++ case studies.
Results: EMT shows good performance for most case studies and levels of demand for test suite improvement (around 45% fewer mutants than random selection in the best case). The results reveal that even a reduced subset of mutants selected with EMT can serve to increase confidence in the test suite, especially in programs with a large set of mutants.
Conclusions: These results support the use of search-based techniques to solve the problem of mutant selection for a more efficient test suite refinement. Additionally, we identify some aspects that could foreseeably help enhance EMT.
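The selection loop behind EMT-style approaches can be sketched generically: evaluate only a subset of mutants per generation and bias the next generation toward the fittest ones. Everything below is a hypothetical illustration of that loop, not the EMT tool; in particular, `fitness` is a caller-supplied stand-in for "usefulness for refining the test suite":

```python
import random

def evolutionary_mutant_selection(mutants, fitness, pop_size=8, gens=15, seed=1):
    """Generic evolutionary selection sketch: keep the fittest half of the
    population and fill the rest with unexplored mutants, so only a fraction
    of all mutants is ever executed. `fitness` is a hypothetical stand-in."""
    rng = random.Random(seed)
    pop = rng.sample(mutants, pop_size)
    evaluated = {m: fitness(m) for m in pop}       # mutants actually executed
    for _ in range(gens):
        elite = sorted(pop, key=evaluated.get, reverse=True)[: pop_size // 2]
        fresh = [m for m in mutants if m not in evaluated]
        newcomers = rng.sample(fresh, min(pop_size - len(elite), len(fresh)))
        for m in newcomers:
            evaluated[m] = fitness(m)
        pop = elite + newcomers
    best = max(evaluated, key=evaluated.get)
    return best, len(evaluated)
```

The point of the sketch is the cost model: a good mutant is found while executing well under the full mutant set, which is the saving EMT aims for.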