Approximation and Parameterized Complexity of Minimax Approval Voting
We present three results on the complexity of Minimax Approval Voting. First,
we study Minimax Approval Voting parameterized by the Hamming distance d from
the solution to the votes. We show Minimax Approval Voting admits no algorithm
running in time O*(2^{o(d log d)}), unless the Exponential
Time Hypothesis (ETH) fails. This means that the O*(d^{2d})
algorithm of Misra et al. [AAMAS 2015] is essentially optimal. Motivated by
this, we then show a parameterized approximation scheme, running in time
O*((3/ε)^{2d}), which is essentially
tight assuming ETH. Finally, we get a new polynomial-time randomized
approximation scheme for Minimax Approval Voting, which runs in time
n^{O(1/ε^2 · log(1/ε))} · poly(m),
almost matching the running time of the fastest known PTAS for Closest String
due to Ma and Sun [SIAM J. Comp. 2009].
Comment: 14 pages, 3 figures, 2 pseudocodes
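The problem the abstract parameterizes can be stated concretely: given approval ballots over m candidates (as 0/1 vectors) and a committee size k, choose the size-k committee minimizing the maximum Hamming distance to any ballot. A minimal brute-force sketch, exponential in m and for illustration only (it is none of the paper's algorithms, and the function names are invented here):

```python
from itertools import combinations

def hamming(a, b):
    """Hamming distance between two equal-length 0/1 tuples."""
    return sum(x != y for x, y in zip(a, b))

def minimax_approval(votes, k):
    """Brute-force Minimax Approval Voting: try every size-k committee
    and return the one minimizing the maximum Hamming distance to any
    ballot, together with that distance."""
    m = len(votes[0])
    best, best_score = None, None
    for committee in combinations(range(m), k):
        outcome = tuple(1 if i in committee else 0 for i in range(m))
        score = max(hamming(outcome, v) for v in votes)
        if best_score is None or score < best_score:
            best, best_score = outcome, score
    return best, best_score

votes = [(1, 1, 0, 0), (1, 0, 1, 0), (0, 1, 1, 0)]
print(minimax_approval(votes, 2))  # a committee of size 2 within distance 2 of every ballot
```

The parameter d of the abstract is exactly the returned best_score: the lower bound says no algorithm can beat 2^{o(d log d)} dependence on it, up to ETH.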
Average Sensitivity of Graph Algorithms
In modern applications of graph algorithms, where the graphs of interest are
large and dynamic, it is unrealistic to assume that an input representation
contains the full information of a graph being studied. Hence, it is desirable
to use algorithms that, even when only a (large) subgraph is available, output
solutions that are close to the solutions output when the whole graph is
available. We formalize this idea by introducing the notion of average
sensitivity of graph algorithms, which is the average earth mover's distance
between the output distributions of an algorithm on a graph and its subgraph
obtained by removing an edge, where the average is over the edges removed and
the distance between two outputs is the Hamming distance.
In this work, we initiate a systematic study of average sensitivity. After
deriving basic properties of average sensitivity such as composition, we
provide efficient approximation algorithms with low average sensitivities for
concrete graph problems, including the minimum spanning forest problem, the
global minimum cut problem, the minimum s-t cut problem, and the maximum
matching problem. In addition, we prove that the average sensitivity of our
global minimum cut algorithm is almost optimal, by showing a nearly matching
lower bound. We also show that every algorithm for the 2-coloring problem has
average sensitivity linear in the number of vertices. One of the main ideas
involved in designing our algorithms with low average sensitivity is the
following fact: if the presence of a vertex or an edge in the solution output
by an algorithm can be decided locally, then the algorithm has a low average
sensitivity, allowing us to reuse the analyses of known sublinear-time
algorithms and local computation algorithms (LCAs). Using this connection, we
show that every LCA for 2-coloring has linear query complexity, thereby
answering an open question.
Comment: 39 pages, 1 figure
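For a deterministic algorithm, the earth mover's distance in the definition above collapses to the plain Hamming distance between the two outputs, so average sensitivity can be measured directly. A toy sketch under that simplification, using an unweighted Kruskal-style forest algorithm with a fixed edge order (an illustration of the definition, not one of the paper's low-sensitivity algorithms):

```python
def mst_edges(n, edges):
    """Kruskal-style spanning forest with a fixed scan order; returns the
    set of chosen edges.  `edges` is a list of (u, v) pairs."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    chosen = set()
    for (u, v) in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            chosen.add((u, v))
    return chosen

def average_sensitivity(n, edges):
    """Average, over all single-edge removals, of the Hamming distance
    between the forest on the full graph and on the subgraph (the earth
    mover's distance degenerates to this for a deterministic algorithm)."""
    full = mst_edges(n, edges)
    total = 0
    for i in range(len(edges)):
        sub = mst_edges(n, edges[:i] + edges[i + 1:])
        total += len(full ^ sub)  # symmetric difference of edge sets
    return total / len(edges)

# A 4-cycle: removing any cycle edge forces a slightly different forest.
print(average_sensitivity(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))
```

On the 4-cycle, removing the one edge the algorithm skipped changes nothing, while removing any chosen edge swaps one edge for another, so the average comes out to 6/4 = 1.5, i.e. constant rather than growing with the graph.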
A Duality Based 2-Approximation Algorithm for Maximum Agreement Forest
We give a 2-approximation algorithm for the Maximum Agreement Forest problem
on two rooted binary trees. This NP-hard problem has been studied extensively
in the past two decades, since it can be used to compute the rooted Subtree
Prune-and-Regraft (rSPR) distance between two phylogenetic trees. Our algorithm
is combinatorial and its running time is quadratic in the input size. To prove
the approximation guarantee, we construct a feasible dual solution for a novel
linear programming formulation. In addition, we show this linear program is
stronger than previously known formulations, and we give a compact formulation,
showing that it can be solved in polynomial time.
On the Closest Vector Problem with a Distance Guarantee
We present a substantially more efficient variant, both in terms of running
time and size of preprocessing advice, of the algorithm by Liu, Lyubashevsky,
and Micciancio for solving CVPP (the preprocessing version of the Closest
Vector Problem, CVP) with a distance guarantee. For instance, for any α < 1/2,
our algorithm finds the (unique) closest lattice point for any target point
whose distance from the lattice is at most α times the length of the shortest
nonzero lattice vector, requires as preprocessing advice only polynomially
many vectors, and runs in polynomial time.
As our second main contribution, we present reductions showing that it
suffices to solve CVP, both in its plain and preprocessing versions, when the
input target point is within some bounded distance of the lattice. The
reductions are based on ideas due to Kannan and a recent sparsification
technique due to Dadush and Kun. Combining our reductions with the LLM
algorithm gives an approximation factor of O(n/√(log n)) for search
CVPP, improving on the previous best of O(n^{1.5}) due to Lagarias, Lenstra,
and Schnorr. When combined with our improved algorithm we obtain, somewhat
surprisingly, that only O(n) vectors of preprocessing advice are sufficient to
solve CVPP with the (only slightly worse) approximation factor of O(n).
Comment: An early version of the paper was titled "On Bounded Distance
Decoding and the Closest Vector Problem with Preprocessing". Conference on
Computational Complexity (2014)
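As background for why a distance guarantee helps, the classic baseline is Babai's rounding: express the target in basis coordinates, round each coordinate to an integer, and map back. This is not the algorithm of the paper or of Liu, Lyubashevsky, and Micciancio; it only illustrates that decoding becomes easy when the target is promised to be very close to the lattice relative to the basis quality:

```python
import numpy as np

def babai_round(basis, target):
    """Babai's rounding baseline for CVP with a distance guarantee.
    Rows of `basis` are the lattice generators; a lattice point is an
    integer row vector c times the basis.  Solving basis.T x = target
    recovers the real coordinates c, which are then rounded."""
    coords = np.linalg.solve(basis.T, target)
    return np.rint(coords) @ basis

# The lattice 2Z x 2Z; the target (3.7, 2.2) is well within half the
# minimum distance of the lattice point (4, 2), so rounding decodes it.
basis = np.array([[2.0, 0.0], [0.0, 2.0]])
print(babai_round(basis, np.array([3.7, 2.2])))
```

For a skewed basis the same rounding can fail even for nearby targets, which is why preprocessing advice (a better representation of the lattice) is the interesting resource in CVPP.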
Polynomial Time Approximation Schemes for All 1-Center Problems on Metric Rational Set Similarities
In this paper, we investigate algorithms for finding centers of a given collection N of sets. In particular, we focus on metric rational set similarities, a broad class of similarity measures that includes Jaccard and Hamming. A rational set similarity S is called metric if D = 1 - S is a distance function. We study the 1-center problem on these metric spaces. The problem consists of finding a set C that minimizes the maximum distance of C to any set in N. We present a general framework that computes a (1 + ε)-approximation for any metric rational set similarity.
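As a concrete instance, the Jaccard similarity S = |A ∩ B| / |A ∪ B| induces the metric D = 1 - S. The sketch below computes this distance and solves a restricted 1-center in which the center must be one of the input sets; the paper's framework searches a richer candidate space to reach a (1 + ε)-approximation, so this is only a baseline with invented function names:

```python
def jaccard_distance(a, b):
    """D = 1 - |A ∩ B| / |A ∪ B|, the metric induced by Jaccard
    similarity (one of the metric rational set similarities)."""
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

def one_center_from_inputs(collection):
    """Restricted 1-center baseline: choose the input set that minimizes
    the maximum Jaccard distance to every set in the collection."""
    return min(collection,
               key=lambda c: max(jaccard_distance(c, s) for s in collection))

N = [{1, 2, 3}, {2, 3, 4}, {1, 3, 4}]
print(one_center_from_inputs(N))
```

Restricting the center to the inputs already gives a 2-approximation by the triangle inequality; the hard part the paper addresses is doing better than that.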
Robust ASR using Support Vector Machines
The improved theoretical properties of Support Vector Machines with respect to other machine learning alternatives due to their max-margin training paradigm have led us to suggest them as a good technique for robust speech recognition. However, important shortcomings have had to be circumvented, the most important being the normalisation of the time duration of different realisations of the acoustic speech units.
In this paper, we have compared two approaches in noisy environments: first, a hybrid HMM–SVM solution where a fixed number of frames is selected by means of an HMM segmentation; and second, a normalisation kernel called Dynamic Time Alignment Kernel (DTAK), first introduced in Shimodaira et al. [Shimodaira, H., Noma, K., Nakai, M., Sagayama, S., 2001. Support vector machine with dynamic time-alignment kernel for speech recognition. In: Proc. Eurospeech, Aalborg, Denmark, pp. 1841–1844] and based on DTW (Dynamic Time Warping). Special attention has been paid to the adaptation of both alternatives to noisy environments, comparing two types of parameterisations and performing suitable feature normalisation operations. The results show that the DTAK provides important advantages over the baseline HMM system in medium to bad noise conditions, also outperforming the results of the hybrid system.
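The DTAK kernel is built on dynamic time warping, which aligns two utterances of different durations so that comparable frames are matched. A minimal sketch of plain DTW distance on scalar frame sequences (DTAK turns such alignments into a kernel function; that construction is not reproduced here):

```python
def dtw(x, y):
    """Classic dynamic-time-warping distance: minimum total frame cost
    over all monotone alignments of the two sequences."""
    n, m = len(x), len(y)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # stay on x, stay on y, or advance both
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# The same pattern spoken at two speeds aligns with zero cost,
# which is exactly the duration normalisation the abstract refers to.
print(dtw([0, 1, 2, 1, 0], [0, 1, 1, 2, 2, 1, 0]))
```

A fixed-length Euclidean comparison of these two sequences is not even defined, which is why the hybrid HMM segmentation or a DTW-based kernel is needed before an SVM can be applied.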
New Applications of the Nearest-Neighbor Chain Algorithm
The nearest-neighbor chain algorithm was proposed in the eighties as a way to speed up certain hierarchical clustering algorithms. In the first part of the dissertation, we show that its application is not limited to clustering. We apply it to a variety of geometric and combinatorial problems. In each case, we show that the nearest-neighbor chain algorithm finds the same solution as a preexisting greedy algorithm, but often with an improved runtime. We obtain speedups over greedy algorithms for Euclidean TSP, Steiner TSP in planar graphs, straight skeletons, a geometric coverage problem, and three stable matching models. In the second part, we study the stable-matching Voronoi diagram, a type of plane partition which combines properties of stable matchings and Voronoi diagrams. We propose political redistricting as an application. We also show that it is impossible to compute this diagram in an algebraic model of computation, and give three algorithmic approaches to overcome this obstacle. One of them is based on the nearest-neighbor chain algorithm, linking the two parts together.
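A minimal sketch of the classic nearest-neighbor chain idea, here for agglomerative clustering of 1-D points under single linkage (a reducible linkage, so a pair of mutual nearest neighbors can be merged immediately without invalidating the rest of the chain). This illustrates the base algorithm only, not the dissertation's extensions:

```python
def nn_chain(points):
    """Nearest-neighbor chain agglomeration with single linkage.
    Grow a chain by repeatedly moving to the nearest other cluster;
    when the chain's last two clusters are mutual nearest neighbors,
    merge them.  Returns the sequence of merged clusters."""
    clusters = [frozenset([p]) for p in points]
    dist = lambda a, b: min(abs(x - y) for x in a for y in b)
    chain, merges = [], []
    while len(clusters) > 1:
        if not chain:
            chain.append(clusters[0])
        top = chain[-1]
        nn = min((c for c in clusters if c != top), key=lambda c: dist(top, c))
        if len(chain) >= 2 and nn == chain[-2]:
            # mutual nearest neighbors found: merge the chain's top two
            chain.pop(); chain.pop()
            clusters.remove(top); clusters.remove(nn)
            merged = top | nn
            clusters.append(merged)
            merges.append(merged)
        else:
            chain.append(nn)
    return merges

print(nn_chain([0.0, 1.0, 5.0, 6.0]))  # merges the two tight pairs first
```

The speedups in the dissertation come from the same pattern: a greedy algorithm that repeatedly takes the globally closest pair can instead follow local nearest-neighbor chains, avoiding a global minimum search at every step.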