1,167 research outputs found
On Computing Centroids According to the p-Norms of Hamming Distance Vectors
In this paper we consider the p-Norm Hamming Centroid problem, which asks to determine whether some given strings have a centroid with a bound on the p-norm of its Hamming distances to the strings. Specifically, given a set S of strings and a real number k, we consider the problem of determining whether there exists a string s^* with (sum_{s in S} d^p(s^*, s))^(1/p) <= k, where d(.,.) denotes the Hamming distance metric. This problem has important applications in data clustering and multi-winner committee elections, and is a generalization of the well-known polynomial-time solvable Consensus String (p=1) problem, as well as of the NP-hard Closest String (p=infty) problem.
Our main result shows that the problem is NP-hard for all fixed rational p > 1, closing the gap for all rational values of p between 1 and infty. Under standard complexity assumptions, the reduction also implies that the problem has no 2^o(n+m)-time or 2^o(k^(p/(p+1)))-time algorithm, where m denotes the number of input strings and n denotes the length of each string, for any fixed p > 1. The first bound matches a straightforward brute-force algorithm. The second bound is tight in the sense that, for each fixed epsilon > 0, we provide a 2^(k^(p/(p+1)+epsilon))-time algorithm. In the last part of the paper, we complement our hardness result by presenting a fixed-parameter algorithm and a factor-2 approximation algorithm for the problem.
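As a concrete illustration of the objective, here is a minimal Python sketch that evaluates the p-norm cost of a candidate centroid and finds an optimal one by brute force over all candidate strings. The function names are ours, not from the paper; for a binary alphabet the enumeration is consistent with the trivial exponential-time upper bound the abstract mentions.

```python
from itertools import product

def hamming(a, b):
    """Hamming distance between two equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def p_norm_cost(center, strings, p):
    """(sum_{s in S} d^p(center, s))^(1/p), the objective from the abstract."""
    return sum(hamming(center, s) ** p for s in strings) ** (1.0 / p)

def best_centroid(strings, p, alphabet="01"):
    """Brute force over all length-n strings over the alphabet."""
    n = len(strings[0])
    return min(("".join(c) for c in product(alphabet, repeat=n)),
               key=lambda cand: p_norm_cost(cand, strings, p))

S = ["0011", "0101", "0110"]
print(best_centroid(S, p=2))  # prints 0111: distance 1 to each input string
```

Note that for p=1 the optimum is the position-wise majority string (the Consensus String case), whereas for larger p the objective increasingly penalizes outliers, approaching the Closest String objective as p grows.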
Parameterized Complexity of Critical Node Cuts
We consider the following natural graph cut problem called Critical Node Cut
(CNC): given a graph G on n vertices and two positive integers k and x,
determine whether G has a set of k vertices whose removal leaves G with at
most x connected pairs of vertices. We analyze this problem in the framework
of parameterized complexity. That is, we are interested in whether or not
this problem is solvable in f(κ) · n^O(1) time (i.e., whether or not it is
fixed-parameter tractable) for various natural parameters κ. We consider
four such parameters:
- The size k of the required cut.
- The upper bound x on the number of remaining connected pairs.
- The lower bound y on the number of connected pairs to be removed.
- The treewidth of G.
We determine whether or not CNC is fixed-parameter tractable for each of
these parameters, and also for all possible aggregations of these four
parameters, apart from one combination. Moreover, we determine whether or
not CNC admits a polynomial kernel for each of these parameterizations, that
is, whether there is an algorithm that reduces each instance of CNC in
polynomial time to an equivalent instance of size κ^O(1), where κ is the
given parameter.
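The decision question can be made concrete with a small brute-force sketch, writing k for the size of the deleted set and x for the allowed number of remaining connected pairs. This is an illustration only, not the paper's algorithms; the naive enumeration below is exponential in the number of vertices.

```python
from itertools import combinations

def connected_pairs(adj, removed):
    """Number of unordered pairs of vertices still connected in G - removed."""
    remaining = set(adj) - set(removed)
    seen, pairs = set(), 0
    for v in remaining:
        if v in seen:
            continue
        # BFS/DFS to collect the connected component of v.
        comp, stack = {v}, [v]
        while stack:
            u = stack.pop()
            for w in adj[u]:
                if w in remaining and w not in comp:
                    comp.add(w)
                    stack.append(w)
        seen |= comp
        c = len(comp)
        pairs += c * (c - 1) // 2
    return pairs

def critical_node_cut(adj, k, x):
    """Is there a set of k vertices whose removal leaves at most x
    connected pairs?  Naive enumeration over all k-subsets."""
    return any(connected_pairs(adj, S) <= x for S in combinations(adj, k))

# A path on 5 vertices: removing the middle vertex leaves two components
# of size 2, i.e. 2 connected pairs.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(critical_node_cut(path, k=1, x=2))  # True: remove vertex 2
```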
Multidimensional linear cryptanalysis
Linear cryptanalysis is an important tool for studying the security of symmetric ciphers. In 1993 Matsui proposed two algorithms, called Algorithm 1 and Algorithm 2, for recovering information about the secret key of a block cipher. The algorithms exploit a biased probabilistic relation between the input and output of the cipher. This relation is called the (one-dimensional) linear approximation of the cipher. Mathematically, the problem of key recovery is a binary hypothesis testing problem that can be solved with appropriate statistical tools.
The same mathematical tools can be used for realising a distinguishing attack against a stream cipher. The distinguisher outputs whether the given sequence of keystream bits is derived from a cipher or from a random source. Sometimes it is even possible to recover a part of the initial state of the LFSR used in a keystream generator.
Several authors considered using many one-dimensional linear approximations simultaneously in a key recovery attack and various solutions have been proposed. In this thesis a unified methodology for using multiple linear approximations in distinguishing and key recovery attacks is presented. This methodology, which we call multidimensional linear cryptanalysis, allows removing unnecessary and restrictive assumptions. We model the key recovery problems mathematically as hypothesis testing problems and show how to use standard statistical tools for solving them. We also show how the data complexity of linear cryptanalysis on stream ciphers and block ciphers can be reduced by using multiple approximations.
We use well-known mathematical theory for comparing different statistical methods for solving the key recovery problems. We also test the theory in practice with reduced-round Serpent. Based on our results, we give recommendations on how multidimensional linear cryptanalysis should be used.
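To make the notion of a one-dimensional linear approximation concrete, the following sketch computes the bias of every input/output mask pair for a small example S-box. The 4-bit S-box below is a standard textbook example, not taken from this thesis, and the helper names are ours.

```python
# A 4-bit S-box commonly used in linear-cryptanalysis tutorials;
# any bijective S-box would do for this illustration.
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]

def parity(x):
    """Parity (XOR of all bits) of an integer."""
    return bin(x).count("1") & 1

def bias(a, b):
    """Bias of the one-dimensional approximation a.x = b.S(x):
    Pr[a.x = b.S(x)] - 1/2, taken over all 16 inputs x."""
    hits = sum(parity(x & a) == parity(SBOX[x] & b) for x in range(16))
    return hits / 16 - 0.5

# The best nonzero approximation governs the data complexity of
# Matsui's Algorithm 1, which grows roughly like 1/bias^2.
best = max(((a, b) for a in range(1, 16) for b in range(1, 16)),
           key=lambda ab: abs(bias(*ab)))
print(best, bias(*best))
```

Multidimensional linear cryptanalysis, as described above, combines many such mask pairs at once instead of relying on a single biased approximation.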
Strength Analyses of Wooden I-Beams with a Hole in the Web
Wood-based light-weight I-beams are today widely used in the construction industry. An important feature of these beams is that the user can make holes in the web where needed. Today there is no general method used to calculate the reduced strength of these beams with a hole in the web. The calculation methods vary between the manufacturers and are commonly based on empirical results. The aim of this master's thesis was to create finite element models of this type of beam and with these investigate the stress distribution in beams with holes in the web, where a crack would likely occur, and in what direction it would grow. The aim was furthermore to calculate the shear force capacity for beams with holes by use of different models based on fracture mechanics theory, as well as to investigate how changes in the material properties influence the shear force capacity, and finally to evaluate the currently used calculation methods and suggest improvements or a new method. Calculations showing the location of the most stressed point and the orientation of the principal stresses in an area surrounding this point were performed for a number of load cases. For load cases dominated by shear force, the results indicated diagonal cracking in the 45° direction. The load cases with pure normal or moment loading indicated fracture at the upper or lower edge of the hole. Furthermore, the calculated stresses indicated that a crack would both initiate and continue to grow along an approximately straight line perpendicular to the edge of the hole. Three methods based on fracture mechanics were used in the finite element calculations of the shear force capacity: the point stress criterion, the mean stress criterion and the initial crack criterion. The calculated shear force capacity from these methods was compared to the shear force capacity obtained in a previously performed test study.
In that test study 11 beam geometries were tested, and to enable comparison of the calculations, the same geometries and load cases were used in the present study. The results show that the mean stress criterion and the initial crack criterion are suitable for shear force capacity calculations for beams with holes in the web. The point stress criterion severely underestimated the shear force capacity for some beams. The calculation method used by the manufacturers Swelite and Forestia was evaluated by comparing the results from the test study with the results obtained using this method. This comparison showed that the method overestimated the real shear force capacity for one beam. A new calculation method can be based on the mean stress criterion, since this method gave values corresponding well to the results from the test study and since it is a fairly easy method to use.
Parameterized Two-Player Nash Equilibrium
We study the computation of Nash equilibria in a two-player normal form game
from the perspective of parameterized complexity. Recent results proved
hardness for a number of variants, when parameterized by the support size. We
complement those results, by identifying three cases in which the problem
becomes fixed-parameter tractable. These cases occur in the previously studied
settings of sparse games and unbalanced games as well as in the newly
considered case of locally bounded treewidth games, which generalizes both
of these cases.
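For intuition on the support-size parameter: an equilibrium whose supports both have size one is simply a pure Nash equilibrium, which can be found by scanning the payoff matrices for mutual best responses. The sketch below is illustrative only and is not one of the paper's algorithms.

```python
def pure_nash(A, B):
    """All pure Nash equilibria of a bimatrix game: pairs (i, j) such that
    A[i][j] is a best response for the row player against column j and
    B[i][j] is a best response for the column player against row i."""
    m, n = len(A), len(A[0])
    eq = []
    for i in range(m):
        for j in range(n):
            if (A[i][j] == max(A[r][j] for r in range(m)) and
                    B[i][j] == max(B[i][c] for c in range(n))):
                eq.append((i, j))
    return eq

# Coordination game: two pure equilibria on the diagonal.
A = [[2, 0], [0, 1]]
B = [[2, 0], [0, 1]]
print(pure_nash(A, B))  # [(0, 0), (1, 1)]
```

For larger supports, support enumeration must additionally solve for mixing probabilities that make each player indifferent across her support, which is where the hardness results parameterized by support size come in.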
Parameterized Complexity Dichotomy for Steiner Multicut
The Steiner Multicut problem asks, given an undirected graph G, terminal
sets T1, ..., Tt ⊆ V(G) of size at most p, and an integer k, whether
there is a set S of at most k edges or nodes such that, for each set Ti, at
least one pair of terminals lies in different connected components of G \ S. This problem
generalizes several graph cut problems, in particular the Multicut problem (the
case p = 2), which is fixed-parameter tractable for the parameter k [Marx and
Razgon, Bousquet et al., STOC 2011].
We provide a dichotomy of the parameterized complexity of Steiner Multicut.
That is, for any combination of k, t, p, and the treewidth tw(G) as constant,
parameter, or unbounded, and for all versions of the problem (edge deletion and
node deletion with and without deletable terminals), we prove either that the
problem is fixed-parameter tractable or that the problem is hard (W[1]-hard or
even (para-)NP-complete). We highlight that:
- The edge deletion version of Steiner Multicut is fixed-parameter tractable
for the parameter k+t on general graphs (but has no polynomial kernel, even on
trees). We present two proofs: one using the randomized contractions technique
of Chitnis et al., and one relying on new structural lemmas that decompose the
Steiner cut into important separators and minimal s-t cuts.
- In contrast, both node deletion versions of Steiner Multicut are W[1]-hard
for the parameter k+t on general graphs.
- All versions of Steiner Multicut are W[1]-hard for the parameter k, even
when p=3 and the graph is a tree plus one node. Hence, the results of Marx and
Razgon, and Bousquet et al. do not generalize to Steiner Multicut.
Since we allow k, t, p, and tw(G) to be any constants, our characterization
includes a dichotomy for Steiner Multicut on trees (for tw(G) = 1), and a
polynomial time versus NP-hardness dichotomy (by restricting k,t,p,tw(G) to
constant or unbounded).
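The object being decided can be illustrated with a small checker for the edge-deletion version: given the graph, the terminal sets, and a candidate edge set S, verify that each terminal set has a separated pair. This is an illustrative sketch with our own helper names, not code from the paper.

```python
def components(n, edges):
    """Component label for each vertex of an n-vertex graph (union-find)."""
    parent = list(range(n))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for u, v in edges:
        parent[find(u)] = find(v)
    return [find(v) for v in range(n)]

def is_edge_steiner_multicut(n, edges, terminal_sets, S):
    """Edge-deletion version: after removing edge set S, every terminal
    set T_i must have at least one pair of terminals in different
    connected components."""
    comp = components(n, [e for e in edges if e not in S])
    return all(len({comp[t] for t in T}) > 1 for T in terminal_sets)

# Path 0-1-2-3 with terminal sets {0, 2} and {1, 3}: deleting the middle
# edge (1, 2) separates a pair inside each terminal set.
edges = [(0, 1), (1, 2), (2, 3)]
print(is_edge_steiner_multicut(4, edges, [{0, 2}, {1, 3}], {(1, 2)}))  # True
```

The Multicut special case (p = 2) requires every terminal pair to be fully separated, which is why the condition above degenerates to the usual multicut check when each Ti has exactly two elements.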
- …