Negatively Correlated Search
Evolutionary Algorithms (EAs) have been shown to be powerful tools for
complex optimization problems, which are ubiquitous in both communication and
big data analytics. This paper presents a new EA, namely Negatively Correlated
Search (NCS), which maintains multiple individual search processes in parallel
and models the search behaviors of individual search processes as probability
distributions. NCS explicitly promotes negatively correlated search behaviors
by encouraging differences among the probability distributions (search
behaviors). By this means, individual search processes share information and
cooperate with each other to search diverse regions of a search space, which
makes NCS a promising method for non-convex optimization. The cooperation
scheme of NCS could also be regarded as a novel diversity preservation scheme
that, different from other existing schemes, directly promotes diversity at the
level of search behaviors rather than merely trying to maintain diversity among
candidate solutions. Empirical studies showed that NCS is competitive with
well-established search methods, achieving the best overall performance on 20
multimodal (non-convex) continuous optimization problems. The advantages of NCS
over state-of-the-art approaches are further demonstrated with a case study on
the synthesis of unequally spaced linear antenna arrays.
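The core idea — parallel search processes that accept candidates based on both fitness and how different they are from the other processes — can be sketched in a few lines. The sketch below is a deliberate simplification under stated assumptions: the function name `ncs_minimize` and all parameter values are invented for illustration, and the novelty term uses the Euclidean distance between current solutions as a stand-in for the Bhattacharyya distance between search distributions used in the paper.

```python
import numpy as np

def ncs_minimize(f, dim, n_procs=5, iters=300, sigma=0.3, seed=0):
    """Minimal NCS-style sketch (not the paper's exact acceptance rule).

    n_procs parallel (1+1) search processes each mutate their current
    solution with Gaussian noise.  A candidate replaces the incumbent
    when its fitness-to-novelty ratio improves, pushing the processes
    toward different regions of the search space.
    """
    rng = np.random.default_rng(seed)
    xs = rng.uniform(-3.0, 3.0, size=(n_procs, dim))  # current solutions

    def novelty(point, i):
        # distance to the closest *other* process; a simple stand-in
        # for the distance between search distributions in the paper
        others = np.delete(xs, i, axis=0)
        return np.min(np.linalg.norm(others - point, axis=1))

    for _ in range(iters):
        for i in range(n_procs):
            cand = xs[i] + sigma * rng.standard_normal(dim)
            # lower ratio = better fitness and/or farther from the others
            cur_score = f(xs[i]) / (novelty(xs[i], i) + 1e-12)
            cand_score = f(cand) / (novelty(cand, i) + 1e-12)
            if cand_score < cur_score:
                xs[i] = cand

    best = min(xs, key=f)
    return best, f(best)

# toy usage: minimize the 2-D sphere function
best, val = ncs_minimize(lambda x: float(np.sum(x * x)), dim=2)
```

Dividing fitness by novelty is what makes the behavior "negatively correlated": a candidate near another process must have much better fitness to be accepted, so the processes spread over diverse regions rather than collapsing onto one basin.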
Prioritized Random MAC Optimization via Graph-based Analysis
Motivated by the analogy between successive interference cancellation and
iterative belief-propagation on erasure channels, irregular repetition slotted
ALOHA (IRSA) strategies have received a lot of attention in the design of
medium access control protocols. The IRSA schemes have been mostly analyzed for
theoretical scenarios for homogeneous sources, where they are shown to
substantially improve the system performance compared to classical slotted
ALOHA protocols. In this work, we consider generic systems where sources in
different importance classes compete for a common channel. We propose a new
prioritized IRSA algorithm and derive the probability to correctly resolve
collisions for data from each source class. We then make use of our theoretical
analysis to formulate a new optimization problem for selecting the transmission
strategies of heterogeneous sources. We optimize both the replication
probability per class and the source rate per class, in such a way that the
overall system utility is maximized. We then propose a heuristic-based
algorithm for the selection of the transmission strategy, which is built on
intrinsic characteristics of the iterative decoding methods adopted for
recovering from collisions. Experimental results validate the accuracy of the
theoretical study and show the gain of well-chosen prioritized transmission
strategies for the transmission of data from heterogeneous classes over shared
wireless channels.
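The iterative collision recovery that IRSA relies on can be made concrete with a small Monte-Carlo sketch. Everything below is an assumption for illustration: the function name `irsa_resolution`, the loads, and the per-class repetition degrees are invented, with the higher-priority class simply sending more replicas; the decoder is the standard peeling (successive interference cancellation) process, not the paper's optimized strategy selection.

```python
import random
from collections import defaultdict

def irsa_resolution(n_slots=100, n_users=(30, 30), degrees=(3, 2),
                    trials=100, seed=1):
    """Monte-Carlo sketch: per-class probability that a packet is
    resolved by iterative interference cancellation (peeling).
    Class 0 sends more replicas (higher degree) than class 1."""
    rng = random.Random(seed)
    resolved = [0, 0]
    for _ in range(trials):
        slots = defaultdict(set)           # slot -> packets sent in it
        for cls in range(2):
            for u in range(n_users[cls]):
                pkt = (cls, u)
                # each user transmits `degrees[cls]` replicas in
                # distinct randomly chosen slots of the frame
                for s in rng.sample(range(n_slots), degrees[cls]):
                    slots[s].add(pkt)
        decoded, progress = set(), True
        while progress:                    # peel singleton slots repeatedly
            progress = False
            for pkts in slots.values():
                live = pkts - decoded
                if len(live) == 1:         # singleton: decode this packet
                    decoded.add(live.pop())   # its replicas are cancelled
                    progress = True
        for cls in range(2):
            resolved[cls] += sum(1 for (c, _) in decoded if c == cls)
    return [resolved[c] / (trials * n_users[c]) for c in range(2)]

p_high, p_low = irsa_resolution()
```

Tracking decoded packets globally is what models interference cancellation: once a packet is recovered from a singleton slot, all of its replicas stop counting as interference, which can turn other slots into new singletons and keep the peeling going.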
Short expressions of permutations as products and cryptanalysis of the Algebraic Eraser
In March 2004, Anshel, Anshel, Goldfeld, and Lemieux introduced the
\emph{Algebraic Eraser} scheme for key agreement over an insecure channel,
using a novel hybrid of infinite and finite noncommutative groups. They also
introduced the \emph{Colored Burau Key Agreement Protocol (CBKAP)}, a concrete
realization of this scheme.
We present general, efficient heuristic algorithms, which extract the shared
key out of the public information provided by CBKAP. These algorithms are,
according to heuristic reasoning and according to massive experiments,
successful for all sizes of the security parameters, assuming that the keys are
chosen with standard distributions.
Our methods come from probabilistic group theory (permutation group actions
and expander graphs). In particular, we provide a simple algorithm for finding
short expressions of permutations in the symmetric group $S_n$, as products of
given random permutations. Heuristically, our algorithm gives expressions of
length , in time and space . Moreover, this is provable from
\emph{the Minimal Cycle Conjecture}, a simply stated hypothesis concerning the
uniform distribution on $S_n$. Experiments show that the constants in these
estimates are small. This is the first practical algorithm for this problem
for .
Remark: \emph{Algebraic Eraser} is a trademark of SecureRF. The variant of
CBKAP actually implemented by SecureRF uses proprietary distributions, and thus
our results do not imply its vulnerability. See also arXiv:1202.0598.
Comment: Final version, accepted to Advances in Applied Mathematics. Title
slightly changed.
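The underlying computational problem — writing a target permutation as a short word in given random permutations — can be made concrete with a naive baseline. The sketch below is an exponential-time breadth-first search, feasible only for tiny symmetric groups; the function names `compose` and `express` and the example generators are assumptions for illustration, and the paper's heuristic algorithm is precisely what replaces this brute force at realistic sizes.

```python
from collections import deque

def compose(p, q):
    """(p ∘ q)(i) = p[q[i]]; permutations are tuples of images."""
    return tuple(p[i] for i in q)

def express(target, gens, max_len=10):
    """Breadth-first search for a word over `gens` whose product equals
    `target`.  Exponential-time baseline, practical only for tiny S_n."""
    identity = tuple(range(len(target)))
    if target == identity:
        return []
    seen = {identity: []}          # permutation -> word producing it
    queue = deque([identity])
    while queue:
        cur = queue.popleft()
        word = seen[cur]
        if len(word) >= max_len:
            continue
        for gi, g in enumerate(gens):
            nxt = compose(cur, g)  # extend the word by one generator
            if nxt not in seen:
                seen[nxt] = word + [gi]
                if nxt == target:
                    return seen[nxt]
                queue.append(nxt)
    return None                    # not reachable within max_len

# a 5-cycle and a transposition, which together generate S_5
gens = [(1, 2, 3, 4, 0), (1, 0, 2, 3, 4)]
target = compose(gens[0], gens[1])
word = express(target, gens)       # indices into gens, e.g. [0, 1]
```

The BFS visits on the order of |gens|^length permutations, which is why it stops being viable almost immediately as n grows — and why a heuristic with provable length and cost bounds is needed for cryptanalysis.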
Decision Factors for Cooperative Multiple Warhead UAV Target Classification and Attack with Control Applications
Autonomous wide area search, classification and attack using Unmanned Combat Air Vehicles (UCAVs) is considered. The wide area search and attack scenario is modeled, capturing important problem parameters related to environment, seeker, and munitions. Probabilistic analysis is used to formulate and analytically solve for various probabilities, including the probability of mission success. Two methods are utilized. The first examines the sub-events required for various events to occur. The second utilizes a Markov chain approach. General expressions are first obtained that are applicable to any assumed a priori distributions of targets and false targets. These expressions are subsequently applied to a multiple warhead munition/UCAV operating in several multiple target/multiple false target scenarios. Examples of application of the analytically derived results are given for all facets of the system design and operation of Wide Area Search Munitions, including the evaluation of cooperation schemes and rules of engagement. The problem is formulated as a control problem, and the possibility of adaptive control based on estimation of environmental parameters is examined.
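The Markov chain approach can be illustrated with a toy absorbing chain. Every state and parameter value below is an invented assumption, not the paper's model: one transient "searching" state, one transient "object encountered" state, and three absorbing outcomes. The mission-success probability then falls out of the standard absorption computation $B = (I - Q)^{-1} R$.

```python
import numpy as np

# Toy absorbing Markov chain for a single search-and-attack sortie.
# All parameter values are illustrative assumptions.
# Transient states: 0 = searching, 1 = object encountered
# Absorbing states: true target attacked (success),
#                   false target attacked, fuel exhausted
p_fuel   = 0.01   # fuel runs out this step
p_detect = 0.10   # encounter an object while searching
p_true   = 0.30   # encountered object is a true target
p_tr     = 0.85   # classifier declares "attack" on a true target
p_fa     = 0.05   # classifier declares "attack" on a false target

p_succ = p_true * p_tr            # attack a true target
p_fail = (1 - p_true) * p_fa      # attack a false target
p_back = 1 - p_succ - p_fail      # object rejected, resume searching

# Q: transitions among transient states, R: transient -> absorbing
Q = np.array([[(1 - p_fuel) * (1 - p_detect), (1 - p_fuel) * p_detect],
              [p_back,                        0.0]])
R = np.array([[0.0,    0.0,    p_fuel],   # from searching
              [p_succ, p_fail, 0.0]])     # from encounter

# absorption probabilities: B[i, j] = P(absorbed in outcome j | start in i)
B = np.linalg.solve(np.eye(2) - Q, R)
p_mission_success = B[0, 0]       # start in "searching", end in "success"
```

Because the fuel-exhaustion leak makes the chain absorbing, $I - Q$ is invertible and each row of $B$ sums to one; changing a sensor or classifier parameter and re-solving is exactly the kind of trade study the analytical expressions in the paper support.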
Organic Design of Massively Distributed Systems: A Complex Networks Perspective
The vision of Organic Computing addresses challenges that arise in the design
of future information systems that are comprised of numerous, heterogeneous,
resource-constrained and error-prone components or devices. Here, the notion
organic particularly highlights the idea that, in order to be manageable, such
systems should exhibit self-organization, self-adaptation and self-healing
characteristics similar to those of biological systems. In recent years, the
principles underlying many of the interesting characteristics of natural
systems have been investigated from the perspective of complex systems science,
particularly using the conceptual framework of statistical physics and
statistical mechanics. In this article, we review some of the interesting
relations between statistical physics and networked systems and discuss
applications in the engineering of organic networked computing systems with
predictable, quantifiable and controllable self-* properties.
Comment: 17 pages, 14 figures, preprint of submission to Informatik-Spektrum,
published by Springer.
Average-Case Complexity
We survey the average-case complexity of problems in NP.
We discuss various notions of good-on-average algorithms, and present
completeness results due to Impagliazzo and Levin. Such completeness results
establish the fact that if a certain specific (but somewhat artificial) NP
problem is easy-on-average with respect to the uniform distribution, then all
problems in NP are easy-on-average with respect to all samplable distributions.
Applying the theory to natural distributional problems remains an outstanding
open question. We review some natural distributional problems whose
average-case complexity is of particular interest and that do not yet fit into
this theory.
A major open question is whether the existence of hard-on-average problems in
NP can be based on the P ≠ NP assumption or on related worst-case assumptions.
We review negative results showing that certain proof techniques cannot prove
such a result. While the relation between worst-case and average-case
complexity for general NP problems remains open, there has been progress in
understanding the relation between different ``degrees'' of average-case
complexity. We discuss some of these ``hardness amplification'' results.