Regression on fixed-rank positive semidefinite matrices: a Riemannian approach
The paper addresses the problem of learning a regression model parameterized
by a fixed-rank positive semidefinite matrix. The focus is on the nonlinear
nature of the search space and on scalability to high-dimensional problems. The
mathematical developments rely on the theory of gradient descent algorithms
adapted to the Riemannian geometry that underlies the set of fixed-rank
positive semidefinite matrices. In contrast with previous contributions in the
literature, no restrictions are imposed on the range space of the learned
matrix. The resulting algorithms maintain a linear complexity in the problem
size and enjoy important invariance properties. We apply the proposed
algorithms to the problem of learning a distance function parameterized by a
positive semidefinite matrix. Good performance is observed on classical
benchmarks.
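The distance function described above can be illustrated with a small sketch: parameterizing the matrix as W = G Gᵀ with a low-rank factor G keeps W positive semidefinite with bounded rank, and lets the distance be evaluated in time linear in the ambient dimension. This is an illustrative example, not the paper's implementation; all names are my own.

```python
import numpy as np

def fixed_rank_distance(x, y, G):
    """Squared distance d(x, y) = (x - y)^T G G^T (x - y).

    Writing W = G G^T with G of shape (d, r) keeps W positive
    semidefinite with rank at most r, and evaluating through G costs
    O(d * r) rather than O(d^2).
    """
    diff = x - y
    proj = G.T @ diff  # project the difference onto the rank-r factor
    return float(proj @ proj)

# Example: a rank-2 factor in a 5-dimensional space.
rng = np.random.default_rng(0)
G = rng.standard_normal((5, 2))
x, y = rng.standard_normal(5), rng.standard_normal(5)
d = fixed_rank_distance(x, y, G)
```

Gradient-based learning then updates G directly, which is where the Riemannian geometry of the fixed-rank set enters in the paper.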
Continuous dynamic problem generators for evolutionary algorithms
This article is posted here with permission from IEEE. Copyright © 2007 IEEE.

Addressing dynamic optimization problems has attracted growing interest from the evolutionary algorithm community in recent years, owing to its importance in applying evolutionary algorithms to real-world problems. To study evolutionary algorithms in dynamic environments, an important task is to develop benchmark dynamic environments. This paper proposes two continuous dynamic problem generators. Both generators use linear transformations to move individuals, which preserves the distances among individuals. In the first generator, the linear transformation of individuals is equivalent to changing the direction of some axes of the search space, while in the second it is obtained by successive rotations in different planes. Preliminary experiments were carried out to study the performance of some standard genetic algorithms in continuous dynamic environments created by the proposed generators.
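The second generator's core idea, moving individuals by successive rotations in different coordinate planes, can be sketched as follows. This is a minimal illustration under my own naming, not the authors' implementation; the key property is that orthogonal transformations preserve all pairwise distances among individuals.

```python
import numpy as np

def plane_rotation(dim, i, j, theta):
    """Givens rotation by angle theta in the (i, j) coordinate plane."""
    R = np.eye(dim)
    c, s = np.cos(theta), np.sin(theta)
    R[i, i] = c
    R[j, j] = c
    R[i, j] = -s
    R[j, i] = s
    return R

def move_population(pop, rotations):
    """Apply successive plane rotations to every individual (row of pop).

    Each rotation is orthogonal, so pairwise distances among
    individuals are preserved while their positions in the search
    space change.
    """
    dim = pop.shape[1]
    R = np.eye(dim)
    for (i, j, theta) in rotations:
        R = plane_rotation(dim, i, j, theta) @ R
    return pop @ R.T

rng = np.random.default_rng(1)
pop = rng.standard_normal((10, 4))
moved = move_population(pop, [(0, 1, 0.3), (2, 3, -0.7)])
```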
Negatively Correlated Search
Evolutionary Algorithms (EAs) have been shown to be powerful tools for
complex optimization problems, which are ubiquitous in both communication and
big data analytics. This paper presents a new EA, namely Negatively Correlated
Search (NCS), which maintains multiple individual search processes in parallel
and models the search behaviors of individual search processes as probability
distributions. NCS explicitly promotes negatively correlated search behaviors
by encouraging differences among the probability distributions (search
behaviors). By this means, individual search processes share information and
cooperate with each other to search diverse regions of a search space, which
makes NCS a promising method for non-convex optimization. The cooperation
scheme of NCS could also be regarded as a novel diversity preservation scheme
that, different from other existing schemes, directly promotes diversity at the
level of search behaviors rather than merely trying to maintain diversity among
candidate solutions. Empirical studies showed that NCS is competitive with
well-established search methods, in the sense that NCS achieved the best overall
performance on 20 multimodal (non-convex) continuous optimization problems. The
advantages of NCS over state-of-the-art approaches are also demonstrated with a
case study on the synthesis of unequally spaced linear antenna arrays.
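A heavily simplified sketch of the idea is given below: several parallel search processes, each modeled as a Gaussian centered on its current mean, where a candidate is accepted either for improving fitness or for moving the process further from the other processes. This is an illustration of the negatively-correlated-search principle only; the actual NCS acceptance rule compares probability distributions (e.g. via Bhattacharyya distance) with an adaptive trade-off, which is omitted here, and all names are my own.

```python
import numpy as np

def ncs_sketch(f, dim=2, n_proc=3, sigma=0.5, iters=200, lam=1.0, seed=0):
    """Toy negatively correlated search, minimizing f.

    Each process keeps a mean vector; a mutated candidate is accepted
    when it improves fitness, or when it increases the distance to the
    nearest other process's mean (a crude proxy for decorrelating the
    search behaviors), pushing processes toward diverse regions.
    """
    rng = np.random.default_rng(seed)
    means = rng.standard_normal((n_proc, dim))
    best = min(f(m) for m in means)
    for _ in range(iters):
        for i in range(n_proc):
            cand = means[i] + sigma * rng.standard_normal(dim)
            others = np.delete(means, i, axis=0)
            # Correlation proxy: distance to the nearest other process.
            corr_old = np.min(np.linalg.norm(others - means[i], axis=1))
            corr_new = np.min(np.linalg.norm(others - cand, axis=1))
            if f(cand) < f(means[i]) or corr_new > lam * corr_old:
                means[i] = cand
            best = min(best, f(means[i]))
    return best

sphere = lambda x: float(np.sum(x ** 2))
result = ncs_sketch(sphere)
```

The point of the sketch is the acceptance rule: diversity is rewarded at the level of where the processes search, not merely among candidate solutions.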
Efficient Heuristic Search Algorithms for Soft-Decision Decoding of Linear Block Codes
This paper deals with maximum-likelihood soft-decision decoding as well as suboptimal soft-decision decoding of linear block codes. In this paper we present a novel and efficient hybrid decoding algorithm for (n, k) linear block codes. This algorithm consists of three new decoding algorithms: M A*, H*, and Directed Search. It hybridizes these three algorithms to take advantage of their strengths and make the decoding more efficient. The first algorithm, M A*, is a modified Algorithm A* that conducts a heuristic search through a code tree of the transmitted code when the decoding problem is transformed into a problem of graph-search through a code tree. M A* takes into consideration more properties of the code and is considerably more efficient than the original A* algorithm presented by Han, Hartmann, and Chen. The second algorithm, H*, is a new decoding algorithm that determines the value of every component of a minimum-cost codeword by estimating the cost of the minimum-cost codeword, which has a fixed value at one of the k most reliable, linearly independent bit positions when the decoding problem is transformed into a minimum-cost problem among all codewords of the transmitted code. The suboptimal version of this algorithm can be incorporated with other decoding algorithms to reduce the search space during the decoding process. The third algorithm, Directed Search, is a novel heuristic approach designed to enhance the performance of soft-decision decoding by searching in continuous space. This approach explores the search space between a given vector and the received vector and finds the closest codeword to the received vector in the space explored. Simulation results for this hybrid algorithm are presented for the (128, 64), the (256, 131), and the (256, 139) binary-extended BCH codes. This hybrid algorithm can efficiently decode the (128, 64) code for any signal-to-noise ratio and has near-optimal to optimal performance.
Previously, no practical decoder could decode this code with such performance over all ranges of signal-to-noise ratio.
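For intuition on the underlying problem — finding the codeword whose modulated image is closest to the received real-valued vector — a brute-force maximum-likelihood soft-decision decoder for a tiny code can be written directly. This is feasible only for small k; heuristic search algorithms like those above exist precisely because enumerating 2^k codewords is impractical for codes such as the (128, 64) BCH code. The (7, 4) Hamming code below is my own illustrative choice, not one of the paper's codes.

```python
import itertools
import numpy as np

def ml_soft_decode(G, r):
    """Brute-force ML soft-decision decoding of a binary linear block code.

    G is the k x n generator matrix over GF(2); r is the received real
    vector under BPSK modulation (bit b transmitted as 1 - 2b). Returns
    the codeword whose BPSK image is closest to r in Euclidean distance.
    """
    k, n = G.shape
    best_c, best_d = None, np.inf
    for msg in itertools.product([0, 1], repeat=k):  # all 2^k messages
        c = np.mod(np.array(msg) @ G, 2)
        d = np.linalg.norm(r - (1 - 2 * c))
        if d < best_d:
            best_c, best_d = c, d
    return best_c

# (7, 4) Hamming code generator matrix.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
codeword = np.mod(np.array([1, 0, 1, 1]) @ G, 2)
received = (1 - 2 * codeword) + 0.3  # mild offset on the BPSK signal
decoded = ml_soft_decode(G, received)
```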