Algorithms for the Problems of Length-Constrained Heaviest Segments
We present algorithms for the length-constrained maximum sum segment and
maximum density segment problems in particular, and for the problem of finding
length-constrained heaviest segments in general, over a sequence of real
numbers. Given a sequence of n real numbers and two real parameters L and U (L
<= U), the maximum sum segment problem is to find a consecutive subsequence,
called a segment, of length at least L and at most U such that the sum of the
numbers in the subsequence is maximum. The maximum density segment problem is
to find a segment of length at least L and at most U such that the density of
the numbers in the subsequence is the maximum. For the first problem with
non-uniform width there is an algorithm with time and space complexities in
O(n). We present an algorithm with time complexity in O(n) and space complexity
in O(U). For the second problem with non-uniform width there is a combinatorial
solution with time complexity in O(n) and space complexity in O(U). We present
a simple geometric algorithm with the same time and space complexities.
We extend our algorithms to respectively solve the length-constrained k
maximum sum segments problem in O(n+k) time and O(max{U, k}) space, and the
length-constrained maximum density segments problem in O(n min{k, U-L})
time and O(U+k) space. We present extensions of our algorithms to find all the
length-constrained segments having user-specified sum and density in O(n+m)
and O(n log(U-L)+m) time, respectively, where m is the size of the output.
Previously, no algorithm with a non-trivial result was known for these
problems. We indicate the extensions of our algorithms to higher dimensions.
All the algorithms can be extended in a straightforward way to solve the
problems with non-uniform width and non-uniform weight. Comment: 21 pages, 12 figures
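The O(n)-time, O(U)-space idea for the length-constrained maximum sum segment problem can be sketched with prefix sums and a monotone deque of candidate left endpoints; the function name and interface below are illustrative, not taken from the paper:

```python
from collections import deque

def max_sum_segment(a, L, U):
    """Maximum-sum segment of length at least L and at most U.

    Returns (best_sum, start, end) with the segment a[start:end].
    Uses prefix sums plus a monotone deque of candidate left
    endpoints, so the whole scan runs in O(n) time.
    """
    n = len(a)
    if not (1 <= L <= U <= n):
        raise ValueError("need 1 <= L <= U <= len(a)")
    # prefix[i] = a[0] + ... + a[i-1]
    prefix = [0] * (n + 1)
    for i, x in enumerate(a):
        prefix[i + 1] = prefix[i] + x

    best = None
    dq = deque()  # left endpoints i with prefix[i] strictly increasing
    for j in range(L, n + 1):          # j is the (exclusive) right endpoint
        i_new = j - L                  # newest admissible left endpoint
        while dq and prefix[dq[-1]] >= prefix[i_new]:
            dq.pop()                   # dominated: older and no smaller prefix
        dq.append(i_new)
        while dq[0] < j - U:           # violates the upper length bound U
            dq.popleft()
        s = prefix[j] - prefix[dq[0]]  # best segment ending at j
        if best is None or s > best[0]:
            best = (s, dq[0], j)
    return best
```

Each index enters and leaves the deque at most once, which is where the linear time bound comes from; the deque never holds more than U entries, matching the O(U) space claim.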
Boundary Treatment and Multigrid Preconditioning for Semi-Lagrangian Schemes Applied to Hamilton-Jacobi-Bellman Equations
We analyse two practical aspects that arise in the numerical solution of
Hamilton-Jacobi-Bellman (HJB) equations by a particular class of monotone
approximation schemes known as semi-Lagrangian schemes. These schemes make use
of a wide stencil to achieve convergence and result in discretization matrices
that are less sparse and less local than those coming from standard finite
difference schemes. This leads to computational difficulties not encountered
there. In particular, we consider the overstepping of the domain boundary and
analyse the accuracy and stability of stencil truncation. This truncation
imposes a stricter CFL condition for explicit schemes in the vicinity of
boundaries than in the interior, such that implicit schemes become attractive.
We then study the use of geometric, algebraic and aggregation-based multigrid
preconditioners to solve the resulting discretised systems from implicit time
stepping schemes efficiently. Finally, we illustrate the performance of these
techniques numerically for benchmark test cases from the literature.
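A toy 1D example makes the wide-stencil and boundary-overstepping issues concrete. The update below is a generic semi-Lagrangian (dynamic-programming) step in which the foot of each characteristic is clamped to the domain, a crude stand-in for the stencil truncation analysed in the paper; the equation, names and discretization are illustrative assumptions, not the schemes studied there:

```python
import numpy as np

def semi_lagrangian_step(u, xs, dt, controls, running_cost=1.0):
    """One explicit semi-Lagrangian step for a toy 1D HJB-type problem
    with dynamics x' = a, a in `controls`, on the grid `xs`.

    The foot of the characteristic x + a*dt is evaluated by linear
    interpolation; its distance from x spans roughly a*dt/dx cells,
    so the stencil is wide when dt >> dx.  Feet that overstep the
    domain are clamped to the boundary (stencil truncation).
    """
    u_new = np.empty_like(u)
    lo, hi = xs[0], xs[-1]
    for i, x in enumerate(xs):
        candidates = []
        for a in controls:
            foot = min(max(x + a * dt, lo), hi)  # truncate at the boundary
            candidates.append(np.interp(foot, xs, u) + dt * running_cost)
        u_new[i] = min(candidates)  # minimize over the control set
    return u_new
```

Near the boundary the clamped foot shortens the effective step, which is the source of the locally stricter CFL restriction mentioned above and the motivation for implicit time stepping.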
Gradient methods for convex minimization: better rates under weaker conditions
The convergence behavior of gradient methods for minimizing convex
differentiable functions is one of the core questions in convex optimization.
This paper shows that their well-known complexities can be achieved under
conditions weaker than the commonly accepted ones. We relax the common gradient
Lipschitz-continuity condition and strong convexity condition to ones that hold
only over certain line segments. Specifically, we establish complexities
$O(L/\epsilon)$ and $O(\sqrt{L/\epsilon})$ for the ordinary and
accelerated gradient methods, respectively, assuming that $\nabla f$ is
Lipschitz continuous with constant $L$ over the line segment joining $x$ and
$x - \frac{1}{L}\nabla f(x)$ for each $x \in \mathrm{dom}\, f$. Then we improve them to
$O((L/\nu)\log(1/\epsilon))$ and $O(\sqrt{L/\nu}\,\log(1/\epsilon))$
for functions that also
satisfy the secant inequality
$\langle \nabla f(x), x - \mathrm{proj}(x)\rangle \ge \nu \|x - \mathrm{proj}(x)\|^2$
for each $x \in \mathrm{dom}\, f$ and its projection $\mathrm{proj}(x)$ to the minimizer set of $f$.
The secant condition is also shown to be necessary for the geometric decay of
solution error. Not only are the relaxed conditions met by more functions, the
restrictions give smaller $L$ and larger $\nu$ than they are without the
restrictions and thus lead to better complexity bounds. We apply these results
to sparse optimization and demonstrate a faster algorithm. Comment: 20 pages, 4 figures, typos are corrected, Theorem 2 is new
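The geometric decay of solution error can be seen on a strongly convex quadratic, where the secant-type constant is simply the smallest eigenvalue of the Hessian; the sketch below uses plain gradient descent with names of our choosing, not the paper's setting:

```python
import numpy as np

def gradient_descent(grad, x0, step, iters):
    """Plain gradient descent x_{k+1} = x_k - step * grad(x_k),
    recording the distance of each iterate to the minimizer x* = 0."""
    x = np.asarray(x0, dtype=float)
    errs = []
    for _ in range(iters):
        x = x - step * grad(x)
        errs.append(np.linalg.norm(x))  # ||x_k - x*|| with x* = 0
    return x, errs

# f(x) = 0.5 x^T A x with A = diag(1, 10): gradient Lipschitz constant
# L = 10, secant-type constant nu = 1 (here plain strong convexity),
# unique minimizer x* = 0.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x, errs = gradient_descent(grad, [1.0, 1.0], step=1 / 10, iters=50)
# With step 1/L, the error contracts by a factor 1 - nu/L = 0.9 per
# iteration, i.e. geometric (linear) convergence of the solution error.
```

Without a secant-type condition (e.g. a quadratic with a zero eigenvalue), the iterates still converge in function value at the slower O(L/ε) rate, but the solution error no longer decays geometrically, which is the necessity claim above in miniature.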
Upright posture and the meaning of meronymy: A synthesis of metaphoric and analytic accounts
Cross-linguistic strategies for mapping lexical and spatial relations from body partonym systems to external object meronymies (as in English ‘table leg’, ‘mountain face’) have attracted substantial research and debate over the past three decades. Due to the systematic mappings, lexical productivity and geometric complexities of body-based meronymies found in many Mesoamerican languages, the region has become focal for these discussions, prominently including contrastive accounts of the phenomenon in Zapotec and Tzeltal, leading researchers to question whether such systems should be explained as global metaphorical mappings from bodily source to target holonym or as vector mappings of shape and axis generated “algorithmically”. I propose a synthesis of these accounts in this paper by drawing on the species-specific cognitive affordances of human upright posture grounded in the reorganization of the anatomical planes, with a special emphasis on antisymmetrical relations that emerge between arm-leg and face-groin antinomies cross-culturally. Whereas Levinson argues that the internal geometry of objects “stripped of their bodily associations” (1994: 821) is sufficient to account for Tzeltal meronymy, making metaphorical explanations entirely unnecessary, I propose a more powerful, elegant explanation of Tzeltal meronymic mapping that affirms both the geometric-analytic and the global-metaphorical nature of Tzeltal meaning construal. I do this by demonstrating that the “algorithm” in question arises from the phenomenology of movement and correlative body memories—an experiential ground which generates a culturally selected pair of inverse contrastive paradigm sets with marked and unmarked membership emerging antithetically relative to the transverse anatomical plane. These relations are then selected diagrammatically for the classification of object orientations according to systematic geometric iconicities. 
Results not only serve to clarify the case in question but also point to the relatively untapped potential that upright posture holds for theorizing the emergence of human cognition, highlighting in the process the nature, origins and theoretical validity of markedness and double scope conceptual integration
Locating regions in a sequence under density constraints
Several biological problems require the identification of regions in a
sequence where some feature occurs within a target density range: examples
including the location of GC-rich regions, identification of CpG islands, and
sequence matching. Mathematically, this corresponds to searching a string of 0s
and 1s for a substring whose relative proportion of 1s lies between given lower
and upper bounds. We consider the algorithmic problem of locating the longest
such substring, as well as other related problems (such as finding the shortest
substring or a maximal set of disjoint substrings). For locating the longest
such substring, we develop an algorithm that runs in O(n) time, improving upon
the previous best-known O(n log n) result. For the related problems we develop
O(n log log n) algorithms, again improving upon the best-known O(n log n)
results. Practical testing verifies that our new algorithms enjoy significantly
smaller time and memory footprints, and can process sequences that are orders
of magnitude longer as a result. Comment: 17 pages, 8 figures; v2: minor revisions, additional explanations; to
appear in SIAM Journal on Computing
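For the single lower-bound case, the standard reduction maps density to sign: a substring has a proportion of 1s at least θ exactly when the substring sums of b − θ are nonnegative. The sketch below finds the longest such substring in O(n) under that simplification; it is not the paper's two-bound algorithm, and the names are ours:

```python
def longest_at_least(bits, theta):
    """Longest substring of `bits` whose proportion of 1s is >= theta.

    Reduction: replace each bit b by b - theta; a substring meets the
    density bound iff its transformed sum is >= 0.  The longest
    nonnegative-sum substring is found in O(n): only strict running
    minima of the prefix sums are useful left endpoints, and they are
    matched greedily against right endpoints scanned from the right.
    """
    n = len(bits)
    prefix = [0.0] * (n + 1)
    for i, b in enumerate(bits):
        prefix[i + 1] = prefix[i] + (b - theta)
    # strictly decreasing prefix minima: candidate left endpoints
    lefts = [0]
    for i in range(1, n + 1):
        if prefix[i] < prefix[lefts[-1]]:
            lefts.append(i)
    best = 0
    k = len(lefts) - 1
    for j in range(n, 0, -1):          # right endpoints, right to left
        while k >= 0 and prefix[lefts[k]] <= prefix[j]:
            best = max(best, j - lefts[k])  # valid pair: sum >= 0
            k -= 1
    return best
```

Each candidate left endpoint is popped at most once, so the two loops together are linear; handling the upper density bound simultaneously is what requires the more involved machinery of the paper.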