An Efficient Dynamic Programming Algorithm for the Generalized LCS Problem with Multiple Substring Exclusion Constraints
In this paper, we consider a generalized longest common subsequence problem
with multiple substring exclusion constraints. For the two input sequences
$X$ and $Y$ of lengths $n$ and $m$, and a set of constraints $P$
of total length $r$, the problem is to find a common subsequence $Z$ of $X$ and $Y$
excluding each constraint string in $P$ as a substring, such that the length of $Z$
is maximized. The problem was declared to be NP-hard\cite{1}, but we
finally found that this is not true. A new dynamic programming solution for
this problem is presented in this paper. The correctness of the new algorithm
is proved. The time complexity of our algorithm is $O(nmr)$.

Comment: arXiv admin note: substantial text overlap with arXiv:1301.718
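The single-constraint special case of this dynamic program can be sketched as follows: a third DP dimension tracks, via a KMP-style automaton, how long a suffix of the chosen subsequence matches a prefix of the forbidden string, and the automaton is never allowed to reach an accepting state. This is an illustrative sketch, not the paper's algorithm (which handles a *set* of constraints, e.g. via a multi-pattern automaton); all function names are assumptions.

```python
def build_failure(p):
    # KMP failure function: fail[i] = length of the longest proper
    # prefix of p[:i+1] that is also a suffix of it
    fail = [0] * len(p)
    k = 0
    for i in range(1, len(p)):
        while k > 0 and p[i] != p[k]:
            k = fail[k - 1]
        if p[i] == p[k]:
            k += 1
        fail[i] = k
    return fail


def ec_lcs_length(x, y, p):
    # Length of the longest common subsequence of x and y that does NOT
    # contain the nonempty string p as a contiguous substring.
    # f[i][j][k] = best length over common subsequences of x[:i], y[:j]
    # whose longest suffix matching a prefix of p has length k (k < |p|).
    n, m, s = len(x), len(y), len(p)
    fail = build_failure(p)

    def step(k, c):
        # automaton transition: extend a match of length k with character c
        while k > 0 and p[k] != c:
            k = fail[k - 1]
        return k + 1 if p[k] == c else k

    f = [[[-1] * s for _ in range(m + 1)] for _ in range(n + 1)]
    f[0][0][0] = 0  # -1 marks unreachable states
    for i in range(n + 1):
        for j in range(m + 1):
            for k in range(s):
                v = f[i][j][k]
                if v < 0:
                    continue
                if i < n and f[i + 1][j][k] < v:
                    f[i + 1][j][k] = v          # skip x[i]
                if j < m and f[i][j + 1][k] < v:
                    f[i][j + 1][k] = v          # skip y[j]
                if i < n and j < m and x[i] == y[j]:
                    k2 = step(k, x[i])
                    if k2 < s and f[i + 1][j + 1][k2] < v + 1:
                        f[i + 1][j + 1][k2] = v + 1  # take the match
    return max(f[n][m])
```

For example, `ec_lcs_length("ab", "ab", "ab")` is 1: the full LCS "ab" contains the forbidden substring, so only a single character survives. Precomputing a transition table for `step` makes each transition O(1), giving O(nm|p|) time for this single-constraint case.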
Optimal trajectory generation in ocean flows
In this paper it is shown that Lagrangian Coherent
Structures (LCS) are useful in determining near optimal
trajectories for autonomous underwater gliders in a dynamic
ocean environment. This opens the opportunity for optimal
path planning of autonomous underwater vehicles by studying
the global flow geometry via dynamical systems methods. Optimal
glider paths were computed for a 2-dimensional kinematic
model of an end-point glider problem. Numerical solutions to
the optimal control problem were obtained using Nonlinear
Trajectory Generation (NTG) software. The resulting solution
is compared to corresponding results on LCS obtained using
the Direct Lyapunov Exponent method. The velocity data
used for these computations was obtained from measurements
taken in August 2000 by HF-Radar stations located around
Monterey Bay, CA.
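The Direct Lyapunov Exponent approach referenced above extracts LCS as ridges of the finite-time Lyapunov exponent (FTLE) field: particles are advected by the flow, and the FTLE measures the stretching of the flow map. A minimal single-point sketch, using forward Euler integration and a toy steady shear flow in place of the measured HF-radar velocities (the flow, step sizes, and function names are all illustrative assumptions):

```python
import math

def flow_map(x, y, T, dt, velocity):
    # advect one particle from t=0 to t=T with forward Euler
    t = 0.0
    while t < T:
        u, v = velocity(x, y, t)
        x += u * dt
        y += v * dt
        t += dt
    return x, y

def ftle_at(x, y, T, dt, velocity, eps=1e-4):
    # finite-difference gradient of the flow map around (x, y)
    xr, yr = flow_map(x + eps, y, T, dt, velocity)
    xl, yl = flow_map(x - eps, y, T, dt, velocity)
    xu, yu = flow_map(x, y + eps, T, dt, velocity)
    xd, yd = flow_map(x, y - eps, T, dt, velocity)
    a = (xr - xl) / (2 * eps); b = (xu - xd) / (2 * eps)
    c = (yr - yl) / (2 * eps); d = (yu - yd) / (2 * eps)
    # largest eigenvalue of the Cauchy-Green tensor F^T F,
    # for the 2x2 flow-map gradient F = [[a, b], [c, d]]
    p = a * a + c * c
    q = a * b + c * d
    r = b * b + d * d
    lam_max = 0.5 * (p + r + math.sqrt((p - r) ** 2 + 4 * q * q))
    return math.log(math.sqrt(lam_max)) / abs(T)

# toy steady shear flow u = y, v = 0 (an assumed stand-in for real data)
shear = lambda x, y, t: (y, 0.0)
```

For the shear flow the flow-map gradient is [[1, T], [0, 1]], so the FTLE at T = 1 is ln(sqrt((3 + sqrt(5))/2)) ≈ 0.48; ridges of this field over a grid of initial conditions mark the repelling LCS.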
Multivariate Fine-Grained Complexity of Longest Common Subsequence
We revisit the classic combinatorial pattern matching problem of finding a
longest common subsequence (LCS). For strings $x$ and $y$ of length $n$, a
textbook algorithm solves LCS in time $O(n^2)$, but although much effort has
been spent, no $O(n^{2-\varepsilon})$-time algorithm is known. Recent work
indeed shows that such an algorithm would refute the Strong Exponential Time
Hypothesis (SETH) [Abboud, Backurs, Vassilevska Williams + Bringmann,
K\"unnemann FOCS'15].
Despite the quadratic-time barrier, for over 40 years an enduring scientific
interest continued to produce fast algorithms for LCS and its variations.
Particular attention was put into identifying and exploiting input parameters
that yield strongly subquadratic time algorithms for special cases of interest,
e.g., differential file comparison. This line of research was successfully
pursued until 1990, at which time significant improvements came to a halt. In
this paper, using the lens of fine-grained complexity, our goal is to (1)
justify the lack of further improvements and (2) determine whether some special
cases of LCS admit faster algorithms than currently known.
To this end, we provide a systematic study of the multivariate complexity of
LCS, taking into account all parameters previously discussed in the literature:
the input size $n := \max\{|x|, |y|\}$, the length of the shorter string
$m := \min\{|x|, |y|\}$, the length $L$ of an LCS of $x$ and $y$, the numbers of
deletions $\delta := m - L$ and $\Delta := n - L$, the alphabet size, as well as
the numbers of matching pairs $M$ and dominant pairs $d$. For any class of
instances defined by fixing each parameter individually to a polynomial in
terms of the input size, we prove a SETH-based lower bound matching one of
three known algorithms. Specifically, we determine the optimal running time for
LCS under SETH as $(n + \min\{d, \delta \Delta, \delta m\})^{1 \pm o(1)}$.
[...]

Comment: Presented at SODA'18. Full Version. 66 pages
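The parameters in this multivariate analysis are straightforward to compute for a concrete instance. A small sketch, assuming the conventions above ($n := \max\{|x|,|y|\}$, $m := \min\{|x|,|y|\}$, $M$ = number of index pairs $(i, j)$ with $x[i] = y[j]$); the function names are illustrative:

```python
from collections import Counter

def lcs_length(x, y):
    # textbook O(|x| * |y|) dynamic program with one rolling row
    dp = [0] * (len(y) + 1)
    for a in x:
        prev = 0  # dp value from the previous row, previous column
        for j, b in enumerate(y, 1):
            cur = dp[j]
            dp[j] = prev + 1 if a == b else max(dp[j], dp[j - 1])
            prev = cur
    return dp[-1]

def lcs_parameters(x, y):
    # the multivariate parameters of an LCS instance (dominant pairs omitted)
    n, m = max(len(x), len(y)), min(len(x), len(y))
    L = lcs_length(x, y)
    cx, cy = Counter(x), Counter(y)
    M = sum(cx[c] * cy[c] for c in cx)  # matching pairs
    return {"n": n, "m": m, "L": L,
            "delta": m - L, "Delta": n - L, "M": M}
```

For instance, `lcs_parameters("abcd", "acbd")` yields L = 3 (an LCS is "abd"), delta = Delta = 1, and M = 4, since each character occurs once in each string.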