New simple efficient algorithms computing powers and runs in strings
Keywords: run in a string; square in a string; cube in a string; Dictionary of Basic Factors

Abstract: Three new simple O(n log n)-time algorithms related to repeating factors are presented in the paper. The first two algorithms employ only a basic textual data structure called the Dictionary of Basic Factors. Despite their simplicity, these algorithms not only detect the existence of powers (in particular, squares) in a string but also find all primitively rooted cubes (as well as higher powers) and all cubic runs. Our third O(n log n)-time algorithm computes all runs and is probably the simplest known efficient algorithm for this problem. It additionally uses the Longest Common Extension function; however, due to the relaxed running-time constraints, a simple O(n log n)-time implementation suffices. At the cost of a logarithmic factor in time complexity, we obtain novel algorithmic solutions for several classical string problems that are much simpler than the (usually quite sophisticated) linear-time algorithms.
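The Dictionary of Basic Factors underlying the first two algorithms can be sketched in a few lines. The following is a minimal illustrative Python implementation, not the paper's algorithms themselves: it builds the DBF by the standard doubling technique, uses it for O(1) equality tests of equal-length substrings via two overlapping power-of-two factors, and then applies a naive per-root-length square check (the function names and that naive check are assumptions for illustration; the paper's actual power/run detection is more refined).

```python
def build_dbf(s):
    """Dictionary of Basic Factors: name[k][i] is an integer identifying
    the substring s[i:i+2**k]; built by doubling in O(n log n)."""
    n = len(s)
    rank = {c: r for r, c in enumerate(sorted(set(s)))}
    name = [[rank[c] for c in s]]  # level 0: ranks of single characters
    length = 1
    while length < n:
        prev = name[-1]
        # pair the names of two adjacent factors of the previous length;
        # -1 is a sentinel for factors running past the end of the string
        pairs = [(prev[i], prev[i + length]) if i + length < n else (prev[i], -1)
                 for i in range(n)]
        rank = {p: r for r, p in enumerate(sorted(set(pairs)))}
        name.append([rank[p] for p in pairs])
        length *= 2
    return name

def substr_eq(name, i, j, m):
    """O(1) test whether s[i:i+m] == s[j:j+m], using two overlapping
    basic factors of length 2**k <= m."""
    if m == 0:
        return True
    k = m.bit_length() - 1      # largest power of two not exceeding m
    half = 1 << k
    lvl = name[k]
    return lvl[i] == lvl[j] and lvl[i + m - half] == lvl[j + m - half]

def has_square_of_root(s, name, p):
    """Naive use of the DBF: does s contain a square ww with |w| == p?
    O(n) per root length p, so O(n^2) over all p -- for illustration only."""
    return any(substr_eq(name, i, i + p, p) for i in range(len(s) - 2 * p + 1))
```

For example, in "abaabaab" the DBF detects the square "aa" (root length 1) and "abaaba" (root length 3), while no square with root length 2 occurs.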
Network Designs Via Signaling Dynamics On Geometric Dynamic Graphs
Artificial neural networks are often treated as black boxes: generally, only the states of a subset of the network are considered to determine its efficacy, while the relationship between a neural network's topology and its function remains under-theorized. For my analysis, I use a new class of event-driven recurrent neural networks (a geometric dynamic network modeled on canonical neurobiological signaling principles that allows input data to be directly encoded into its evolving dynamics) to forward a new type of machine learning approach. I accomplish this by first mapping causal neuronal signal flows in the C. elegans connectome to show how the dynamic evolution of signal flows results in a unique internal representation of particular input data. Second, I propose two distinct approaches to determining an upper bound on the amount of network dynamics needed to capture the signaling evolution of the system. Using the upper-bound values, I construct a mathematical object representing the causal neuronal signaling dynamics and delineate the interaction of sub-structures at various scales/heights of sub-graphs. Finally, based on recent theoretical propositions regarding optimal signaling in a geometric dynamic network, I show that neurons modify their axonal morphology so that the propagation time of an action potential and the membrane's refractory period become balanced. Thus, this work not only lays the foundation for constructing and analyzing a new class of artificial neural networks whose overall behavior and underlying dynamics are transparently coupled; it also provides fertile ground for future work on biologically inspired artificial intelligence.