Weighted dynamic finger in binary search trees
It is shown that the online binary search tree data structure GreedyASS
performs asymptotically as well on a sufficiently long sequence of searches as
any static binary search tree where each search begins from the previous search
(rather than the root). This bound is known to be equivalent to assigning each
item $j$ in the search tree a positive weight $w_j$ and bounding the amortized
search cost of the $i$-th search $x_i$ in the search sequence by
$O\!\left(1+\log\frac{\sum_{\min(x_{i-1},x_i)\le j\le \max(x_{i-1},x_i)} w_j}{\min(w_{x_{i-1}},\,w_{x_i})}\right)$.
This result is the strongest finger-type bound to be proven for
binary search trees. By setting the weights to be equal, one observes that our
bound implies the dynamic finger bound. Compared to the previous proof of the
dynamic finger bound for Splay trees, our result is significantly shorter,
stronger, simpler, and has reasonable constants.
Comment: An earlier version of this work appeared in the Proceedings of the Twenty-Seventh Annual ACM-SIAM Symposium on Discrete Algorithms.
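To see the claimed specialization concretely (a sketch, writing $w_j$ for the item weights and $x_{i-1}, x_i$ for consecutive searches; the exact normalization follows the paper's definitions): with all weights set equal to $1$, the weighted numerator simply counts the items lying between consecutive searches, so

```latex
O\!\left(1+\log\frac{\sum_{\min(x_{i-1},x_i)\le j\le \max(x_{i-1},x_i)} w_j}{\min(w_{x_{i-1}},\,w_{x_i})}\right)
\;=\;
O\!\left(1+\log\bigl(|x_i - x_{i-1}| + 1\bigr)\right),
```

which is exactly the classical dynamic finger bound.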
Bridging between sensor measurements and symbolic ontologies through conceptual spaces
The increasing availability of sensor data through a variety of sensor-driven devices raises the need to exploit the data observed by sensors with the help of formally specified knowledge representations, such as the ones provided by the Semantic Web. In order to facilitate such a Semantic Sensor Web, the challenge is to bridge between symbolic knowledge representations and the measured data collected by sensors. In particular, one needs to map a given set of arbitrary sensor data to a particular set of symbolic knowledge representations, e.g. ontology instances. This task is particularly challenging due to the potentially infinite variety of possible sensor measurements. Conceptual Spaces (CS) provide a means to represent knowledge in geometrical vector spaces in order to enable computation of similarities between knowledge entities by means of distance metrics. We propose an ontology for CS which allows one to refine symbolic concepts as CS and to ground instances to so-called prototypical members described by vectors. By computing spatial distances between a given set of sensor measurements and a finite set of prototypical members, the most similar instance can be identified. In this way, we provide a means to bridge between the real world as observed by sensors and symbolic representations. We also propose an initial implementation utilizing our approach for measurement-based Semantic Web Service discovery.
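The nearest-prototype step can be sketched in a few lines (a minimal illustration; the quality dimensions, prototype names, and coordinate values are hypothetical, not taken from the paper): a sensor measurement is mapped to the ontology instance whose prototypical member lies closest in the conceptual space.

```python
import math

# Hypothetical conceptual space with two quality dimensions:
# (temperature in deg C, relative humidity in %). Each ontology instance
# is grounded to one prototypical member, represented as a vector.
prototypes = {
    "Freezing": (-5.0, 80.0),
    "Mild":     (15.0, 60.0),
    "Hot":      (35.0, 30.0),
}

def euclidean(a, b):
    """Distance metric of the conceptual space (here: plain Euclidean)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(measurement):
    """Return the instance whose prototypical member is nearest."""
    return min(prototypes, key=lambda name: euclidean(prototypes[name], measurement))
```

A raw reading such as `(34.0, 25.0)` is thereby mapped to the symbolic instance `"Hot"` without any hand-written mapping rule.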
Exploiting conceptual spaces for ontology integration
The widespread use of ontologies raises the need to integrate distinct conceptualisations. Whereas the symbolic approach of established representation standards – based on first-order logic (FOL) and syllogistic reasoning – does not implicitly represent semantic similarities, ontology mapping addresses this problem by aiming to establish formal relations between a set of knowledge entities which represent the same or a similar meaning in distinct ontologies. However, manually or semi-automatically identifying similarity relationships is costly. Hence, we argue that representational facilities are required which make it possible to represent similarities implicitly. Whereas Conceptual Spaces (CS) address similarity computation through the representation of concepts as vector spaces, CS provide neither an implicit representational mechanism nor a means to represent arbitrary relations between concepts or instances. In order to overcome these issues, we propose a hybrid knowledge representation approach which extends FOL-based ontologies with a conceptual grounding through a set of CS-based representations. Consequently, semantic similarity between instances – represented as members in CS – is indicated by means of distance metrics. Hence, automatic similarity detection across distinct ontologies is supported in order to facilitate ontology integration.
Towards ontology interoperability through conceptual groundings
The widespread use of ontologies raises the need to resolve heterogeneities between distinct conceptualisations in order to support interoperability. The aim of ontology mapping is to establish formal relations between a set of knowledge entities which represent the same or a similar meaning in distinct ontologies. Whereas the symbolic approach of established SW representation standards – based on first-order logic and syllogistic reasoning – does not implicitly represent similarity relationships, the ontology mapping task strongly relies on identifying semantic similarities. However, since concept representations across distinct ontologies hardly ever equal one another, manually or even semi-automatically identifying similarity relationships is costly. Conceptual Spaces (CS) enable the representation of concepts as vector spaces which implicitly carry similarity information. But CS provide neither an implicit representational mechanism nor a means to represent arbitrary relations between concepts or instances. In order to overcome these issues, we propose a hybrid knowledge representation approach which extends first-order logic ontologies with a conceptual grounding through a set of CS-based representations. Consequently, semantic similarity between instances – represented as members in CS – is indicated by means of distance metrics. Hence, automatic similarity detection between instances across distinct ontologies is supported in order to facilitate ontology mapping.
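Under such a conceptual grounding, similarity-based mapping detection reduces to a distance test (a hedged sketch; the instance names, vectors, and threshold are illustrative assumptions): instances from two ontologies whose members lie close together in the shared conceptual space are proposed as mapping candidates.

```python
import math

def distance(a, b):
    """Distance metric of the shared conceptual space (Euclidean here)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def propose_mappings(onto_a, onto_b, threshold=1.0):
    """Propose cross-ontology mapping candidates.

    onto_a, onto_b: dicts {instance_name: member vector in the shared CS}.
    Pairs whose members lie within `threshold` of each other are proposed.
    """
    return [
        (name_a, name_b)
        for name_a, vec_a in onto_a.items()
        for name_b, vec_b in onto_b.items()
        if distance(vec_a, vec_b) < threshold
    ]
```

The threshold trades precision against recall; in practice it would have to be calibrated per conceptual space, since the distance scale depends on the chosen quality dimensions.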
On deformations of quintic and septic hypersurfaces
An old question of Mori asks whether in dimension at least three, any smooth
specialization of a hypersurface of prime degree is again a hypersurface. A
positive answer to this question is only known in degrees two and three. In
this paper, we settle the case of quintic hypersurfaces (in arbitrary
dimension) as well as the case of septics in dimension three. Our results
follow from numerical characterizations of the corresponding hypersurfaces. In
the case of quintics, this extends famous work of Horikawa who analysed
deformations of quintic surfaces.
Comment: 23 pages, final version, to appear in Journal de Mathématiques Pures et Appliquées.
A rational deferred correction approach to parabolic optimal control problems
The accurate and efficient solution of time-dependent PDE-constrained optimization problems is a challenging task, in large part due to the very high dimension of the matrix systems that need to be solved. We devise a new deferred correction method for coupled systems of time-dependent PDEs, allowing one to iteratively improve the accuracy of low-order time stepping schemes. We consider two variants of our method, a splitting and a coupling version, and analyze their convergence properties. We then test our approach on a number of PDE-constrained optimization problems. We obtain solution accuracies far superior to those achieved when solving a single discretized problem, in particular in cases where the accuracy is limited by the time discretization. Our approach allows for the direct reuse of existing solvers for the resulting matrix systems, as well as state-of-the-art preconditioning strategies.
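The deferred correction idea can be sketched on a scalar ODE (a minimal illustration only, not the paper's coupled PDE-optimization setting; the function name and the node/sweep counts are assumptions): start from a low-order forward-Euler solution and iteratively correct it, each sweep driving it toward a higher-order collocation solution.

```python
# Minimal deferred-correction sketch for y'(t) = f(t, y) on [t0, t0 + dt]:
# the correction sweeps have the (second-order) trapezoidal collocation
# solution as their fixed point, so accuracy improves with each sweep.
def sdc_step(f, y0, t0, dt, n_nodes=5, sweeps=3):
    h = dt / (n_nodes - 1)
    tau = [t0 + m * h for m in range(n_nodes)]
    # provisional low-order solution: forward Euler on the substeps
    y = [y0]
    for m in range(n_nodes - 1):
        y.append(y[m] + h * f(tau[m], y[m]))
    for _ in range(sweeps):
        fy = [f(t, v) for t, v in zip(tau, y)]    # f at the previous iterate
        y_new = [y0]
        for m in range(n_nodes - 1):
            quad = 0.5 * h * (fy[m] + fy[m + 1])  # trapezoidal quadrature
            # Euler step on the correction plus quadrature of the old solution
            y_new.append(y_new[m] + h * (f(tau[m], y_new[m]) - fy[m]) + quad)
        y = y_new
    return y[-1]
```

For y' = -y on [0, 1], a few sweeps reduce the error of the plain Euler provisional solution by more than an order of magnitude, mirroring the paper's point that correction sweeps recover accuracy lost to a low-order time discretization.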
Leverage Causes Fat Tails and Clustered Volatility
We build a simple model of leveraged asset purchases with margin calls.
Investment funds use what is perhaps the most basic financial strategy, called
"value investing", i.e. systematically attempting to buy underpriced assets.
When funds do not borrow, the price fluctuations of the asset are normally
distributed and uncorrelated across time. All this changes when the funds are
allowed to leverage, i.e. borrow from a bank, to purchase more assets than
their wealth would otherwise permit. During good times competition drives
investors to funds that use more leverage, because they have higher profits. As
leverage increases price fluctuations become heavy tailed and display clustered
volatility, similar to what is observed in real markets. Previous explanations
of fat tails and clustered volatility depended on "irrational behavior", such
as trend following. Here instead this comes from the fact that leverage limits
cause funds to sell into a falling market: A prudent bank makes itself locally
safer by putting a limit to leverage, so when a fund exceeds its leverage
limit, it must partially repay its loan by selling the asset. Unfortunately
this sometimes happens to all the funds simultaneously when the price is
already falling. The resulting nonlinear feedback amplifies large downward
price movements. At the extreme this causes crashes, but the effect is seen at
every time scale, producing a power law of price disturbances. A standard
(supposedly more sophisticated) risk control policy in which individual banks
base leverage limits on volatility causes leverage to rise during periods of
low volatility, and to contract more quickly when volatility gets high, making
these extreme fluctuations even worse.
Comment: 19 pages, 8 figures.
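The margin-call mechanism described above can be caricatured in a few lines (a toy single-fund sketch; the linear price-impact rule and all parameter values are illustrative assumptions, not the paper's model): when leverage exceeds the bank's cap, the fund sells to repay debt, the sale depresses the price, and the lower price pushes leverage back up, forcing further sales.

```python
# Toy margin-call spiral: one fund, one asset, a hard leverage cap,
# and a hypothetical linear price impact for each forced sale.
def margin_spiral(price, shares, debt, lev_max=10.0, impact=1e-4, rounds=20):
    for _ in range(rounds):
        equity = shares * price - debt
        if equity <= 0:
            break                        # fund wiped out
        leverage = shares * price / equity
        if leverage <= lev_max:
            break                        # within the bank's leverage cap
        # sell just enough at the current price to restore the cap ...
        sell = shares - lev_max * equity / price
        shares -= sell
        debt -= sell * price
        # ... but the sale itself depresses the price, raising leverage again
        price -= impact * sell
    return price, shares, debt
```

Starting the fund above its leverage cap triggers a cascade of successively smaller forced sales, each one depressing the price and partly undoing the previous deleveraging; this is the nonlinear feedback that, in the paper's full model, fattens the tails of the return distribution.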
Livelihoods, growth, and links to market towns in 15 Ethiopian villages
"This paper uses longitudinal data from 15 villages in rural Ethiopia to explore the nature and consequences of these links. It addresses the following questions: (1) What are the links between rural households and local urban centers? (2) Does better access to local market towns affect household economic behavior? and (3) Does better access to local market towns make households better off? ... In our results, market towns and cities are an important source of demand for products produced in rural areas, and rural residents are a source of demand for goods sold in urban areas. Improving the presence and quality of roads and improving transport are important factors that will further bind these spaces together and improve rural welfare." -- from Authors' Abstract
Cumulants of Hawkes point processes
We derive explicit, closed-form expressions for the cumulant densities of a
multivariate, self-exciting Hawkes point process, generalizing a result of
Hawkes in his earlier work on the covariance density and Bartlett spectrum of
such processes. To do this, we represent the Hawkes process in terms of a
Poisson cluster process and show how the cumulant density formulas can be
derived by enumerating all possible "family trees", representing complex
interactions between point events. We also consider the problem of computing
the integrated cumulants, characterizing the average measure of correlated
activity between events of different types, and derive the relevant equations.
Comment: 11 pages, 4 figures.
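As a worked special case of the first cumulant (the paper's multivariate formulas subsume it): for a univariate Hawkes process with baseline rate $\mu$ and excitation kernel $\phi$ satisfying $\|\phi\|_1 = \int_0^\infty \phi(t)\,dt < 1$, taking expectations of the intensity $\lambda(t) = \mu + \int_{-\infty}^{t} \phi(t-s)\,dN(s)$ in the stationary regime gives

```latex
\Lambda \;=\; \mu + \Lambda\,\|\phi\|_1
\qquad\Longrightarrow\qquad
\Lambda \;=\; \frac{\mu}{1-\|\phi\|_1}.
```

In the Poisson cluster representation used above, this says that each immigrant event (arriving at rate $\mu$) heads a family tree of expected total size $1/(1-\|\phi\|_1)$, which is the simplest instance of the tree-enumeration argument behind the higher cumulant densities.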