
    Onion Curve: A Space Filling Curve with Near-Optimal Clustering

    Space-filling curves (SFCs) are widely used in the design of indexes for spatial and temporal data. Clustering is a key metric for an SFC that measures how well the curve preserves locality in moving from higher dimensions to a single dimension. We present the onion curve, an SFC whose clustering performance is provably close to optimal for cube and near-cube shaped query sets, irrespective of the side length of the query. We show that, in contrast, the clustering performance of the widely used Hilbert curve can be far from optimal, even for cube-shaped queries. Since the clustering performance of an SFC is critical to the efficiency of multi-dimensional indexes based on it, the onion curve can deliver improved performance for data structures involving multi-dimensional data. Comment: The short version is published in ICDE 1
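
    To make the clustering metric concrete (the onion curve construction itself is not reproduced here), a minimal sketch that counts how many contiguous curve segments a rectangular query is split into under a given curve; Morton (Z-order) indexing stands in for the curve, and all names are illustrative.

```python
def morton_index(x, y):
    """Interleave the bits of x and y to obtain a Z-order (Morton) rank."""
    rank = 0
    for bit in range(16):
        rank |= ((x >> bit) & 1) << (2 * bit)
        rank |= ((y >> bit) & 1) << (2 * bit + 1)
    return rank

def clusters(query_cells, curve_rank):
    """Clustering metric: number of contiguous runs of curve ranks covered by a
    query region; fewer runs mean fewer non-sequential accesses when the query
    is answered from a one-dimensional index."""
    ranks = sorted(curve_rank(x, y) for (x, y) in query_cells)
    runs = 1
    for a, b in zip(ranks, ranks[1:]):
        if b != a + 1:
            runs += 1
    return runs

# A 2x2 query square at offset (1, 1); under Morton order it splits into 4 runs.
query = [(x, y) for x in range(1, 3) for y in range(1, 3)]
print(clusters(query, morton_index))  # -> 4
```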

    Effectiveness of landmark analysis for establishing locality in p2p networks

    Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measures to landmark nodes, which are used as a multi-dimensional feature vector. Each peer node uses the feature vector to generate a unique scalar index that is correlated with its topological locality. A popular dimensionality reduction technique is the space-filling Hilbert’s curve, as it possesses good locality-preserving properties. However, there exists little comparison between Hilbert’s curve and other techniques for dimensionality reduction. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated. Hilbert’s curve, Sammon’s mapping and Principal Component Analysis have been used to generate a 1d space with locality-preserving properties. This work provides empirical evidence to support the use of Hilbert’s curve in the context of locality preservation when generating peer identifiers by means of landmark vector analysis. A comparative analysis is carried out with an artificial 2d network model and with a realistic network topology model with a typical power-law distribution of node connectivity in the Internet. Nearest neighbour analysis confirms Hilbert’s curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results in the realistic network model show that there is scope for improvement, and better techniques to preserve locality information are required.
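
    A minimal sketch of the identifier-generation step described above, assuming only two landmark latencies and a simple quantization onto a 2^order x 2^order grid; the 2-D Hilbert-rank routine follows the standard bit-manipulation construction, and the quantization parameters and names are illustrative, not taken from the paper.

```python
def hilbert_index(order, x, y):
    """Hilbert rank of cell (x, y) in a 2**order x 2**order grid."""
    n = 1 << order
    d = 0
    s = n >> 1
    while s > 0:
        rx = 1 if (x & s) else 0
        ry = 1 if (y & s) else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate/reflect so the next level is visited in canonical orientation.
        if ry == 0:
            if rx == 1:
                x = n - 1 - x
                y = n - 1 - y
            x, y = y, x
        s >>= 1
    return d

def peer_id(latency_a_ms, latency_b_ms, max_ms=500.0, order=10):
    """Quantize two landmark latencies onto the grid and return the Hilbert
    rank, so peers with similar latency vectors tend to get nearby identifiers."""
    scale = ((1 << order) - 1) / max_ms
    x = int(min(latency_a_ms, max_ms) * scale)
    y = int(min(latency_b_ms, max_ms) * scale)
    return hilbert_index(order, x, y)

print(peer_id(35.0, 120.0))
```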

    Deterministic global optimization using space-filling curves and multiple estimates of Lipschitz and Holder constants

    In this paper, the global optimization problem $\min_{y\in S} F(y)$ with $S$ being a hyperinterval in $\Re^N$ and $F(y)$ satisfying the Lipschitz condition with an unknown Lipschitz constant is considered. It is supposed that the function $F(y)$ can be multiextremal, non-differentiable, and given as a 'black box'. To attack the problem, a new global optimization algorithm based on the following two ideas is proposed and studied both theoretically and numerically. First, the new algorithm uses numerical approximations to space-filling curves to reduce the original Lipschitz multi-dimensional problem to a univariate one satisfying the Hölder condition. Second, at each iteration the algorithm applies a new geometric technique working with a number of possible Hölder constants chosen from a set of values varying from zero to infinity, showing that ideas introduced in the popular DIRECT method can be used in Hölder global optimization. Convergence conditions of the resulting deterministic global optimization method are established. Numerical experiments carried out on several hundred test functions show quite promising performance of the new algorithm in comparison with its direct competitors. Comment: 26 pages, 10 figures, 4 tables
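
    The reduction that the first idea relies on can be written compactly; the display below only restates it in symbols, and since the abstract does not give the exact Hölder constant used by the method, it is left generic here.

```latex
% Space-filling-curve reduction: y(x), x in [0,1], is a Peano-type curve
% filling the hyperinterval S \subset \Re^N.
\[
  \min_{y \in S} F(y) \;=\; \min_{x \in [0,1]} f(x),
  \qquad f(x) = F\bigl(y(x)\bigr),
\]
\[
  |f(x') - f(x'')| \;\le\; H\,|x' - x''|^{1/N}, \qquad x', x'' \in [0,1],
\]
% where the Hölder constant H depends only on the Lipschitz constant of F
% and on the dimension N.
```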

    Superfluidity of fermions with repulsive on-site interaction in an anisotropic optical lattice near a Feshbach resonance

    We present a numerical study of ground-state properties of a one-dimensional (1D) general Hubbard model (GHM) with particle-assisted tunnelling rates and repulsive on-site interaction (positive-U), which describes fermionic atoms in an anisotropic optical lattice near a wide Feshbach resonance. For our calculation, we utilize the time-evolving block decimation (TEBD) algorithm, which is an extension of the density matrix renormalization group and provides a well-controlled method for 1D systems. We show that the positive-U GHM, when hole-doped from half-filling, exhibits a phase with coexistence of quasi-long-range superfluid and charge-density-wave orders. This feature is different from the property of the conventional Hubbard model with positive-U, indicating that the particle-assisted tunnelling mechanism in the GHM brings in qualitatively new physics. Comment: updated with published version
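
    For orientation, a schematic Hamiltonian of the type described above; the occupation-dependent hopping amplitude t(...) is written generically, since the paper's exact parametrization of the GHM is not reproduced in the abstract.

```latex
% Schematic 1D Hubbard-type Hamiltonian with particle-assisted tunnelling:
% the hopping amplitude depends on the occupations of the opposite-spin
% species on the two sites involved (illustrative form, not the paper's
% exact GHM).
\[
  H = -\sum_{i,\sigma} t\bigl(n_{i,\bar\sigma}, n_{i+1,\bar\sigma}\bigr)
      \bigl( c^{\dagger}_{i\sigma} c_{i+1\sigma} + \mathrm{h.c.} \bigr)
      + U \sum_{i} n_{i\uparrow} n_{i\downarrow}, \qquad U > 0 .
\]
```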

    Index Information Algorithm with Local Tuning for Solving Multidimensional Global Optimization Problems with Multiextremal Constraints

    Multidimensional optimization problems in which the objective function and the constraints are multiextremal, non-differentiable Lipschitz functions (with unknown Lipschitz constants) and the feasible region is a finite collection of robust nonconvex subregions are considered. Both the objective function and the constraints may be partially defined. To solve such problems, an algorithm is proposed that uses Peano space-filling curves and the index scheme to reduce the original problem to a Hölder one-dimensional one. Local tuning on the behaviour of the objective function and constraints is used during the work of the global optimization procedure in order to accelerate the search. The method uses neither penalty coefficients nor additional variables. Convergence conditions are established. Numerical experiments confirm the good performance of the technique. Comment: 29 pages, 5 figures
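
    Spelled out, the problem class and the reduction read as follows; this is only the standard statement implied by the abstract, with the details of the index scheme omitted.

```latex
% Constrained Lipschitz global optimization problem addressed by the algorithm:
\[
  \min F(y), \qquad g_j(y) \le 0, \;\; j = 1, \dots, m,
  \qquad y \in S \subset \Re^N,
\]
% where F and the g_j are multiextremal non-differentiable Lipschitz functions
% with unknown constants and may be only partially defined. Substituting a
% Peano-type space-filling curve y(x) reduces this to a one-dimensional Hölder
% problem on [0,1], which the index scheme handles without penalty coefficients
% or additional variables.
```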

    Sixteen space-filling curves and traversals for d-dimensional cubes and simplices

    This article describes sixteen different ways to traverse d-dimensional space recursively in a way that is well-defined for any number of dimensions. Each of these traversals has distinct properties that may be beneficial for certain applications. Some of the traversals are novel, some have been known in principle but had not been described adequately for any number of dimensions, and some were already known. This article is the first to present them all in a consistent notation system. Furthermore, this article provides tools to enumerate the points of a regular grid in the order in which they are visited by each traversal. In particular, we cover: five discontinuous traversals based on subdividing cubes into 2^d subcubes: the Z-traversal (Morton indexing), U-traversal, Gray-code traversal, Double-Gray-code traversal, and Inside-out traversal; two discontinuous traversals based on subdividing simplices into 2^d subsimplices: the Hill-Z traversal and the Maehara-reflected traversal; five continuous traversals based on subdividing cubes into 2^d subcubes: the Base-camp Hilbert curve, the Harmonious Hilbert curve, the Alfa Hilbert curve, the Beta Hilbert curve, and the Butz-Hilbert curve; and four continuous traversals based on subdividing cubes into 3^d subcubes: the Peano curve, the Coil curve, the Half-coil curve, and the Meurthe curve. All of these traversals are self-similar in the sense that the traversal in each of the subcubes or subsimplices of a cube or simplex, on any level of recursive subdivision, can be obtained by scaling, translating, rotating, reflecting and/or reversing the traversal of the complete unit cube or simplex. Comment: 28 pages, 12 figures. v2: fixed a confusing typo on page 12, line
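
    Of the sixteen traversals, the Z-traversal (Morton indexing) is the simplest to state concretely; the sketch below enumerates the points of a regular grid in that order for any number of dimensions d (the other traversals require the article's state tables and are not reproduced here).

```python
def morton_index(point, bits):
    """Z-traversal (Morton) rank of a d-dimensional grid point.

    Interleaves the bits of the coordinates: bit b of coordinate j becomes
    bit (b * d + j) of the rank, so the traversal recursively subdivides the
    cube into 2**d subcubes, as described above.
    """
    d = len(point)
    rank = 0
    for b in range(bits):
        for j, coord in enumerate(point):
            rank |= ((coord >> b) & 1) << (b * d + j)
    return rank

# Enumerate the 16 cells of a 4x4 grid in Z-traversal order.
cells = sorted(((x, y) for x in range(4) for y in range(4)),
               key=lambda p: morton_index(p, bits=2))
print(cells)
```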

    Discrete denoising of heterogeneous two-dimensional data

    We consider discrete denoising of two-dimensional data with characteristics that may vary abruptly between regions. Using a quadtree decomposition technique and space-filling curves, we extend the recently developed S-DUDE (Shifting Discrete Universal DEnoiser), which was tailored to one-dimensional data, to the two-dimensional case. Our scheme competes with a genie that has access, in addition to the noisy data, also to the underlying noiseless data, and can employ $m$ different two-dimensional sliding-window denoisers along $m$ distinct regions obtained by a quadtree decomposition with $m$ leaves, in a way that minimizes the overall loss. We show that, regardless of what the underlying noiseless data may be, the two-dimensional S-DUDE performs essentially as well as this genie, provided that the number of distinct regions satisfies $m = o(n)$, where $n$ is the total size of the data. The resulting algorithm complexity is still linear in both $n$ and $m$, as in the one-dimensional case. Our experimental results show that the two-dimensional S-DUDE can be effective when the characteristics of the underlying clean image vary across different regions of the data. Comment: 16 pages, submitted to IEEE Transactions on Information Theory
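
    One ingredient of the extension, reading the two-dimensional data along a space-filling scan so that a one-dimensional scheme can be applied, can be sketched as follows; the Z-order scan and the denoiser placeholder are illustrative assumptions, not the actual two-dimensional S-DUDE.

```python
def zorder_scan(n):
    """Visit the cells of an n x n grid (n a power of two) in Z-order, so that
    spatially nearby cells tend to stay nearby in the 1-D sequence."""
    def rank(x, y):
        r, b = 0, 0
        while (1 << b) < n:
            r |= ((x >> b) & 1) << (2 * b)
            r |= ((y >> b) & 1) << (2 * b + 1)
            b += 1
        return r
    return sorted(((x, y) for x in range(n) for y in range(n)),
                  key=lambda p: rank(*p))

def denoise_2d(noisy, one_d_denoiser):
    """Flatten along the space-filling scan, denoise in 1-D, and fold back."""
    n = len(noisy)
    order = zorder_scan(n)
    sequence = [noisy[y][x] for (x, y) in order]
    cleaned = one_d_denoiser(sequence)   # e.g. a sliding-window denoiser
    out = [[0] * n for _ in range(n)]
    for (x, y), value in zip(order, cleaned):
        out[y][x] = value
    return out

# Usage (denoiser supplied by the caller):
# restored = denoise_2d(noisy_image, my_sliding_window_denoiser)
```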

    A Study of Energy and Locality Effects using Space-filling Curves

    The cost of energy is becoming an increasingly important driver for the operating cost of HPC systems, adding yet another facet to the challenge of producing efficient code. In this paper, we investigate the energy implications of trading computation for locality using Hilbert and Morton space-filling curves with dense matrix-matrix multiplication. The advantage of these curves is that they exhibit an inherent tiling effect without requiring architecture-specific tuning. By accessing the matrices in the order determined by the space-filling curves, we can trade computation for locality. The index computation overhead of the Morton curve is found to be balanced against its locality and energy efficiency, while the overhead of the Hilbert curve outweighs its improvements on our test system. Comment: Proceedings of the 2014 IEEE International Parallel & Distributed Processing Symposium Workshops (IPDPSW)
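
    A minimal sketch of the kind of access pattern the study evaluates: the tile loop of a dense matrix-matrix multiplication is driven by a Morton (Z-order) index over the tile grid, which yields the tiling-like locality mentioned above without architecture-specific tuning. The tile size, helper names, and loop structure are illustrative, not the paper's implementation.

```python
def morton_decode(rank, bits):
    """Invert 2-D Morton encoding: rank -> (i, j) tile coordinates."""
    i = j = 0
    for b in range(bits):
        i |= ((rank >> (2 * b)) & 1) << b
        j |= ((rank >> (2 * b + 1)) & 1) << b
    return i, j

def matmul_morton(A, B, tile=2):
    """C = A @ B with the (i, j) output tiles visited in Morton order
    (assumes square matrices whose tile grid is a power of two)."""
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    tiles = n // tile
    bits = max(1, tiles.bit_length() - 1)
    for r in range(tiles * tiles):
        ti, tj = morton_decode(r, bits)
        for tk in range(tiles):
            for i in range(ti * tile, (ti + 1) * tile):
                for j in range(tj * tile, (tj + 1) * tile):
                    s = C[i][j]
                    for k in range(tk * tile, (tk + 1) * tile):
                        s += A[i][k] * B[k][j]
                    C[i][j] = s
    return C

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]
print(matmul_morton(A, B, tile=1))  # [[19.0, 22.0], [43.0, 50.0]]
```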