
    Fast Construction of Nets in Low Dimensional Metrics, and Their Applications

    Full text link
    We present a near-linear time algorithm for constructing hierarchical nets in finite metric spaces with constant doubling dimension. This data structure is then applied to obtain improved algorithms for the following problems: approximate nearest neighbor search, well-separated pair decomposition, compact representation schemes, doubling measure, and computation of the (approximate) Lipschitz constant of a function. In all cases, the running (preprocessing) time is near-linear and the space used is linear. Comment: 41 pages. Extensive clean-up of minor English errors.
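
    The paper's construction is near-linear; the following is only a minimal brute-force Python sketch (quadratic time, with hypothetical function names) of what an r-net hierarchy is, meant to illustrate the object the abstract refers to rather than the authors' algorithm.

    ```python
    import numpy as np

    def greedy_r_net(points, r):
        """Indices of a simple r-net: every point lies within r of some net
        point, and net points are pairwise more than r apart. Brute-force
        O(n^2) sketch, not the paper's near-linear construction."""
        net = []
        for i, p in enumerate(points):
            if all(np.linalg.norm(p - points[j]) > r for j in net):
                net.append(i)
        return net

    def net_hierarchy(points, r_min, r_max):
        """Nets at geometrically decreasing scales r_max, r_max/2, ..., down to r_min."""
        levels, r = {}, r_max
        while r >= r_min:
            levels[r] = greedy_r_net(points, r)
            r /= 2.0
        return levels

    # usage: a hierarchy of nets for 100 random points in the plane
    pts = np.random.rand(100, 2)
    for scale, idx in net_hierarchy(pts, r_min=0.05, r_max=1.0).items():
        print(f"scale {scale:.3f}: {len(idx)} net points")
    ```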

    Finding a needle in an exponential haystack: Discrete RRT for exploration of implicit roadmaps in multi-robot motion planning

    Full text link
    We present a sampling-based framework for multi-robot motion planning which combines an implicit representation of a roadmap with a novel approach to pathfinding in geometrically embedded graphs tailored to our setting. Our pathfinding algorithm, discrete RRT (dRRT), is an adaptation of the celebrated RRT algorithm to the discrete case of a graph, and it enables rapid exploration of the high-dimensional configuration space by carefully walking through an implicit representation of a tensor product of the roadmaps for the individual robots. We demonstrate our approach experimentally on scenarios with up to 60 degrees of freedom, where our algorithm is faster by a factor of at least ten compared to existing algorithms we are aware of. Comment: Kiril Solovey and Oren Salzman contributed equally to this paper.
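
    As a rough illustration of the dRRT idea, the Python sketch below performs one expansion step over the implicit tensor-product roadmap: sample a random configuration per robot, find the nearest composite vertex in the tree, and let each robot step to the roadmap neighbor closest to its sample. The data structures (per-robot adjacency dicts and coordinate dicts) are assumptions for the sketch, and the robot-robot collision checks of the actual algorithm are omitted.

    ```python
    import numpy as np

    def drrt_expand(roadmaps, positions, tree):
        """One dRRT-style expansion over the implicit tensor-product roadmap.
        roadmaps[i]: adjacency dict {vertex: [neighbor vertices]} for robot i.
        positions[i]: dict {vertex: np.array coordinates} for robot i.
        tree: set of composite vertices (one roadmap vertex per robot).
        Simplified sketch: collision checking along the composite edge,
        which the real algorithm performs, is omitted here."""
        k = len(roadmaps)
        dim = {i: len(next(iter(positions[i].values()))) for i in range(k)}
        # random composite sample: one random point per robot
        q_rand = [np.random.rand(dim[i]) for i in range(k)]
        # nearest composite vertex already in the tree (sum of per-robot distances)
        v_near = min(tree, key=lambda v: sum(
            np.linalg.norm(positions[i][v[i]] - q_rand[i]) for i in range(k)))
        # direction oracle: each robot moves to the neighbor closest to its sample
        v_new = tuple(
            min(roadmaps[i][v_near[i]] + [v_near[i]],
                key=lambda u: np.linalg.norm(positions[i][u] - q_rand[i]))
            for i in range(k))
        tree.add(v_new)
        return v_new
    ```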

    Angle Tree: Nearest Neighbor Search in High Dimensions with Low Intrinsic Dimensionality

    Full text link
    We propose an extension of tree-based space-partitioning indexing structures for data with low intrinsic dimensionality embedded in a high-dimensional space. We call this extension an Angle Tree. Our extension can be applied both to classical kd-trees and to the more recent rp-trees. The key idea of our approach is to store the angle (the "dihedral angle") between the data region (which is a low-dimensional manifold) and the random hyperplane that splits the region (the "splitter"). We show that the dihedral angle can be used to obtain a tight lower bound on the distance between the query point and any point on the opposite side of the splitter. This, in turn, can be used to efficiently prune the search space. We introduce a novel randomized strategy to calculate the dihedral angle efficiently and with a high degree of accuracy. Experiments and analysis on real and synthetic data sets show that the Angle Tree is the most efficient known indexing structure for nearest neighbor queries in terms of preprocessing and space usage, while achieving high accuracy and fast search time. Comment: To be submitted to IEEE Transactions on Pattern Analysis and Machine Intelligence.
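
    To make the pruning idea concrete, here is a minimal Python sketch of how a stored dihedral angle could tighten the usual kd-tree prune test. It assumes a unit-length splitter normal and that both the query and the data lie on a (locally) flat manifold crossing the splitter, in which case the distance to any point beyond the splitter is at least margin / sin(angle); this is an illustrative assumption, not the paper's exact bound or analysis.

    ```python
    import numpy as np

    def opposite_side_lower_bound(margin, dihedral_angle):
        """Lower bound on the distance from the query to any point beyond the
        splitter. `margin` is the query's perpendicular distance to the
        splitting hyperplane; with a flat data manifold meeting the splitter
        at `dihedral_angle` (and the query on that manifold), the bound
        margin / sin(angle) is never smaller than the classical bound `margin`."""
        return margin / max(np.sin(dihedral_angle), 1e-12)

    def visit_far_child(query, splitter_normal, splitter_offset,
                        dihedral_angle, best_dist):
        """Prune rule: explore the far child only if it could still contain a
        point closer than the best distance found so far. `splitter_normal`
        is assumed to have unit length."""
        margin = abs(np.dot(query, splitter_normal) - splitter_offset)
        return opposite_side_lower_bound(margin, dihedral_angle) < best_dist
    ```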

    Approximate Nearest Neighbor Search for Low Dimensional Queries

    Full text link
    We study the Approximate Nearest Neighbor problem for metric spaces where the query points are constrained to lie on a subspace of low doubling dimension, while the data is high-dimensional. We show that this problem can be solved efficiently despite the high dimensionality of the data. Comment: 25 pages.

    Spin Glasses on the Hypercube

    Get PDF
    We present a mean-field model for spin glasses with a natural notion of distance built in, namely, the Edwards-Anderson model on the diluted D-dimensional unit hypercube in the limit of large D. We show that finite-D effects are strongly dependent on the connectivity, being much smaller for a fixed coordination number. We solve the nontrivial problem of generating these lattices. Afterwards, we numerically study the nonequilibrium dynamics of the mean-field spin glass. Our three main findings are: (i) the dynamics is ruled by an infinite number of time sectors, (ii) the aging dynamics consists of the growth of coherent domains with a non-vanishing surface-to-volume ratio, and (iii) the propagator in Fourier space follows the p^4 law. We also study finite-D effects in the nonequilibrium dynamics, finding that a naive finite-size scaling ansatz works surprisingly well. Comment: 14 pages, 22 figures.
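
    As a small illustration of the model being simulated, the Python sketch below builds a diluted D-dimensional hypercube lattice, draws random +/-J Edwards-Anderson couplings on its edges, and performs one Metropolis sweep. It dilutes edges independently with a fixed probability, which is a simplifying assumption; the paper instead fixes the coordination number, a construction it describes as nontrivial.

    ```python
    import random
    import numpy as np

    def diluted_hypercube_edges(D, keep_prob):
        """Edges of the D-dimensional hypercube (vertices = D-bit strings, edges
        between strings differing in one bit), each kept independently with
        probability keep_prob (simplification: not a fixed coordination number)."""
        edges = []
        for v in range(2 ** D):
            for d in range(D):
                u = v ^ (1 << d)
                if u > v and random.random() < keep_prob:
                    edges.append((v, u))
        return edges

    def ea_energy(spins, couplings):
        """Edwards-Anderson energy H = -sum_{<ij>} J_ij s_i s_j."""
        return -sum(J * spins[i] * spins[j] for (i, j), J in couplings.items())

    def metropolis_sweep(spins, couplings, neighbors, beta):
        """One Metropolis sweep of the +/-1 spins at inverse temperature beta."""
        for i in range(len(spins)):
            dE = 2 * spins[i] * sum(couplings[e] * spins[j] for j, e in neighbors[i])
            if dE <= 0 or random.random() < np.exp(-beta * dE):
                spins[i] = -spins[i]

    # usage: D = 6 gives 64 spins
    D = 6
    couplings = {e: random.choice([-1.0, 1.0]) for e in diluted_hypercube_edges(D, 0.5)}
    neighbors = {v: [] for v in range(2 ** D)}
    for (i, j) in couplings:
        neighbors[i].append((j, (i, j)))
        neighbors[j].append((i, (i, j)))
    spins = np.random.choice([-1, 1], size=2 ** D)
    metropolis_sweep(spins, couplings, neighbors, beta=1.0)
    print("energy per spin:", ea_energy(spins, couplings) / 2 ** D)
    ```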

    Identifying Real Estate Opportunities using Machine Learning

    Full text link
    The real estate market is exposed to many fluctuations in prices because of existing correlations with many variables, some of which cannot be controlled or might even be unknown. Housing prices can increase rapidly (or, in some cases, also drop very fast), yet the numerous listings available online where houses are sold or rented are not likely to be updated that often. In some cases, individuals interested in selling a house (or apartment) might include it in some online listing and forget to update the price. In other cases, some individuals might deliberately set a price below the market price in order to sell the home faster, for various reasons. In this paper, we aim to develop a machine learning application that identifies opportunities in the real estate market in real time, i.e., houses that are listed with a price substantially below the market price. This program can be useful for investors interested in the housing market. We have focused on a use case considering real estate assets located in the Salamanca district in Madrid (Spain) and listed in the most relevant Spanish online site for home sales and rentals. The application is formally implemented as a regression problem that tries to estimate the market price of a house given features retrieved from public online listings. For building this application, we have performed a feature engineering stage in order to discover relevant features that allow attaining a high predictive performance. Several machine learning algorithms have been tested, including regression trees, k-nearest neighbors, support vector machines and neural networks, identifying the advantages and drawbacks of each of them. Comment: 24 pages, 13 figures, 5 tables.
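
    Below is a minimal sketch of the kind of model comparison described above, using scikit-learn. The listing features, CSV file name, and hyperparameters are hypothetical placeholders, not the authors' dataset or pipeline; the point is only to show the four model families being cross-validated on the same price-regression task.

    ```python
    import pandas as pd
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.svm import SVR
    from sklearn.neural_network import MLPRegressor

    # hypothetical listing features and target (asking price)
    df = pd.read_csv("listings.csv")  # placeholder file, not the authors' data
    X = df[["surface_m2", "rooms", "bathrooms", "floor", "has_lift", "has_terrace"]]
    y = df["price"]

    models = {
        "regression tree": DecisionTreeRegressor(max_depth=8),
        "k-nearest neighbors": KNeighborsRegressor(n_neighbors=5),
        "support vector machine": SVR(C=10.0),
        "neural network": MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000),
    }

    # compare the model families with 5-fold cross-validated mean absolute error
    for name, model in models.items():
        pipe = make_pipeline(StandardScaler(), model)
        scores = cross_val_score(pipe, X, y, cv=5, scoring="neg_mean_absolute_error")
        print(f"{name}: MAE = {-scores.mean():,.0f}")
    ```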