    Euclidean Greedy Drawings of Trees

    Greedy embedding (or drawing) is a simple and efficient strategy for routing messages in wireless sensor networks. For every source-destination pair of nodes s, t in a greedy embedding, s has a neighbor u that is closer to t than s itself, according to some distance metric. The existence of greedy embeddings in the Euclidean plane R^2 is known for certain graph classes such as 3-connected planar graphs. We completely characterize the trees that admit a greedy embedding in R^2. This answers a question by Angelini et al. (Graph Drawing 2009) and is a further step towards characterizing the graphs that admit Euclidean greedy embeddings.
    Comment: Expanded version of a paper to appear in the 21st European Symposium on Algorithms (ESA 2013). 24 pages, 20 figures.
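
    To make the routing idea concrete, the sketch below performs greedy forwarding on a given plane embedding. The function name greedy_route and the positions/adj dictionaries are illustrative choices, not notation from the paper; the paper's contribution is characterizing which trees admit an embedding on which this procedure never gets stuck.

    import math

    def greedy_route(positions, adj, s, t):
        """Greedy forwarding on a Euclidean embedding (illustrative sketch).

        positions: dict node -> (x, y) coordinates of the embedding
        adj:       dict node -> iterable of neighboring nodes
        Returns the greedy path from s to t, or None if some node on the
        way has no neighbor strictly closer to t (routing gets stuck).
        """
        def dist(u, v):
            (ux, uy), (vx, vy) = positions[u], positions[v]
            return math.hypot(ux - vx, uy - vy)

        path, current = [s], s
        while current != t:
            # In a greedy embedding this minimum is always strictly closer to t.
            nxt = min(adj[current], key=lambda u: dist(u, t), default=None)
            if nxt is None or dist(nxt, t) >= dist(current, t):
                return None
            path.append(nxt)
            current = nxt
        return path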

    A geometric Littlewood-Richardson rule

    We describe an explicit geometric Littlewood-Richardson rule, interpreted as deforming the intersection of two Schubert varieties so that they break into Schubert varieties. There are no restrictions on the base field, and all multiplicities arising are 1; this is important for applications. This rule should be seen as a generalization of Pieri's rule to arbitrary Schubert classes, by way of explicit homotopies. It has a straightforward bijection to other Littlewood-Richardson rules, such as tableaux and Knutson and Tao's puzzles. This gives the first geometric proof and interpretation of the Littlewood-Richardson rule. It has a host of geometric consequences, described in the companion paper "Schubert induction". The rule also has an interpretation in K-theory, suggested by Buch, which gives an extension of puzzles to K-theory. The rule suggests a natural approach to the open question of finding a Littlewood-Richardson rule for the flag variety, leading to a conjecture, shown to be true up to dimension 5. Finally, the rule suggests approaches to similar open problems, such as Littlewood-Richardson rules for the symplectic Grassmannian and two-flag varieties.
    Comment: 46 pages, 43 figures.
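
    For orientation, the rule computes the structure constants of the cohomology of the Grassmannian in the Schubert basis; the display below is the standard statement together with the simplest Pieri-type instance, not an example taken from the paper.

    \[
      \sigma_\lambda \cdot \sigma_\mu \;=\; \sum_{\nu} c_{\lambda\mu}^{\nu}\,\sigma_\nu ,
      \qquad\text{e.g.}\qquad
      \sigma_{1}\cdot\sigma_{1} \;=\; \sigma_{2} + \sigma_{1,1}
      \quad\text{in } H^{*}\bigl(\mathrm{Gr}(2,4)\bigr),
    \]

    where the c_{\lambda\mu}^{\nu} are the Littlewood-Richardson coefficients that the geometric rule produces through explicit degenerations.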

    Integrating Local and Global Error Statistics for Multi-Scale RBF Network Training: An Assessment on Remote Sensing Data

    Background: This study discusses the theoretical underpinnings of a novel multi-scale radial basis function (MSRBF) neural network along with its application to classification and regression tasks in remote sensing. The novelty of the proposed MSRBF network lies in the integration of both local and global error statistics in the node selection process.
    Methodology and Principal Findings: The method was tested on a binary classification task, detection of impervious surfaces using a Landsat satellite image, and a regression problem, simulation of waveform LiDAR data. In the classification scenario, results indicate that the MSRBF is superior to existing radial basis function and back-propagation neural networks in terms of classification accuracy and training-testing consistency, especially for smaller datasets. The latter is particularly important because reference data acquisition is always an issue in remote sensing applications. In the regression case, the MSRBF provided improved accuracy and consistency when contrasted with a multi-kernel RBF network.
    Conclusion and Significance: Results highlight the potential of a novel training methodology that is not restricted to a specific algorithmic type, thereby advancing machine learning algorithms for classification and regression tasks. The MSRBF is expected to find numerous applications within and outside the remote sensing field.
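
    The abstract does not spell out the selection rule, so the sketch below shows only one plausible reading: forward selection of RBF centers scored by blending a global error reduction with a local error statistic around each candidate. The blend weight alpha, the neighborhood radius, and the least-squares refit are assumptions for illustration, not the authors' MSRBF formulation.

    import numpy as np

    def gaussian_design(X, centers, width):
        # Design matrix of Gaussian RBF activations for samples X.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * width ** 2))

    def greedy_rbf_fit(X, y, n_centers, width, radius, alpha=0.5):
        # Hypothetical local + global scoring; brute-force over candidates for clarity.
        centers = np.empty((0, X.shape[1]))
        weights = np.zeros(0)
        residual = y.astype(float).copy()
        for _ in range(n_centers):
            best_i, best_score = None, -np.inf
            for i in range(len(X)):
                cand = np.vstack([centers, X[i]])
                Phi = gaussian_design(X, cand, width)
                w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
                r = y - Phi @ w
                # Global statistic: drop in RMSE if this candidate is added.
                global_gain = np.sqrt(np.mean(residual ** 2)) - np.sqrt(np.mean(r ** 2))
                # Local statistic: current error inside a ball around the candidate.
                near = np.linalg.norm(X - X[i], axis=1) <= radius
                local_err = np.mean(np.abs(residual[near]))
                score = alpha * global_gain + (1.0 - alpha) * local_err
                if score > best_score:
                    best_i, best_score = i, score
            centers = np.vstack([centers, X[best_i]])
            Phi = gaussian_design(X, centers, width)
            weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
            residual = y - Phi @ weights
        return centers, weights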

    A greedy genetic algorithm for the quadratic assignment problem

    By Ravindra K. Ahuja, James B. Orlin, and Ashish Tiwari. Cover title. Includes bibliographical references (p. 22-24). Supported in part by ONR grant N00014-94-1-0099 and in part by UPS.

    Trigonometric bases for matrix weighted Lp-spaces

    We give a complete characterization of 2π-periodic matrix weights W for which the vector-valued trigonometric system forms a Schauder basis for the matrix weighted space Lp(T;W). Then trigonometric quasi-greedy bases for Lp(T;W) are considered. Quasi-greedy bases are systems for which the simple thresholding approximation algorithm converges in norm. It is proved that such a trigonometric basis can be quasi-greedy only for p = 2, and whenever the system forms a quasi-greedy basis, the basis must actually be a Riesz basis.
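
    In standard notation, if f has expansion f = Σ_k c_k(f) e_k in the basis (e_k), the thresholding (greedy) approximation keeps the m largest coefficients:

    \[
      G_m f \;=\; \sum_{k \in \Lambda_m(f)} c_k(f)\, e_k ,
      \qquad |\Lambda_m(f)| = m, \quad
      \min_{k \in \Lambda_m(f)} |c_k(f)| \;\ge\; \max_{k \notin \Lambda_m(f)} |c_k(f)|,
    \]

    and the basis is quasi-greedy precisely when G_m f converges to f in the norm of Lp(T;W) for every f, which is the convergence property referred to above.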