
    Manhattan Cutset Sampling and Sensor Networks.

    Cutset sampling is a new approach to acquiring two-dimensional data, i.e., images, where values are recorded densely along straight lines. This type of sampling is motivated by physical scenarios where data must be taken along straight paths, such as a boat taking water samples. Additionally, it may be possible to better reconstruct image edges using the dense data collected along lines. Finally, cutset sampling offers an advantage in the design of wireless sensor networks: if battery-powered sensors are placed densely along straight lines, the transmission energy required for communication between sensors can be reduced, thereby extending the network lifetime. A special case of cutset sampling is Manhattan sampling, where data is recorded along evenly-spaced rows and columns. This thesis examines Manhattan sampling in three contexts. First, we prove a sampling theorem demonstrating that an image can be perfectly reconstructed from Manhattan samples when its spectrum is bandlimited to the union of two Nyquist regions corresponding to the two lattices forming the Manhattan grid. An efficient "onion peeling" reconstruction method is provided, and we show that the Landau bound is achieved. This theorem is generalized to dimensions higher than two, where again signals are reconstructable from a Manhattan set if they are bandlimited to a union of Nyquist regions. Second, for non-bandlimited images, we present several algorithms for reconstructing natural images from Manhattan samples. The Locally Orthogonal Orientation Penalization (LOOP) algorithm is the best of the proposed algorithms in both subjective quality and mean-squared error; it reconstructs images well in general and outperforms competing algorithms for reconstruction from non-lattice samples. Third, we study cutset networks, which are new placement topologies for wireless sensor networks. Assuming a power-law model for communication energy, we show that cutset networks offer reduced communication energy costs over lattice and random topologies. Additionally, when solving centralized and decentralized source localization problems, cutset networks offer reduced energy costs over other topologies for fixed sensor densities and localization accuracies. Finally, with the eventual goal of analyzing different cutset topologies, we analyze the energy per distance required for efficient long-distance communication in lattice networks.
    PhD thesis, Electrical Engineering: Systems, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/120876/1/mprelee_1.pd
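    To make the Manhattan sampling geometry concrete, here is a minimal sketch that keeps only the pixels lying on evenly-spaced rows and columns; the spacings, test image, and function name are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

def manhattan_sample(image, row_spacing, col_spacing):
    """Keep only the pixels on evenly-spaced rows and columns (a Manhattan grid).

    Returns a boolean mask of sampled locations and the sampled values
    (zeros elsewhere). The spacings are illustrative parameters.
    """
    mask = np.zeros(image.shape, dtype=bool)
    mask[::row_spacing, :] = True   # dense samples along every k-th row
    mask[:, ::col_spacing] = True   # dense samples along every k-th column
    return mask, np.where(mask, image, 0)

# Example: an 8x8 ramp image sampled on a Manhattan grid with spacing 4.
img = np.arange(64, dtype=float).reshape(8, 8)
mask, samples = manhattan_sample(img, 4, 4)
print(mask.mean())  # fraction of pixels retained: 1/4 + 1/4 - 1/16 = 0.4375
```

    For spacings r and c the retained fraction is 1/r + 1/c - 1/(rc), simply by counting the overlap of the two lattices of lines.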

    Structure estimation for discrete graphical models: Generalized covariance matrices and their inverses

    We investigate the relationship between the structure of a discrete graphical model and the support of the inverse of a generalized covariance matrix. We show that for certain graph structures, the support of the inverse covariance matrix of indicator variables on the vertices of a graph reflects the conditional independence structure of the graph. Our work extends results previously established only in the context of multivariate Gaussian graphical models, thereby addressing an open question about the significance of the inverse covariance matrix of a non-Gaussian distribution. The proof exploits a combination of ideas from the geometry of exponential families, junction tree theory and convex analysis. These population-level results have various consequences for graph selection methods, both known and novel, including a method for structure estimation from missing or corrupted observations. We provide nonasymptotic guarantees for such methods and illustrate the sharpness of these predictions via simulations.
    Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org), http://dx.doi.org/10.1214/13-AOS1162.
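    As a quick population-level illustration of this phenomenon (not the paper's estimator), one can simulate a binary Markov chain, i.e., a discrete model on a path graph, and observe that the inverse of the ordinary covariance matrix is approximately supported on the edges of the path. The chain length, flip probability, and sample size below are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_binary_chain(n, p_flip=0.3, length=4):
    """Sample n draws of a +/-1 Markov chain X1 - X2 - X3 - X4 (a path graph)."""
    X = np.empty((n, length))
    X[:, 0] = rng.choice([-1.0, 1.0], size=n)
    for t in range(1, length):
        flip = rng.random(n) < p_flip          # each step flips the previous sign w.p. p_flip
        X[:, t] = np.where(flip, -X[:, t - 1], X[:, t - 1])
    return X

X = sample_binary_chain(200_000)
Theta = np.linalg.inv(np.cov(X.T))             # empirical inverse covariance
np.set_printoptions(precision=2, suppress=True)
print(Theta)  # entries for non-adjacent pairs, e.g. (1,3) and (1,4), are near zero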

    Lecture Notes on Network Information Theory

    These lecture notes have been converted to a book titled Network Information Theory, published recently by Cambridge University Press. The book provides a significantly expanded exposition of the material in the lecture notes, as well as problems and bibliographic notes at the end of each chapter. The authors are currently preparing a set of slides based on the book that will be posted in the second half of 2012. More information about the book can be found at http://www.cambridge.org/9781107008731/. The previous (and now obsolete) version of the lecture notes can be found at http://arxiv.org/abs/1001.3404v4/

    Similarity of Scenic Bilevel Images

    This paper was submitted to IEEE Transactions on Image Processing in May 2015. It presents a study of bilevel image similarity, including new objective metrics intended to quantify similarity consistent with human perception, and a subjective experiment to obtain ground truth for judging the performance of the objective similarity metrics. The focus is on scenic bilevel images, which are complex, natural or hand-drawn images, such as landscapes or portraits. The ground truth was obtained from ratings by 77 subjects of 44 distorted versions of seven scenic images, using a modified version of the SDSCE testing methodology. Based on hypotheses about human perception of bilevel images, several new metrics are proposed that outperform existing ones in the sense of attaining significantly higher Pearson and Spearman-rank correlation coefficients with respect to the ground truth from the subjective experiment. The new metrics include Adjusted Percentage Error, Bilevel Gradient Histogram and Connected Components Comparison. Combinations of these metrics are also proposed, which exploit their complementarity to attain even better performance. These metrics and the ground truth are then used to assess the relative severity of various kinds of distortion and the performance of several lossy bilevel compression methods.
    Main article: http://deepblue.lib.umich.edu/bitstream/2027.42/111737/2/Similarity of Scenic Bilevel Images.pdf
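    The exact definitions of the proposed metrics are given in the paper; as a loose sketch of the evaluation protocol only, the snippet below scores a plain percentage-error metric against stand-in subjective ratings using Pearson and Spearman-rank correlations. The images, distortion levels, and rating model are all synthetic placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

def percentage_error(ref, dist):
    """Fraction of pixels that differ between two bilevel (0/1) images."""
    return np.mean(ref != dist)

# Synthetic stand-ins for the study's distorted images and subjective ratings.
rng = np.random.default_rng(1)
refs = rng.integers(0, 2, size=(44, 64, 64))
flip_rates = rng.uniform(0.01, 0.2, (44, 1, 1))            # per-image distortion level
dists = refs ^ (rng.random(refs.shape) < flip_rates)       # random pixel flips
errs = np.array([percentage_error(r, d) for r, d in zip(refs, dists)])
ratings = 5.0 - 20.0 * errs + rng.normal(0, 0.2, 44)       # fabricated "ground truth"

# Higher correlation means the metric tracks the (here synthetic) perception better.
print(pearsonr(-errs, ratings)[0], spearmanr(-errs, ratings)[0])
```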

    Topics in Graph Theory: Extremal Intersecting Systems, Perfect Graphs, and Bireflexive Graphs

    In this thesis we investigate three different aspects of graph theory. Firstly, we consider intersecting systems of independent sets in graphs, and the extension of the classical theorem of Erdős, Ko and Rado to graphs. Our main results are a proof of an Erdős-Ko-Rado-type theorem for a class of trees, and a class of trees which form counterexamples to a conjecture of Hurlbert and Kamat, extending the previous counterexamples given by Baber. Secondly, we investigate perfect graphs, specifically edge modification aspects of perfect graphs and their subclasses. We give some alternative characterisations of perfect graphs in terms of edge modification, and consider the possible connection of the critically perfect graphs, previously studied by Wagler, to the Strong Perfect Graph Theorem. We prove that the situation where critically perfect graphs arise has no analogue in seven different subclasses of perfect graphs (e.g. chordal graphs, comparability graphs), and consider the connectivity of a bipartite reconfiguration-type graph associated to each of these subclasses. Thirdly, we consider a graph-theoretic structure called a bireflexive graph, in which every vertex is both adjacent and nonadjacent to itself, and use this to characterise modular decompositions as the surjective homomorphisms of these structures. We examine analogues of some graph-theoretic notions and define a "dual" version of the reconstruction conjecture.
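    The Erdős-Ko-Rado property for a graph says that some "star" (all independent r-sets through a fixed vertex) is a maximum pairwise-intersecting family of independent r-sets. As a brute-force illustration only, nothing like the thesis's proof technique, the sketch below checks this on a small path graph; the choice of graph and of r is arbitrary.

```python
from itertools import combinations

edges = {(i, i + 1) for i in range(6)}  # path on vertices 0..6, a small tree

def independent(vs):
    """True if no two vertices of vs are adjacent on the path."""
    return all((a, b) not in edges for a, b in combinations(sorted(vs), 2))

r = 2
sets = [frozenset(c) for c in combinations(range(7), r) if independent(c)]

def max_intersecting(sets):
    """Size of the largest pairwise-intersecting family, by exhaustive search
    over all families (feasible only for tiny cases like this one)."""
    best = 0
    for mask in range(1 << len(sets)):
        fam = [s for i, s in enumerate(sets) if mask >> i & 1]
        if all(a & b for a, b in combinations(fam, 2)):
            best = max(best, len(fam))
    return best

star = max(sum(1 for s in sets if v in s) for v in range(7))  # biggest star
print(max_intersecting(sets), star)  # equal iff the EKR property holds here
```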

    GRASP/VND Optimization Algorithms for Hard Combinatorial Problems

    Two hard combinatorial problems are addressed in this thesis. The first is the "Max CutClique", a combinatorial problem introduced by P. Martins in 2012: given a simple graph, the goal is to find a clique C such that the number of links between C and its complement is maximum. In a first contribution, a GRASP/VND methodology is proposed to tackle the problem. In a second, the NP-completeness of the problem is proved. Finally, a further generalization with weighted links is formally presented with a mathematical programming formulation, and the previous GRASP is adapted to the new problem. The second problem under study is a celebrated optimization problem from network reliability analysis. We assume a graph G with perfect nodes and imperfect links that fail independently with identical probability ρ ∈ [0,1]. The reliability R_G(ρ) is the probability that the resulting subgraph has some spanning tree. Given a number of nodes and links, p and q, the goal is to find the (p,q)-graph that has the maximum reliability R_G(ρ) uniformly in the compact set ρ ∈ [0,1]. In a first contribution, we exploit properties shared by all uniformly most-reliable graphs, such as maximum connectivity and maximum Kirchhoff number, in order to build a novel GRASP/VND methodology. Our proposal finds the global optimum in small cases, and it returns novel candidates for uniformly most-reliable graphs, such as the Möbius-Kantor and Heawood graphs. We also offer a literature review, and a mathematical proof that the bipartite graph K_{4,4} is uniformly most-reliable. Finally, an abstract mathematical model of Stochastic Binary Systems (SBS) is also studied. It is a further generalization of network reliability models, where failures are modelled by a general logical function. A geometrical approximation of a logical function is offered, as well as a novel method to find reliability bounds for general SBS. This bounding method combines algebraic duality, Markov's inequality and the Hahn-Banach separation theorem between convex and compact sets.
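    As a sketch of the reliability objective only (not of the GRASP/VND machinery), R_G(ρ) can be estimated by Monte Carlo: sample link failures and check whether the surviving links still span all nodes. The graph, failure probability, and trial count below are arbitrary choices.

```python
import random

def reliability_mc(nodes, links, rho, trials=100_000, seed=0):
    """Monte Carlo estimate of R_G(rho): links fail i.i.d. with probability rho,
    and the network survives iff the surviving links connect all nodes
    (equivalently, iff they contain a spanning tree)."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        up = [e for e in links if rng.random() >= rho]
        parent = list(range(nodes))          # union-find connectivity check
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for a, b in up:
            parent[find(a)] = find(b)
        ok += len({find(v) for v in range(nodes)}) == 1
    return ok / trials

# Example: the complete graph K4 (illustrative), failure probability 0.3.
k4 = [(a, b) for a in range(4) for b in range(a + 1, 4)]
print(reliability_mc(4, k4, 0.3))
```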

    Perceptual Image Similarity Metrics and Applications.

    This dissertation presents research in perceptual image similarity metrics and applications, e.g., content-based image retrieval, perceptual image compression, image similarity assessment and texture analysis. The first part aims to design texture similarity metrics consistent with human perception. A new family of statistical texture similarity features, called Local Radius Index (LRI), and corresponding similarity metrics are proposed. Compared to state-of-the-art metrics in the STSIM family, LRI-based metrics achieve better texture retrieval performance with much less computation. When applied to the recently developed perceptual image coder, Matched Texture Coding (MTC), they enable similar performance while significantly accelerating encoding. Additionally, in photographic paper classification, LRI-based metrics also outperform pre-existing metrics. To fulfill the needs of texture classification and other applications, a rotation-invariant version of LRI, called Rotation-Invariant Local Radius Index (RI-LRI), is proposed; it is also insensitive to grayscale and illuminance changes. The corresponding similarity metric achieves texture classification accuracy comparable to state-of-the-art metrics, while its much lower-dimensional feature vector requires substantially less computation and storage than other state-of-the-art texture features. The second part of the dissertation focuses on bilevel images, which are images whose pixels are either black or white. The contributions include new objective similarity metrics intended to quantify similarity consistent with human perception, and a subjective experiment to obtain ground truth for judging the performance of objective metrics. Several similarity metrics are proposed that outperform existing ones in the sense of attaining significantly higher Pearson and Spearman-rank correlations with the ground truth. The new metrics include Adjusted Percentage Error, Bilevel Gradient Histogram, Connected Components Comparison and combinations thereof. Another portion of the dissertation focuses on the aforementioned MTC, a block-based image coder that uses texture similarity metrics to decide whether blocks of the image can be encoded by pointing to perceptually similar ones in the already coded region. The key to its success is an effective texture similarity metric, such as an LRI-based metric, and an effective search strategy. Compared to traditional image compression algorithms, e.g., JPEG, MTC achieves similar coding rates with higher reconstruction quality, and its advantage grows as the coding rate decreases.
    PhD thesis, Electrical Engineering: Systems, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/113586/1/yhzhai_1.pd
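    The precise LRI definition is given in the dissertation; the snippet below is a loose, simplified analogue, assuming a single scan direction and ad hoc parameters, meant only to convey the idea of histogramming how far intensity stays locally flat around each pixel.

```python
import numpy as np

def radius_index(img, threshold=16, max_r=4):
    """Simplified, illustrative take on a Local Radius Index-style feature:
    for each pixel, measure how far one can move to the right before the
    intensity changes by more than `threshold` (capped at max_r), then
    histogram those distances. Parameters are assumptions, not LRI's spec."""
    h, w = img.shape
    radii = np.full((h, w), max_r, dtype=int)
    for r in range(1, max_r + 1):
        changed = np.abs(img[:, r:] - img[:, :-r]) > threshold
        sub = radii[:, :-r]                      # view into radii
        sub[changed & (sub == max_r)] = r - 1    # record first offset that changed
    hist = np.bincount(radii.ravel(), minlength=max_r + 1)
    return hist / hist.sum()  # normalized feature vector for similarity comparison

# Example: diagonal stripes of width 4; two textures could then be compared
# by any distance between their normalized histograms.
tex = ((np.indices((32, 32)).sum(0) // 4) % 2) * 255.0
print(radius_index(tex))
```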