
    Consistent relaxation matching for handwritten Chinese character recognition

    Due to the complexity in structure and the various distortions (translation, rotation, shifting, and deformation) found in different writing styles of Handwritten Chinese Characters (HCCs), a structural matching algorithm is better suited to computer recognition of HCCs. Relaxation matching is a powerful technique that can tolerate considerable distortion. However, most relaxation techniques developed so far for Handwritten Chinese Character Recognition (HCCR) are based on a probabilistic relaxation scheme. In this paper, based on the local constraints of relaxation labelling and on optimization theory, we apply a new relaxation matching technique to handwritten character recognition. From the properties of the compatibility constraints, several rules are devised to guide the design of the compatibility function, which plays an important role in the relaxation process. By parallel use of local contextual information about the geometric relationships among the strokes of two characters, the ambiguity between them can be relaxed iteratively to achieve an optimal consistent matching.
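The iterative relaxation update described above can be sketched generically. This is a minimal Rosenfeld-Hummel-Zucker style relaxation labelling loop, not the paper's algorithm: the `compat` values and initial probabilities below are illustrative toy numbers, and the paper's stroke features and compatibility-function design rules are not reproduced.

```python
# A minimal sketch of probabilistic relaxation labelling.
# Assumption: compatibilities are scaled so that 1 + support stays
# non-negative; the toy values in the usage below satisfy this.

def relax(p, compat, iters=20):
    """Iteratively update label probabilities p[i][l] using the
    support gathered from other objects via compat[i][l][j][m]."""
    n, labels = len(p), len(p[0])
    for _ in range(iters):
        new_p = []
        for i in range(n):
            row = []
            for l in range(labels):
                # support: how much the other objects' current beliefs
                # agree with assigning label l to object i
                q = sum(compat[i][l][j][m] * p[j][m]
                        for j in range(n) if j != i
                        for m in range(labels))
                row.append(p[i][l] * (1.0 + q))
            s = sum(row)
            new_p.append([v / s for v in row])  # renormalise per object
        p = new_p
    return p
```

With two objects whose compatibilities favour agreeing labels, an initial slight preference for label 0 is reinforced until both objects commit to it, mirroring how ambiguity between stroke correspondences is relaxed iteratively toward a consistent matching.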

    A Relaxation Scheme for Mesh Locality in Computer Vision.

    Parallel processing has been considered the key to building the computer systems of the future and has become a mainstream subject in Computer Science. Computer Vision applications are computationally intensive and require parallel approaches to exploit their intrinsic parallelism. This research addresses this need for low-level and intermediate-level vision problems. The contributions of this dissertation are a unified scheme, based on probabilistic relaxation labeling, that captures the localities of image data, and the use of this scheme to develop efficient parallel algorithms for Computer Vision problems. We begin by investigating the problem of skeletonization. The technique of pattern matching, which exhausts all possible interaction patterns between a pixel and its neighboring pixels, captures the locality of this problem and leads to an efficient One-pass Parallel Asymmetric Thinning Algorithm (OPATA_8). The use of 8-distance, or chessboard distance, in this algorithm not only improves the quality of the resulting skeletons but also improves the efficiency of the computation. This new algorithm plays an important role in a hierarchical route planning system that extracts high-level topological information from cross-country mobility maps, which greatly speeds up route searching over large areas. We generalize the neighborhood interaction description method to cover more complicated applications such as edge detection and image restoration. The proposed probabilistic relaxation labeling scheme exploits parallelism by discovering local interactions in neighboring areas and by describing them effectively. The scheme consists of a transformation function and a dictionary construction method. The non-linear transformation function is derived from Markov Random Field theory and efficiently combines evidence from neighborhood interactions. The dictionary construction method provides an efficient way to encode these localities.
A case study applies the scheme to the problem of edge detection. The relaxation step of this edge-detection algorithm greatly reduces noise effects, achieves better edge localization, for example at line ends and corners, and plays a crucial role in refining the edge output. Experiments on both synthetic and natural images show that our algorithm converges quickly and is robust in noisy environments.
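The 8-distance (chessboard) metric mentioned above assigns distance max(|Δx|, |Δy|) between pixels, so all eight neighbors of a pixel are at distance 1. A standard way to compute it is a two-pass distance transform; the sketch below is a generic illustration of that metric on a toy grid, not code from the dissertation.

```python
# Two-pass chessboard (8-distance) distance transform.
# grid: 2-D list, 1 = foreground, 0 = background; the result holds each
# cell's chessboard distance to the nearest foreground cell.

def chessboard_distance_transform(grid):
    INF = float("inf")
    h, w = len(grid), len(grid[0])
    d = [[0 if grid[y][x] else INF for x in range(w)] for y in range(h)]
    # forward raster scan: propagate from upper/left 8-neighbors
    for y in range(h):
        for x in range(w):
            for dy, dx in ((-1, -1), (-1, 0), (-1, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    d[y][x] = min(d[y][x], d[ny][nx] + 1)
    # backward raster scan: propagate from lower/right 8-neighbors
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            for dy, dx in ((1, 1), (1, 0), (1, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    d[y][x] = min(d[y][x], d[ny][nx] + 1)
    return d
```

Because diagonal steps cost the same as axial ones, the chessboard metric matches the 8-connected neighborhoods used by the thinning algorithm, which is why it yields better-centered skeletons than the 4-connected city-block metric.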

    Lifted Relax, Compensate and then Recover: From Approximate to Exact Lifted Probabilistic Inference

    We propose an approach to lifted approximate inference for first-order probabilistic models, such as Markov logic networks. It is based on performing exact lifted inference in a simplified first-order model, which is found by relaxing first-order constraints, and then compensating for the relaxation. These simplified models can be incrementally improved by carefully recovering constraints that have been relaxed, also at the first-order level. This leads to a spectrum of approximations, with lifted belief propagation on one end and exact lifted inference on the other. We discuss how relaxation, compensation, and recovery can all be performed at the first-order level, and show empirically that our approach substantially improves on the approximations of both propositional solvers and lifted belief propagation. Comment: Appears in Proceedings of the Twenty-Eighth Conference on Uncertainty in Artificial Intelligence (UAI 2012)

    Secure Layered Transmission in Multicast Systems with Wireless Information and Power Transfer

    This paper considers downlink multicast transmit beamforming for secure layered transmission systems with simultaneous wireless information and power transfer. We study power allocation algorithm design for minimizing the total transmit power in the presence of passive eavesdroppers and energy harvesting receivers. The algorithm design is formulated as a non-convex optimization problem. Our problem formulation promotes the dual use of energy signals in providing secure communication and facilitating efficient energy transfer. Besides, we take into account a minimum required power for energy harvesting at the idle receivers and heterogeneous quality of service (QoS) requirements for the multicast video receivers. In light of the intractability of the problem, we reformulate it by replacing a non-convex probabilistic constraint with a convex deterministic constraint. Then, a semidefinite programming relaxation (SDR) approach is adopted to obtain an upper bound solution for the reformulated problem. Subsequently, sufficient conditions for the global optimal solution of the reformulated problem are revealed. Furthermore, we propose two suboptimal power allocation schemes based on the upper bound solution. Simulation results demonstrate the excellent performance and significant transmit power savings achieved by the proposed schemes compared to isotropic energy signal generation. Comment: 7 pages, 3 figures, accepted for presentation at the IEEE International Conference on Communications (ICC), Sydney, Australia, 2014
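In SDR-based beamforming generally, the relaxed problem returns a positive semidefinite matrix W in place of the beamformer; when W happens to be rank one, W = w wᴴ and the beamformer w is recovered as its principal eigenvector. The sketch below shows only that recovery step via power iteration, on a toy real 2x2 matrix of my own choosing, not the paper's problem data or its two suboptimal schemes.

```python
import math

# Power iteration: recover the dominant eigenpair of a symmetric
# matrix W (list of lists). For a rank-one SDR solution W = w w^T,
# the returned eigenvector is the beamformer direction w / ||w||.

def principal_eigvec(W, iters=200):
    n = len(W)
    v = [1.0] * n          # arbitrary non-orthogonal start vector
    lam = 0.0
    for _ in range(iters):
        w = [sum(W[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]   # renormalise each step
        lam = norm                  # converges to the top eigenvalue
    return lam, v
```

For W = [3, 4]ᵀ[3, 4] = [[9, 12], [12, 16]], the iteration converges to eigenvalue 25 with unit eigenvector (0.6, 0.8), i.e. the original vector up to scale. When the SDR solution has rank greater than one, Gaussian randomization is the usual fallback for extracting a feasible beamformer.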

    A probabilistic algorithm approximating solutions of a singular PDE of porous media type

    The object of this paper is a one-dimensional generalized porous media equation (PDE) with possibly discontinuous coefficient β, which is well-posed as an evolution problem in L^1(ℝ). In some recent papers of Blanchard et al. and Barbu et al., the solution was represented by the solution of a non-linear stochastic differential equation in law, provided the initial condition is a bounded integrable function. We first extend this result, at least when β is continuous and the initial condition is only integrable, under a supplementary technical assumption. The main purpose of the article is to introduce and implement a stochastic particle algorithm that approximates the solution of (PDE), which also applies when β is possibly irregular, to predict some of the long-time behavior of the solution, and to compare with some recent deterministic numerical techniques.
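The particle idea behind such schemes can be illustrated in a heavily simplified setting: for β(u) = u the generalized porous media equation reduces to the linear heat equation ∂ₜu = ∂ₓₓu, whose solution from a point source is the Gaussian density N(0, 2t), and independent Brownian particles then give an empirical approximation of u. The sketch below shows only this degenerate linear case; the paper's scheme handles nonlinear, possibly irregular β and interacting particles, which is substantially more involved.

```python
import math
import random

# Toy particle approximation of the heat equation du/dt = d^2u/dx^2
# started from a Dirac mass at 0: simulate n independent particles
# dX = sqrt(2) dB by Euler-Maruyama and estimate the density at x = 0
# from the fraction of particles landing in a small histogram bin.
# All parameter values here are illustrative choices.

def particle_density(n=50_000, t=1.0, steps=20, seed=0):
    rng = random.Random(seed)
    dt = t / steps
    width = 0.2          # histogram bin [-width/2, width/2)
    hits = 0
    for _ in range(n):
        x = 0.0
        for _ in range(steps):
            x += math.sqrt(2.0 * dt) * rng.gauss(0.0, 1.0)
        if -width / 2 <= x < width / 2:
            hits += 1
    return hits / (n * width)   # empirical density estimate at x = 0
```

At t = 1 the exact density at the origin is 1/sqrt(4π) ≈ 0.282, and the particle estimate lands within a few percent of it, which is the sense in which the empirical measure of the particle system approaches the PDE solution.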

    How to make unforgeable money in generalised probabilistic theories

    We discuss the possibility of creating money that is physically impossible to counterfeit. Of course, "physically impossible" depends on which theory is a faithful description of nature. Currently there are several proposals for quantum money whose security rests on the validity of quantum mechanics. In this work, we examine Wiesner's money scheme in the framework of generalised probabilistic theories. This framework is broad enough to allow for essentially any potential theory of nature, provided that it admits an operational description. We prove that, under a quantifiable version of the no-cloning theorem, one can create physical money which has an exponentially small chance of being counterfeited. Our proof relies on cone programming, a natural generalisation of semidefinite programming. Moreover, we discuss some of the difficulties that arise when considering non-quantum theories. Comment: 27 pages, many diagrams. Comments welcome