
    Soft Guessing Under Log-Loss Distortion Allowing Errors

    This paper deals with the problem of soft guessing under log-loss distortion (logarithmic loss) that was recently investigated by [Wu and Joudeh, IEEE ISIT, pp. 466--471, 2023]. We extend this problem to soft guessing allowing errors: at each step, the guesser decides with some probability whether to stop guessing, and if it stops, it declares an error. We show that the minimal expected cost of guessing under a constraint on the error probability is characterized by the smooth Rényi entropy. Furthermore, we carry out an asymptotic analysis for a stationary and memoryless source.
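
    To make the characterizing quantity concrete, here is a minimal sketch of one standard construction of smooth Rényi entropy: trim up to ϵ of probability mass from the least likely symbols (intuitively, the outcomes on which the guesser would declare an error) before evaluating the Rényi sum. The exact smoothing and the Rényi order tied to the guessing cost in the paper may differ; the function names are our own.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha != 1), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def smooth_renyi_entropy(p, alpha, eps):
    """Trim up to eps of mass from the least likely symbols, then
    evaluate the Rényi entropy of the resulting sub-probability
    vector. For alpha < 1 this can only lower the value."""
    q = np.sort(np.asarray(p, dtype=float))[::-1]
    budget = eps
    for i in range(len(q) - 1, -1, -1):
        cut = min(q[i], budget)
        q[i] -= cut
        budget -= cut
        if budget <= 0.0:
            break
    q = q[q > 0]
    return np.log2(np.sum(q ** alpha)) / (1.0 - alpha)

# Allowing 10% error probability strictly reduces the guessing exponent.
p = [0.4, 0.3, 0.2, 0.05, 0.03, 0.02]
rho = 1.0                    # order of the guessing-cost moment
alpha = 1.0 / (1.0 + rho)    # Rényi order associated with rho in Arikan-type bounds
print(renyi_entropy(p, alpha), smooth_renyi_entropy(p, alpha, 0.1))
```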

    Hypergraph p-Laplacian: A Differential Geometry View

    The graph Laplacian plays key roles in information processing of relational data and has analogies with the Laplacian in differential geometry. In this paper, we generalize the analogy between the graph Laplacian and differential geometry to the hypergraph setting, and propose a novel hypergraph p-Laplacian. Unlike existing graph Laplacians, which are restricted to edges connecting two nodes, this generalization makes it possible to analyze hypergraphs, where edges are allowed to connect any number of nodes. Moreover, we propose a semi-supervised learning method based on the proposed hypergraph p-Laplacian and formalize it as an analogue of the Dirichlet problem, which often appears in physics. We further explore theoretical connections to the normalized hypergraph cut, and propose a normalized cut corresponding to the hypergraph p-Laplacian. The proposed p-Laplacian is shown to outperform standard hypergraph Laplacians in experiments on hypergraph semi-supervised learning and normalized cut settings.
    Comment: Extended version of our AAAI-18 paper.
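
    For reference, a minimal sketch of the standard (p = 2) normalized hypergraph Laplacian of Zhou et al., one of the baselines that the proposed p-Laplacian generalizes, together with the quadratic semi-supervised problem it induces (a soft-constrained analogue of the Dirichlet problem). Function names and the regularization form are our own illustration, not the paper's.

```python
import numpy as np

def hypergraph_laplacian(H, w=None):
    """Normalized hypergraph Laplacian of Zhou et al. (the p = 2 case).
    H: |V| x |E| incidence matrix; w: hyperedge weights."""
    n, m = H.shape
    if w is None:
        w = np.ones(m)
    dv = H @ w                        # weighted vertex degrees
    de = H.sum(axis=0)                # hyperedge sizes
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
    Theta = Dv_inv_sqrt @ H @ np.diag(w / de) @ H.T @ Dv_inv_sqrt
    return np.eye(n) - Theta

def ssl_labels(H, y, lam=1.0):
    """Semi-supervised labeling: minimize f' L f + lam * ||f - y||^2,
    i.e. solve the linear system (L + lam I) f = lam y."""
    L = hypergraph_laplacian(H)
    return np.linalg.solve(L + lam * np.eye(L.shape[0]), lam * y)

# Three nodes, two hyperedges: {0, 1, 2} and {1, 2}.
H = np.array([[1, 0],
              [1, 1],
              [1, 1]], dtype=float)
y = np.array([1.0, 0.0, -1.0])   # labeled +1, unlabeled (0), labeled -1
print(ssl_labels(H, y))
```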

    Cumulant Generating Function of Codeword Lengths in Variable-Length Lossy Compression Allowing Positive Excess Distortion Probability

    This paper considers the problem of variable-length lossy source coding. The performance criteria are the excess distortion probability and the cumulant generating function of codeword lengths. We derive a non-asymptotic fundamental limit of the cumulant generating function of codeword lengths allowing positive excess distortion probability. It is shown that the achievability and converse bounds are characterized by a Rényi-entropy-based quantity. In the proof of the achievability result, an explicit code construction is provided. Further, we investigate an asymptotic single-letter characterization of the fundamental limit for a stationary memoryless source.
    Comment: arXiv admin note: text overlap with arXiv:1701.0180
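
    For intuition about why a Rényi-type quantity appears, recall Campbell's classical lossless, zero-error counterpart: the minimal normalized cumulant generating function (1/t) log2 E[2^{tL}] of codeword lengths approaches the Rényi entropy of order 1/(1+t). The paper's lossy, positive-error-probability limit replaces this with a distortion-dependent analogue. A minimal numerical sketch of the lossless benchmark:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha != 1), in bits."""
    p = p[p > 0]
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def cgf_of_lengths(p, lengths, t):
    """Normalized cumulant generating function (1/t) log2 E[2^{t L}]."""
    return np.log2(np.sum(p * np.exp2(t * lengths))) / t

# Campbell's benchmark: lengths l(x) = ceil(-log2 q(x)), with q the
# tilted distribution of order 1/(1+t), satisfy Kraft's inequality and
# bring the CGF within 1 bit of the Rényi entropy H_{1/(1+t)}(P).
p = np.array([0.5, 0.25, 0.125, 0.125])
t = 1.0
alpha = 1.0 / (1.0 + t)
q = p ** alpha / np.sum(p ** alpha)
lengths = np.ceil(-np.log2(q))
print(cgf_of_lengths(p, lengths, t), renyi_entropy(p, alpha))
```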

    Variable-Length Intrinsic Randomness Allowing Positive Value of the Average Variational Distance

    This paper considers the problem of variable-length intrinsic randomness. We propose the average variational distance as the performance criterion from the viewpoint of a dual relationship with the problem formulation of variable-length resolvability. A previous study derived the general formula of the ϵ-variable-length resolvability. We derive the general formula of the ϵ-variable-length intrinsic randomness; namely, we characterize the supremum of the mean length under the constraint that the average variational distance is smaller than or equal to a constant ϵ. Our result clarifies a dual relationship between the general formula of ϵ-variable-length resolvability and that of ϵ-variable-length intrinsic randomness. We also derive a lower bound on the quantity characterizing our general formula.
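
    To make the performance criterion concrete, here is one common way to formalize a variational distance for variable-length outputs: compare the joint law of (length, string) against a target that keeps the same length distribution but is uniform at each length. The paper's exact definition of the average variational distance may differ, and the source-to-output map in the example is hypothetical.

```python
from collections import defaultdict

def average_variational_distance(outputs, probs):
    """Total variation between the law of the variable-length output
    and the 'same length law, uniform at each length' target."""
    mass = defaultdict(lambda: defaultdict(float))  # length -> string -> prob
    len_prob = defaultdict(float)                   # length -> total prob
    for s, p in zip(outputs, probs):
        mass[len(s)][s] += p
        len_prob[len(s)] += p
    d = 0.0
    for m, dist in mass.items():
        target = len_prob[m] / (2 ** m)   # target mass of each length-m string
        produced = sum(abs(v - target) for v in dist.values())
        missing = (2 ** m - len(dist)) * target   # strings never produced
        d += 0.5 * (produced + missing)
    return d

# Toy example: a 2-bit source mapped to variable-length binary outputs
# via the hypothetical map 00->'0', 01->'1', 10->'00', 11->'01'.
outputs = ['0', '1', '00', '01']
probs = [0.36, 0.24, 0.24, 0.16]
print(average_variational_distance(outputs, probs))  # 0.26
```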

    Marginal Probability-Based Integer Handling for CMA-ES Tackling Single- and Multi-Objective Mixed-Integer Black-Box Optimization

    This study targets the mixed-integer black-box optimization (MI-BBO) problem, where continuous and integer variables must be optimized simultaneously. The CMA-ES, our focus in this study, is a population-based stochastic search method that samples solution candidates from a multivariate Gaussian distribution (MGD) and shows excellent performance in continuous BBO. The parameters of the MGD, its mean and (co)variance, are updated based on the evaluation values of the candidate solutions. If the CMA-ES is applied to MI-BBO with straightforward discretization, however, the variance corresponding to the integer variables becomes much smaller than the granularity of the discretization before the optimal solution is reached, which leads to stagnation of the optimization. In particular, when binary variables are included in the problem, this stagnation is more likely to occur because the granularity of the discretization becomes wider, and existing integer handling for the CMA-ES does not address it. To overcome these limitations, we propose a simple integer handling for the CMA-ES based on lower-bounding the marginal probabilities associated with the generation of integer variables in the MGD. Numerical experiments on MI-BBO benchmark problems demonstrate the efficiency and robustness of the proposed method. Furthermore, to demonstrate the generality of the idea, we incorporate it into the multi-objective CMA-ES, in addition to the single-objective case, and verify its performance on bi-objective mixed-integer benchmark problems.
    Comment: Camera-ready version for ACM Transactions on Evolutionary Learning and Optimization (TELO). This paper is an extended version of the work presented in arXiv:2205.1348
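
    The key mechanism (keeping enough marginal probability that a sampled integer coordinate leaves its current discretization plateau) can be sketched as follows. This is a simplified reading of the idea, assuming rounding-based discretization and per-coordinate marginal standard deviations; the function name and the exact correction rule are our own, not the paper's update equations.

```python
import numpy as np
from statistics import NormalDist

def lower_bound_integer_marginals(mean, sigmas, int_idx, p_min=0.1):
    """Shift the sampling mean so every integer coordinate keeps at
    least p_min marginal probability of crossing its nearest rounding
    threshold (a half-integer), so discretization cannot freeze the
    search. sigmas: per-coordinate marginal standard deviations."""
    m = np.array(mean, dtype=float)
    z = abs(NormalDist().inv_cdf(p_min))        # standard-normal quantile
    for i in int_idx:
        r = np.round(m[i])
        t = r + 0.5 if m[i] >= r else r - 0.5   # nearest rounding threshold
        d_max = sigmas[i] * z    # largest |m - t| keeping P(cross) >= p_min
        if abs(m[i] - t) > d_max:
            m[i] = t + np.sign(m[i] - t) * d_max
    return m

# sigma = 0.05 is far below the rounding granularity, so the integer
# coordinate's mean is pulled toward the 2.5 threshold (to ~2.436,
# leaving exactly 10% probability of sampling a value that rounds to 3).
print(lower_bound_integer_marginals([2.1, 0.3], [0.05, 0.05], int_idx=[0]))
```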