287 research outputs found

    Covering problems in edge- and node-weighted graphs

    This paper discusses the graph covering problem in which a set of edges in an edge- and node-weighted graph is chosen so as to satisfy some covering constraints while minimizing the sum of the weights. Because a natural linear programming (LP) relaxation of this problem has a large integrality gap, LP rounding algorithms based on that relaxation perform poorly. Here we propose a stronger LP relaxation for the graph covering problem and apply it to the design of primal-dual algorithms for two fundamental graph covering problems: the prize-collecting edge dominating set problem and the multicut problem in trees. We obtain an exact polynomial-time algorithm for the former and a 2-approximation algorithm for the latter; these results match the best results currently known for purely edge-weighted graphs. Comment: To appear in SWAT 201
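
    For orientation only, a natural LP relaxation of such an edge- and node-weighted covering problem can be written in the generic form below; this is an illustrative sketch under assumed notation (x_e indicates that edge e is chosen, y_v that the weight of node v is paid), not the strengthened relaxation proposed in the paper.

        \begin{aligned}
        \text{minimize}\quad   & \sum_{e \in E} w_e\, x_e + \sum_{v \in V} w_v\, y_v \\
        \text{subject to}\quad & \sum_{e \in \delta(S)} x_e \ge 1 && \text{for every covering constraint } S,\\
                               & x_e \le y_v && \text{for every edge } e \text{ and endpoint } v \text{ of } e,\\
                               & 0 \le x_e,\, y_v \le 1 .
        \end{aligned}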

    Dependence Logic with Generalized Quantifiers: Axiomatizations

    We prove two completeness results. The first concerns the extension of dependence logic by a monotone generalized quantifier Q with a weak interpretation, weak in the sense that the interpretation of Q may vary from structure to structure. The second concerns the extension of dependence logic in which Q is interpreted as "there exist uncountably many." Both axiomatizations are shown to be sound and complete for FO(Q) consequences. Comment: 17 pages
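
    For reference, the classical Lindström reading of a generalized quantifier Q, and of the "there exist uncountably many" quantifier Q_1, over a structure M is given below; the paper works with team semantics for dependence logic, which refines these clauses, so they serve only as orientation.

        M \models Q\,x\, \varphi(x) \iff \{\, a \in M : M \models \varphi(a) \,\} \in Q^{M},
        \qquad
        M \models Q_1\,x\, \varphi(x) \iff \bigl|\{\, a \in M : M \models \varphi(a) \,\}\bigr| \ge \aleph_1 .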

    A Kriging procedure for processes indexed by graphs

    We provide a new kriging procedure for processes indexed by graphs. Based on the construction of Gaussian random processes indexed by graphs, we extend to this framework the usual linear prediction method for spatial random fields, known as kriging. We give the expression of the estimator of such a random field at unobserved locations, as well as a control of the prediction error.
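
    A minimal sketch of the generic simple-kriging predictor referred to here, assuming a covariance function between graph vertices is already available (constructing such graph-indexed covariances is the subject of the paper); the function and variable names below are illustrative.

        import numpy as np

        def simple_kriging(cov, observed, values, target, mean=0.0):
            # cov(u, v): assumed covariance function between two graph vertices.
            # observed:  list of vertices where the field was observed.
            # values:    observed field values (same order as `observed`).
            # target:    vertex at which to predict.
            K = np.array([[cov(u, v) for v in observed] for u in observed])  # covariances among observations
            k = np.array([cov(u, target) for u in observed])                 # covariances to the target vertex
            weights = np.linalg.solve(K, k)                                  # simple-kriging weights
            prediction = mean + weights @ (np.asarray(values, dtype=float) - mean)
            variance = cov(target, target) - weights @ k                     # control of the prediction error
            return prediction, variance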

    A New Method to Estimate the Noise in Financial Correlation Matrices

    Financial correlation matrices measure the unsystematic correlations between stocks. Such information is important for risk management. Correlation matrices are known to be "noise dressed". We develop a new and alternative method to estimate this noise. To this end, we simulate certain time series and random matrices which can model financial correlations. With our approach, different correlation structures buried under this noise can be detected. Moreover, we introduce a measure for the relation between noise and correlations. Our method is based on a power mapping which efficiently suppresses the noise; neither further data processing nor additional input is needed. Comment: 25 pages, 8 figures
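
    A minimal sketch of an entrywise power mapping of the kind described, applied to an empirical correlation matrix; the exponent q is a tuning parameter and the function name is illustrative.

        import numpy as np

        def power_map(C, q=1.5):
            # Entrywise power mapping: c_ij -> sign(c_ij) * |c_ij|**q.
            # Small (noise-dominated) entries are suppressed much more strongly
            # than large ones, while the sign structure of the matrix is preserved.
            C = np.asarray(C, dtype=float)
            return np.sign(C) * np.abs(C) ** q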

    Complexity of Discrete Energy Minimization Problems

    Discrete energy minimization is widely used in computer vision and machine learning, for example for MAP inference in graphical models. The problem is, in general, notoriously intractable, and finding a globally optimal solution is known to be NP-hard. Is it nevertheless possible to approximate the problem in polynomial time with a reasonable bound on solution quality? We show in this paper that the answer is no. Specifically, general energy minimization, even in the 2-label pairwise case, and planar energy minimization with three or more labels are exp-APX-complete. This rules out any approximation algorithm whose approximation ratio is sub-exponential in the input size for these two problems, including constant-factor approximations. Moreover, we collect and review the computational complexity of several subclass problems and arrange them on a complexity scale consisting of three major complexity classes -- PO, APX, and exp-APX -- corresponding to problems that are solvable, approximable, and inapproximable in polynomial time. Problems in the first two classes can serve as tractable alternative formulations to the inapproximable ones. This paper can help vision researchers select an appropriate model for an application and can guide them in designing new algorithms. Comment: ECCV'16 accepted
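
    For reference, the pairwise discrete energy these complexity results concern has the standard form below, with labels x_i drawn from a finite set L, unary terms on the nodes and pairwise terms on the edges of a graph (V, E).

        E(\mathbf{x}) \;=\; \sum_{i \in V} \theta_i(x_i) \;+\; \sum_{(i,j) \in E} \theta_{ij}(x_i, x_j), \qquad x_i \in L .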

    A Memetic Analysis of a Phrase by Beethoven: Calvinian Perspectives on Similarity and Lexicon-Abstraction

    This article discusses some general issues arising from the study of similarity in music, both human-conducted and computer-aided, and then progresses to a consideration of similarity relationships between patterns in a phrase by Beethoven, from the first movement of the Piano Sonata in A flat major op. 110 (1821), and various potential memetic precursors. This analysis is followed by a consideration of how the kinds of similarity identified in the Beethoven phrase might be understood in psychological/conceptual and then neurobiological terms, the latter by means of William Calvin’s Hexagonal Cloning Theory. This theory offers a mechanism for the operation of David Cope’s concept of the lexicon, conceived here as a museme allele-class. I conclude by attempting to correlate and map the various spaces within which memetic replication occurs.

    Dynamic modeling of mean-reverting spreads for statistical arbitrage

    Statistical arbitrage strategies, such as pairs trading and its generalizations, rely on the construction of mean-reverting spreads that enjoy a certain degree of predictability. Gaussian linear state-space processes have recently been proposed as a model for such spreads, under the assumption that the observed process is a noisy realization of some hidden states. Real-time estimation of the unobserved spread process can reveal temporary market inefficiencies, which can then be exploited to generate excess returns. Building on previous work, we embrace the state-space framework for modeling spread processes and extend this methodology along three directions. First, we introduce time-dependency in the model parameters, which allows for quick adaptation to changes in the data-generating process. Second, we provide an on-line estimation algorithm that can be run continuously in real time. Being computationally fast, the algorithm is particularly suitable for building aggressive trading strategies based on high-frequency data and may be used as a monitoring device for mean-reversion. Finally, our framework naturally provides informative uncertainty measures for all the estimated parameters. Experimental results based on Monte Carlo simulations and historical equity data are discussed, including a co-integration relationship involving two exchange-traded funds. Comment: 34 pages, 6 figures. Submitted
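
    A minimal sketch, assuming the simplest Gaussian linear state-space model of the kind described: a hidden mean-reverting (AR(1)) spread observed with noise, filtered by a scalar Kalman recursion. This is a generic illustration, not the paper's time-varying on-line estimator, and all parameter names are illustrative. The filtered mean and variance can be monitored in real time; a large standardized gap between the observed and filtered spread flags the kind of temporary inefficiency the strategy exploits.

        import numpy as np

        def kalman_spread_filter(y, phi, mu, q, r, m0=0.0, p0=1.0):
            # State:       s_t = mu + phi * (s_{t-1} - mu) + w_t,  w_t ~ N(0, q)
            # Observation: y_t = s_t + v_t,                        v_t ~ N(0, r)
            # Returns filtered means and variances of the hidden spread.
            m, p = m0, p0
            means, variances = [], []
            for obs in y:
                m_pred = mu + phi * (m - mu)        # predict the hidden spread
                p_pred = phi ** 2 * p + q
                gain = p_pred / (p_pred + r)        # Kalman gain
                m = m_pred + gain * (obs - m_pred)  # update with the new observation
                p = (1.0 - gain) * p_pred
                means.append(m)
                variances.append(p)
            return np.array(means), np.array(variances)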
    • …