
    Strong consistencies for weighted constraint satisfaction problems

    This thesis focuses on strong local consistencies for solving optimization problems on cost function networks (also called weighted constraint networks). These methods provide the lower bound required by Branch-and-Bound search. We first study Virtual Arc Consistency (VAC), one of the strongest soft arc consistencies in the field, which is enforced by iteratively establishing hard arc consistency in a sequence of classical constraint networks. The iterative algorithm enforcing VAC is improved by integrating dynamic arc consistency, making it more incremental. Dynamic arc consistency also allows VAC to be maintained efficiently during search, by exploiting the changes that branching operations make to the weighted constraint network. Secondly, we are interested in stronger domain-based soft consistencies, inspired by similar consistencies in hard constraint networks (path inverse consistency, and restricted or Max-restricted path consistency). For each of these hard consistencies, several soft variants have been proposed for weighted constraint networks. The new consistencies provide a lower bound stronger than that of the soft arc consistencies by processing triplets of variables pairwise connected by binary cost functions. We study the properties of these new consistencies, implement them, and test them on a variety of problems.
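    As a rough illustration of where such lower bounds plug in (this is not the thesis's own algorithm), here is a minimal Branch-and-Bound sketch in Python over a toy cost function network. The dictionary encoding and the trivial lower_bound stub are assumptions for illustration; a soft consistency such as VAC would supply a far stronger bound at each node.

        # Toy cost function network: unary[i][v] is the cost of assigning
        # value v to variable i; binary[(i, j)][(u, v)] is a binary cost.
        def lower_bound(assignment, unary, binary):
            # Trivial bound: cost of the current partial assignment only.
            # Enforcing a soft consistency (e.g. VAC) here would move cost
            # into a global constant and prune far more aggressively.
            cost = sum(unary[i][v] for i, v in assignment.items())
            for (i, j), table in binary.items():
                if i in assignment and j in assignment:
                    cost += table[(assignment[i], assignment[j])]
            return cost

        def branch_and_bound(domains, unary, binary, assignment=None, best=float("inf")):
            assignment = {} if assignment is None else assignment
            lb = lower_bound(assignment, unary, binary)
            if lb >= best:
                return best          # prune: the bound reaches the incumbent
            if len(assignment) == len(domains):
                return lb            # complete assignment: lb is its exact cost
            i = next(x for x in domains if x not in assignment)
            for v in domains[i]:     # branching operation
                assignment[i] = v
                best = min(best, branch_and_bound(domains, unary, binary, assignment, best))
                del assignment[i]
            return best

        domains = {0: [0, 1], 1: [0, 1]}
        unary = {0: [0, 1], 1: [1, 0]}
        binary = {(0, 1): {(0, 0): 2, (0, 1): 0, (1, 0): 0, (1, 1): 2}}
        print(branch_and_bound(domains, unary, binary))  # optimal cost 0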

    Sparse Recovery via Differential Inclusions

    In this paper, we recover sparse signals from their noisy linear measurements by solving nonlinear differential inclusions, an approach based on the notion of inverse scale space (ISS) developed in applied mathematics. Our goal here is to bring this idea to bear on a challenging problem in statistics, \emph{i.e.}, finding the oracle estimator, which is unbiased and sign-consistent, using dynamics. We call our dynamics \emph{Bregman ISS} and \emph{Linearized Bregman ISS}. A well-known shortcoming of the LASSO and other convex regularization approaches lies in the bias of their estimators. However, we show that under proper conditions there exists a bias-free and sign-consistent point on the solution paths of such dynamics, corresponding to a signal that is an unbiased estimate of the true signal and whose entries have the same signs as those of the true signal, \emph{i.e.}, the oracle estimator. Their solution paths are therefore better regularization paths than the LASSO regularization path, since points on the latter are biased by the time sign-consistency is reached. We also show how to efficiently compute the solution paths in both continuous and discretized settings: the full solution paths can be computed exactly, piece by piece, and a discretization leads to the \emph{Linearized Bregman iteration}, a simple iterative thresholding rule that is easy to parallelize. Theoretical guarantees such as sign-consistency and minimax optimal $\ell_2$-error bounds are established in both continuous and discrete settings for specific points on the paths, and early-stopping rules for identifying these points are given. The key treatment relies on the development of differential inequalities for differential inclusions and their discretizations, which extends previous results and leads to exponentially fast recovery of sparse signals before wrong ones are selected. Comment: In Applied and Computational Harmonic Analysis, 201
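    The Linearized Bregman iteration mentioned above is, in its standard form, a gradient step on the residual followed by soft-thresholding. Below is a minimal NumPy sketch of that standard form; the step sizes alpha and kappa and the fixed iteration count are illustrative assumptions, not the paper's tuned choices, and an early-stopping rule along the path would be used to select the oracle-like point.

        import numpy as np

        def shrink(z, lam=1.0):
            # Soft-thresholding: the proximal map of the l1 norm.
            return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

        def linearized_bregman(A, y, alpha=0.02, kappa=10.0, iters=2000):
            # Discretization of the Linearized Bregman ISS dynamics:
            #   x_k     = kappa * shrink(z_k)
            #   z_{k+1} = z_k + alpha * A^T (y - A x_k) / n
            n, p = A.shape
            z = np.zeros(p)
            path = []
            for _ in range(iters):
                x = kappa * shrink(z)
                z += alpha * A.T @ (y - A @ x) / n
                path.append(x)  # the discrete solution path
            return path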

    Geometry of the faithfulness assumption in causal inference

    Many algorithms for inferring causality rely heavily on the faithfulness assumption. The main justification for imposing this assumption is that the set of unfaithful distributions has Lebesgue measure zero, since it can be seen as a collection of hypersurfaces in a hypercube. However, due to sampling error, the faithfulness condition alone is not sufficient for statistical estimation, and strong-faithfulness has been proposed and assumed in order to achieve uniform or high-dimensional consistency. In contrast to the plain faithfulness assumption, the set of distributions that are not strong-faithful has nonzero Lebesgue measure and, in fact, can be surprisingly large, as we show in this paper. We study the strong-faithfulness condition from a geometric and combinatorial point of view and give upper and lower bounds on the Lebesgue measure of strong-faithful distributions for various classes of directed acyclic graphs. Our results imply fundamental limitations for the PC-algorithm, and potentially also for other algorithms based on partial correlation testing, in the Gaussian case. Comment: Published at http://dx.doi.org/10.1214/12-AOS1080 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
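    For intuition, lambda-strong-faithfulness in the Gaussian case asks that every partial correlation that is not exactly zero be bounded away from zero in absolute value. The sketch below enumerates this condition directly from a covariance matrix; the threshold lam, the exact-zero tolerance, and the exhaustive enumeration are illustrative assumptions, not the paper's procedure (which bounds the measure of the failing set analytically).

        import numpy as np
        from itertools import combinations

        def partial_corr(Sigma, i, j, S):
            # Partial correlation of X_i and X_j given X_S, read off the
            # precision matrix of the marginal covariance over {i, j} and S.
            idx = [i, j] + list(S)
            P = np.linalg.inv(Sigma[np.ix_(idx, idx)])
            return -P[0, 1] / np.sqrt(P[0, 0] * P[1, 1])

        def is_lambda_strong_faithful(Sigma, lam=0.1, tol=1e-12):
            # A distribution fails lambda-strong-faithfulness if some
            # partial correlation is nonzero yet no larger than lam;
            # such near-cancellations are what break partial-correlation
            # tests like those of the PC-algorithm.
            p = Sigma.shape[0]
            for i, j in combinations(range(p), 2):
                rest = [k for k in range(p) if k not in (i, j)]
                for size in range(len(rest) + 1):
                    for S in combinations(rest, size):
                        r = abs(partial_corr(Sigma, i, j, S))
                        if tol < r <= lam:
                            return False
            return True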

    On Profit-Maximizing Pricing for the Highway and Tollbooth Problems

    In the \emph{tollbooth problem}, we are given a tree $T=(V,E)$ with $n$ edges, and a set of $m$ customers, each of whom is interested in purchasing a path on the tree. Each customer has a fixed budget, and the objective is to price the edges of $T$ such that the total revenue made by selling the paths to the customers who can afford them is maximized. An important special case of this problem, known as the \emph{highway problem}, arises when $T$ is restricted to be a line. For the tollbooth problem, we present a randomized $O(\log n)$-approximation, improving on the current best $O(\log m)$-approximation. We also study a special case of the tollbooth problem in which all the paths that customers are interested in purchasing go towards a fixed root of $T$. In this case, we present an algorithm that returns a $(1-\epsilon)$-approximation for any $\epsilon > 0$ and runs in quasi-polynomial time. On the other hand, we rule out the existence of an FPTAS by showing that, even in the line case, the problem is strongly NP-hard. Finally, we show that in the \emph{coupon model}, where some items may be priced below zero to improve the overall profit, the problem becomes APX-hard.
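    To make the pricing objective concrete, here is a brute-force sketch of the highway (line) case over a hypothetical discrete price grid. The grid, the toy instance, and the exhaustive search are assumptions for illustration only; the search is exponential in the number of edges, which is exactly what the approximation algorithms above avoid (the problem is strongly NP-hard even on a line).

        from itertools import product

        def revenue(prices, customers):
            # Each customer buys their interval of edges iff its total
            # price fits their budget; a sold path earns its price.
            total = 0
            for (lo, hi), budget in customers:
                cost = sum(prices[lo:hi])  # edges lo .. hi-1 of the line
                if cost <= budget:
                    total += cost
            return total

        def best_pricing(n_edges, customers, grid=(0, 1, 2, 3)):
            # Exhaustive search over the price grid: |grid|**n_edges cases.
            return max(product(grid, repeat=n_edges),
                       key=lambda p: revenue(p, customers))

        # Two edges; one customer wants edge 0 (budget 2), another wants
        # edges 0 and 1 (budget 3).
        customers = [((0, 1), 2), ((0, 2), 3)]
        print(best_pricing(2, customers))  # (2, 1): revenue 2 + 3 = 5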