
    Inflation Targeting, Credibility and Confidence Crises

    We study the interplay between central bank transparency, credibility, and the inflation target level. Based on a model developed in the spirit of the global games literature, we argue that whenever a weak central bank adopts a high degree of transparency and a low target level, a bad, self-confirming equilibrium may arise. In this case, above-target inflation becomes more likely. The central bank is considered weak when a favorable state of nature is required for the target to be achieved. On the other hand, if a weak central bank opts for less ambitious goals, namely a lower degree of transparency and a higher target level, it may avoid confidence crises and ensure a unique equilibrium for expected inflation. Moreover, even after ruling out the possibility of confidence crises, less ambitious goals may be desirable in order to attain higher credibility and hence better coordination of expectations. Conversely, a low target level and a high degree of central bank transparency are desirable whenever the economy has strong fundamentals and the target can be fulfilled in many states of nature.

    Label Propagation for Learning with Label Proportions

    Learning with Label Proportions (LLP) is the problem of recovering the underlying true labels of a dataset when the data are presented in the form of bags. This paradigm is particularly suitable in contexts where providing individual labels is expensive and label aggregates are more easily obtained. In the healthcare domain, it is a burden for a patient to keep a detailed diary of their daily routines, but they will often be willing to provide higher-level summaries of daily behavior. We present a novel and efficient graph-based algorithm that encourages local smoothness and exploits the global structure of the data, while preserving the `mass' of each bag. Comment: Accepted to MLSP 201
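    For concreteness, the following is a minimal, self-contained sketch of graph-based label propagation with a bag-proportion correction, in the general spirit of the LLP setting described above. It is an illustrative toy (binary labels, a dense similarity matrix, bag means nudged toward the known proportions), not the algorithm of the paper; all names and the tiny example data are made up.

    // Toy sketch: label propagation over a similarity graph, with each bag's
    // mean score pulled toward its known label proportion.  NOT the paper's
    // algorithm; an illustration of the general idea only.
    import java.util.Arrays;

    public class LlpLabelPropagation {

        // Row-normalize a similarity matrix W into a transition matrix P = D^{-1} W.
        static double[][] rowNormalize(double[][] w) {
            int n = w.length;
            double[][] p = new double[n][n];
            for (int i = 0; i < n; i++) {
                double rowSum = 0.0;
                for (double v : w[i]) rowSum += v;
                for (int j = 0; j < n; j++) p[i][j] = rowSum > 0 ? w[i][j] / rowSum : 0.0;
            }
            return p;
        }

        // w: n x n symmetric similarities; bagOf: bag index of each instance;
        // bagPositive: target fraction of positive labels per bag; alpha: smoothness
        // weight; iters: number of propagation sweeps.  Returns soft scores in [0,1].
        static double[] propagate(double[][] w, int[] bagOf, double[] bagPositive,
                                  double alpha, int iters) {
            int n = w.length;
            int numBags = bagPositive.length;
            double[][] p = rowNormalize(w);
            double[] f = new double[n];
            Arrays.fill(f, 0.5);                      // uninformative start

            for (int t = 0; t < iters; t++) {
                // 1) local smoothness: move each score toward its neighbours' average
                double[] g = new double[n];
                for (int i = 0; i < n; i++) {
                    double avg = 0.0;
                    for (int j = 0; j < n; j++) avg += p[i][j] * f[j];
                    g[i] = alpha * avg + (1 - alpha) * f[i];
                }
                // 2) bag correction: shift scores in each bag so the bag mean moves
                //    toward the known label proportion (clamping keeps scores in [0,1])
                double[] bagSum = new double[numBags];
                int[] bagSize = new int[numBags];
                for (int i = 0; i < n; i++) { bagSum[bagOf[i]] += g[i]; bagSize[bagOf[i]]++; }
                for (int i = 0; i < n; i++) {
                    int b = bagOf[i];
                    double shift = bagPositive[b] - bagSum[b] / bagSize[b];
                    f[i] = Math.min(1.0, Math.max(0.0, g[i] + shift));
                }
            }
            return f;
        }

        public static void main(String[] args) {
            // Two bags of three instances; bag 0 is all positive, bag 1 all negative.
            double[][] w = {
                {0, 1, 1, 0.1, 0.1, 0.1},
                {1, 0, 1, 0.1, 0.1, 0.1},
                {1, 1, 0, 0.1, 0.1, 0.1},
                {0.1, 0.1, 0.1, 0, 1, 1},
                {0.1, 0.1, 0.1, 1, 0, 1},
                {0.1, 0.1, 0.1, 1, 1, 0},
            };
            int[] bagOf = {0, 0, 0, 1, 1, 1};
            double[] bagPositive = {1.0, 0.0};
            System.out.println(Arrays.toString(propagate(w, bagOf, bagPositive, 0.8, 20)));
        }
    }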

    Java Advanced Imaging API: A Tutorial

    This tutorial shows how the Java language and its Java Advanced Imaging (JAI) Application Program Interface (API) can be used to create applications for image representation, processing and visualization. The advantages of the Java language are its low cost, licensing independence and cross-platform portability. The advantages of the JAI API are its flexibility and its variety of image processing operators. The purpose of this tutorial is to present the basic concepts of the JAI API, including several complete and verified code samples which implement simple image processing and visualization operations. At the end of the tutorial the reader should be able to implement his/her own algorithms using the Java language and the JAI API.
    Keywords: Image processing, Algorithms, Java, Java Advanced Imaging
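    As a flavor of the API the tutorial covers, here is a minimal sketch that loads an image, applies a point operator and stores the result. The "fileload", "invert" and "filestore" operators are standard JAI operations, but the file names are placeholders and this snippet is not taken from the tutorial itself.

    // Minimal JAI usage sketch: load an image, invert it, write it back to disk.
    import java.awt.image.renderable.ParameterBlock;
    import javax.media.jai.JAI;
    import javax.media.jai.PlanarImage;

    public class JaiInvertExample {
        public static void main(String[] args) {
            // Load an image from disk (any format the JAI codecs understand).
            PlanarImage input = JAI.create("fileload", "input.png");

            // Apply the "invert" point operator through a ParameterBlock, the
            // general mechanism for passing sources and parameters to operators.
            ParameterBlock pb = new ParameterBlock();
            pb.addSource(input);
            PlanarImage inverted = JAI.create("invert", pb);

            // Store the result; the last argument selects the output format codec.
            JAI.create("filestore", inverted, "inverted.png", "PNG");

            System.out.println("Wrote inverted.png (" + inverted.getWidth()
                    + "x" + inverted.getHeight() + ")");
        }
    }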

    A tale of two Bethe ansätze

    We revisit the construction of the eigenvectors of the single- and double-row transfer matrices associated with the Zamolodchikov-Fateev model, within the algebraic Bethe ansatz method. The left and right eigenvectors are constructed using two different methods: the fusion technique and Tarasov's construction. A simple explicit relation between the eigenvectors from the two Bethe ansätze is obtained. As a consequence, we obtain the Slavnov formula for the scalar product between on-shell and off-shell Tarasov-Bethe vectors. Comment: 28 pages; v2: 30 pages, added proof of (4.40) and (5.39), minor changes to match the published version
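    For orientation, Slavnov-type scalar-product formulas generically take a determinant form; the display below is only schematic, written for a trigonometric six-vertex setting and up to convention-dependent prefactors (the paper derives the analogue for on-shell and off-shell Tarasov-Bethe vectors of the Zamolodchikov-Fateev model):

    \[
      S_n(\{\mu\},\{\lambda\})
        = \langle \psi(\{\mu\}) \,|\, \psi(\{\lambda\}) \rangle
        \;\propto\;
        \frac{\det_n T(\{\mu\},\{\lambda\})}
             {\det_n \bigl[\, 1/\sinh(\lambda_j-\mu_k) \,\bigr]},
      \qquad
      T_{jk} = \frac{\partial\, \tau(\mu_k;\{\lambda\})}{\partial \lambda_j},
    \]

    where the parameters \(\{\lambda\}\) are on-shell (they satisfy the Bethe equations), the \(\{\mu\}\) are free off-shell parameters, and \(\tau\) is the transfer-matrix eigenvalue.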

    Picasso: Las Meninas de Velázquez


    A linear programming based heuristic framework for min-max regret combinatorial optimization problems with interval costs

    This work deals with a class of problems under interval data uncertainty, namely interval robust-hard problems, composed of interval data min-max regret generalizations of classical NP-hard combinatorial problems modeled as 0-1 integer linear programming problems. These problems are more challenging than other interval data min-max regret problems, as merely computing the cost of any feasible solution requires solving an instance of an NP-hard problem. The state-of-the-art exact algorithms in the literature are based on the generation of a possibly exponential number of cuts. As each cut separation involves solving an NP-hard classical optimization problem, the size of the instances that can be solved efficiently is relatively small. To mitigate this issue, we present a modeling technique for interval robust-hard problems in the context of a heuristic framework. The heuristic obtains feasible solutions by exploiting dual information from a linear relaxation of the model associated with the classical optimization problem counterpart. Computational experiments on interval data min-max regret versions of the restricted shortest path problem and the set covering problem show that our heuristic finds optimal or near-optimal solutions and also improves the primal bounds obtained by a state-of-the-art exact algorithm and a 2-approximation procedure for interval data min-max regret problems.
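    To make the min-max regret notion concrete, the sketch below evaluates the worst-case regret of a candidate s-t path under interval arc costs: the worst scenario for a solution sets its own arcs to their upper costs and all other arcs to their lower costs, and the regret is the solution's cost minus the optimum in that scenario. A plain shortest path (Dijkstra) stands in for the NP-hard restricted shortest path subproblem of the paper, the instance data are invented, and this is not the paper's LP-based heuristic.

    // Toy evaluation of the max regret of a candidate path under interval arc costs.
    import java.util.Arrays;

    public class MinMaxRegretSketch {

        // Interval arc costs on a small directed graph given by adjacency matrices;
        // a value of -1 means "no arc".
        static double regretOfPath(int[] path, double[][] lo, double[][] hi) {
            int n = lo.length;
            // Worst-case scenario for this path: upper costs on its arcs, lower elsewhere.
            double[][] scenario = new double[n][n];
            for (int i = 0; i < n; i++) scenario[i] = lo[i].clone();
            double pathCost = 0.0;
            for (int k = 0; k + 1 < path.length; k++) {
                int u = path[k], v = path[k + 1];
                scenario[u][v] = hi[u][v];
                pathCost += hi[u][v];
            }
            // Regret = candidate cost minus the optimal cost under that scenario.
            return pathCost - dijkstra(scenario, path[0], path[path.length - 1]);
        }

        // Textbook O(n^2) Dijkstra on a dense adjacency matrix (-1 = missing arc).
        static double dijkstra(double[][] cost, int source, int target) {
            int n = cost.length;
            double[] dist = new double[n];
            boolean[] done = new boolean[n];
            Arrays.fill(dist, Double.POSITIVE_INFINITY);
            dist[source] = 0.0;
            for (int it = 0; it < n; it++) {
                int u = -1;
                for (int v = 0; v < n; v++)
                    if (!done[v] && (u == -1 || dist[v] < dist[u])) u = v;
                if (dist[u] == Double.POSITIVE_INFINITY) break;
                done[u] = true;
                for (int v = 0; v < n; v++)
                    if (cost[u][v] >= 0 && dist[u] + cost[u][v] < dist[v])
                        dist[v] = dist[u] + cost[u][v];
            }
            return dist[target];
        }

        public static void main(String[] args) {
            double[][] lo = {
                {-1,  1,  4, -1},
                {-1, -1,  1,  5},
                {-1, -1, -1,  1},
                {-1, -1, -1, -1},
            };
            double[][] hi = {
                {-1,  3,  5, -1},
                {-1, -1,  2,  8},
                {-1, -1, -1,  6},
                {-1, -1, -1, -1},
            };
            int[] candidate = {0, 1, 2, 3};   // path 0 -> 1 -> 2 -> 3
            System.out.println("max regret = " + regretOfPath(candidate, lo, hi));
        }
    }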