
    Structurally Robust Weak Continuity

    Building on earlier work, we pose the following optimization: given a sequence of finite extent, find a finite-alphabet sequence of finite extent which satisfies a hard structural (syntactic) constraint (e.g., it is piecewise constant with plateau run-length > M, or locally monotonic of a given lomo-degree), and which minimizes the sum of a per-letter fidelity measure and a first-order smoothness-complexity measure. This optimization represents the unification and outgrowth of several digital nonlinear filtering schemes, including the digital counterpart of the so-called Weak Continuity (WC) formulation of Mumford-Shah and Blake-Zisserman, the Minimum Description Length (MDL) approach of Leclerc, and previous work by the first author on so-called VORCA filtering and Digital Locally Monotonic Regression. It is shown that the proposed optimization admits an efficient Viterbi-type solution and overcomes a shortcoming of WC while preserving its unique strengths. Similarly, it overcomes a drawback of VORCA and Digital Locally Monotonic Regression while maintaining robustness to outliers.
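
    To make the Viterbi-type solution concrete, the following Python sketch handles the piecewise-constant case under illustrative assumptions: an absolute-error per-letter fidelity, a fixed penalty lam per jump standing in for the first-order smoothness-complexity term, and a minimum plateau run-length min_run. The function name and these modelling choices are not the paper's exact formulation; they only show the shape of the dynamic program. The trellis state is (symbol, run-length capped at min_run), so the work is roughly O(|A|^2 min_run N).

        def pc_viterbi(x, alphabet, lam, min_run):
            # Minimize sum_t |y_t - x_t| + lam * (number of jumps in y),
            # subject to every plateau of y having run-length >= min_run
            # (an illustrative reading of the hard structural constraint).
            N, A = len(x), len(alphabet)
            INF = float("inf")
            # trellis state: (symbol index a, run length r, r capped at min_run)
            cost = [[INF] * (min_run + 1) for _ in range(A)]
            back = [[[None] * (min_run + 1) for _ in range(A)] for _ in range(N)]
            for a in range(A):
                cost[a][1] = abs(alphabet[a] - x[0])   # start a plateau at t = 0
            for t in range(1, N):
                new = [[INF] * (min_run + 1) for _ in range(A)]
                for a in range(A):
                    for r in range(1, min_run + 1):
                        c = cost[a][r]
                        if c == INF:
                            continue
                        # extend the current plateau with the same symbol
                        r2 = min(r + 1, min_run)
                        c_stay = c + abs(alphabet[a] - x[t])
                        if c_stay < new[a][r2]:
                            new[a][r2] = c_stay
                            back[t][a][r2] = (a, r)
                        # jump to a different symbol, allowed only once the
                        # plateau has reached the minimum run length
                        if r == min_run:
                            for b in range(A):
                                if b == a:
                                    continue
                                c_jump = c + abs(alphabet[b] - x[t]) + lam
                                if c_jump < new[b][1]:
                                    new[b][1] = c_jump
                                    back[t][b][1] = (a, r)
                cost = new
            # the final plateau must also satisfy the run-length constraint
            best_a = min(range(A), key=lambda a: cost[a][min_run])
            best_cost = cost[best_a][min_run]
            a, r = best_a, min_run
            y = []
            for t in range(N - 1, -1, -1):
                y.append(alphabet[a])
                if t > 0:
                    a, r = back[t][a][r]
            return best_cost, y[::-1]

    For instance, pc_viterbi(x, alphabet=[0, 1, 2, 3], lam=1.0, min_run=3) returns the optimal cost together with a piecewise-constant sequence over {0, 1, 2, 3} whose plateaus are all at least 3 samples long.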

    Fast Digital Locally Monotonic Regression

    In [1], Restrepo and Bovik developed an elegant mathematical framework in which they studied locally monotonic regressions in R^N. The drawback is that the complexity of their algorithms is exponential in N. In this paper, we consider digital locally monotonic regressions, in which the output symbols are drawn from a finite alphabet, and, by making a connection to Viterbi decoding, provide a fast O(|A|^2 αN) algorithm that computes any such regression, where |A| is the size of the digital output alphabet, α is the lomo-degree, and N is the sample size. This is linear in N, and it renders the technique applicable in practice.
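
    The Python sketch below illustrates the Viterbi connection, assuming absolute error as the per-letter fidelity and the characterization that a finite sequence is locally monotonic of lomo-degree alpha exactly when strict changes of opposite direction are separated by a constant run of at least alpha - 1 samples; the function name and interface are illustrative, not from the paper. The trellis state is (symbol, direction of last strict change, length of the current constant run), so there are about 3|A|(alpha - 1) states and the total work is O(|A|^2 αN), matching the complexity quoted above.

        def dlmr_viterbi(x, alphabet, alpha):
            # Digital locally monotonic regression (lomo-degree alpha >= 2) by
            # Viterbi decoding; the state records the last output symbol, the
            # direction of the last strict change, and the length of the
            # current constant run, capped at alpha - 1.
            assert alpha >= 2
            N, A = len(x), len(alphabet)
            INF = float("inf")
            cap = alpha - 1
            states = [(a, d, r) for a in range(A) for d in (-1, 0, 1)
                      for r in range(1, cap + 1)]
            cost = {s: INF for s in states}
            back = [dict() for _ in range(N)]
            for a in range(A):
                cost[(a, 0, 1)] = abs(alphabet[a] - x[0])   # no strict change yet
            for t in range(1, N):
                new = {s: INF for s in states}
                for (a, d, r), c in cost.items():
                    if c == INF:
                        continue
                    for b in range(A):
                        if b == a:
                            # extend the current constant run
                            s2 = (a, d, min(r + 1, cap))
                        else:
                            new_d = 1 if alphabet[b] > alphabet[a] else -1
                            if d == -new_d and r < cap:
                                # opposite-direction change before the constant
                                # run reaches alpha - 1 samples: not allowed
                                continue
                            s2 = (b, new_d, 1)
                        c2 = c + abs(alphabet[b] - x[t])
                        if c2 < new[s2]:
                            new[s2] = c2
                            back[t][s2] = (a, d, r)
                cost = new
            s = min(cost, key=cost.get)   # local monotonicity is enforced
            best_cost = cost[s]           # entirely by the transition rule
            y = []
            for t in range(N - 1, -1, -1):
                y.append(alphabet[s[0]])
                if t > 0:
                    s = back[t][s]
            return best_cost, y[::-1]

    As a usage example, dlmr_viterbi(x, alphabet=list(range(8)), alpha=3) would return, over an 8-letter alphabet, the closest (in absolute error) sequence to x that is locally monotonic of degree 3.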