
    Weighted Mean Curvature

    In image processing tasks, spatial priors are essential for robust computations, regularization, algorithmic design and Bayesian inference. In this paper, we introduce weighted mean curvature (WMC) as a novel image prior and present an efficient computation scheme for its discretization in practical image processing applications. We first demonstrate the favorable properties of WMC, such as sampling invariance, scale invariance, and contrast invariance under a Gaussian noise model, and we show the relation of WMC to area regularization. We further propose an efficient computation scheme for discretized WMC, which is demonstrated herein to process over 33.2 gigapixels/second on a GPU. This scheme lends itself to a convolutional neural network representation. Finally, WMC is evaluated on synthetic and real images, showing that it quantitatively outperforms total variation and mean curvature.
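
    The abstract notes that the discretized WMC computation lends itself to a convolutional representation. As a rough, hedged illustration only: the sketch below computes a weighted-mean-curvature-like quantity (level-set curvature scaled by the gradient magnitude) from small convolution stencils; the paper's actual definition and discretization of WMC may differ, and the function name and kernels here are assumptions for illustration.

```python
# Hedged sketch: a convolution-based approximation of a weighted-mean-curvature-like
# quantity (gradient-magnitude-weighted level-set curvature). The WMC discretization
# proposed in the paper may differ from this.
import numpy as np
from scipy.ndimage import convolve

def weighted_mean_curvature(u, eps=1e-8):
    """Approximate |grad u| * curvature(u) with central-difference convolutions."""
    # Central-difference kernels (unit grid spacing assumed).
    kx = np.array([[-0.5, 0.0, 0.5]])           # d/dx
    ky = kx.T                                   # d/dy
    kxx = np.array([[1.0, -2.0, 1.0]])          # d^2/dx^2
    kyy = kxx.T                                 # d^2/dy^2

    ux = convolve(u, kx, mode="nearest")
    uy = convolve(u, ky, mode="nearest")
    uxx = convolve(u, kxx, mode="nearest")
    uyy = convolve(u, kyy, mode="nearest")
    uxy = convolve(ux, ky, mode="nearest")      # mixed derivative

    # Level-set curvature times |grad u| cancels one power of the gradient norm.
    num = uy**2 * uxx - 2.0 * ux * uy * uxy + ux**2 * uyy
    den = ux**2 + uy**2 + eps
    return num / den

if __name__ == "__main__":
    img = np.random.rand(64, 64)
    wmc = weighted_mean_curvature(img)
    print(wmc.shape, float(wmc.mean()))
```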

    On l^p norms of weighted mean matrices

    We study the l^p operator norms of weighted mean matrices using the approaches of Kaluza-Szegő and Redheffer. As an application, we prove a conjecture of Bennett.
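
    This entry and the next concern weighted mean matrices. For context, the block below recalls the standard definition of a weighted mean matrix and its l^p operator norm; the notation is generic and not taken from the paper itself.

```latex
% Standard definition (generic notation): given positive weights \lambda_k with
% partial sums \Lambda_n = \sum_{k=1}^{n} \lambda_k, the weighted mean matrix is
\[
  A = (a_{n,k}), \qquad
  a_{n,k} =
  \begin{cases}
    \dfrac{\lambda_k}{\Lambda_n}, & 1 \le k \le n,\\[4pt]
    0, & k > n,
  \end{cases}
  \qquad
  \|A\|_{p,p} = \sup_{\|x\|_{\ell^p} \le 1} \|Ax\|_{\ell^p}.
\]
```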

    A Note on l² Norms of Weighted Mean Matrices

    We give a proof of Cartlidge's result on the l^p operator norms of weighted mean matrices for p = 2 by interpreting the norms as eigenvalues of certain matrices.
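
    For p = 2 the operator norm of a matrix equals its largest singular value, so the norm of a weighted mean matrix can be approximated numerically by truncation. The sketch below is such a numerical illustration (not the eigenvalue argument of Cartlidge's proof); the helper name and weight choice are assumptions. For the Cesàro matrix (all weights equal to 1) the truncated norms approach the classical value 2 from Hardy's inequality.

```python
# Hedged illustration: the l^2 operator norm of a matrix equals its largest singular
# value, so the norm of a weighted mean matrix can be approximated by truncation.
# (A numerical illustration, not the eigenvalue argument used in the paper.)
import numpy as np

def weighted_mean_matrix(weights):
    """Lower-triangular matrix with entries lambda_k / Lambda_n for k <= n."""
    lam = np.asarray(weights, dtype=float)
    Lam = np.cumsum(lam)                       # partial sums Lambda_n
    n = len(lam)
    return np.tril(np.tile(lam, (n, 1))) / Lam[:, None]

if __name__ == "__main__":
    # Cesàro matrix: all weights 1; its l^2 norm is 2 by Hardy's inequality,
    # so the truncated norms should creep up toward 2 as the size grows.
    for n in (10, 100, 1000):
        A = weighted_mean_matrix(np.ones(n))
        print(n, np.linalg.norm(A, 2))
```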

    Weighted Mean Impact Analysis

    Linear regression is a popular tool that is often applied to biometric and epidemiological data. It relies on strong model assumptions that are rarely satisfied. To overcome this difficulty, Brannath and Scharpenberg (2014) proposed a new population-based interpretation of linear regression coefficients. The idea is to quantify how much the unconditional mean of the dependent variable Y can be changed by changing the distribution of the independent variable X. The maximum change is called the mean impact. They show that linear regression can be used to obtain a conservative estimator of the mean impact and other population association measures. This provides a clear interpretation of the linear regression coefficients even under misspecification of the mean structure. A disadvantage of the new association measure is its dependence on the distribution of the independent variables in the specific study population. Hence, it may be difficult to compare results between studies with differing covariate distributions. To overcome this difficulty, we develop a method to transfer the mean impact from one study population to a reference population by reweighting the observations. Accordingly, we call the resulting estimator the weighted mean impact. The weights are obtained by a simple transformation of the expectation of the covariates multiplied by the target variable. They are defined as the density of the covariates in the true population divided by the density of the covariates in a pseudopopulation. For the newly developed weighted mean impact we show desirable asymptotic properties such as consistency and asymptotic normality. Although the weights are unknown in practical applications, we first consider the case of known weights to improve the understanding of the reweighting mechanism. Subsequently, the approach is generalized to the case of unknown weights, which need to be estimated. One application of the reweighting mechanism is to address confounding. In the context of the mean impact, confounding arises if the covariates are dependent. To avoid confounding, we transform the mean impact under dependent covariates into a mean impact under independent covariates by using the weighting factor. In this example the weights are the ratio of the marginal density of one of the covariates and the conditional density. For this reason, Robins et al. (2000) proposed these weights in the context of marginal structural models. For the weighted mean impact with unknown weights we show asymptotic properties, develop bootstrap confidence intervals, and demonstrate the utility of the new method with examples and results from a simulation study.
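
    As a hedged, purely illustrative sketch of the reweighting idea described above (not the estimator developed in this work): observations from a study population are importance-weighted by a density ratio so that the covariate distribution matches a reference population, and a weighted regression slope then serves as a simple stand-in for the transferred association measure. The distributions, weights, and estimator below are assumptions for illustration.

```python
# Hedged sketch of the reweighting idea: importance-weight study observations by a
# covariate density ratio (reference density / study density), then compute a
# weighted least-squares slope as a simple stand-in association measure.
# Everything below is illustrative, not the weighted mean impact estimator itself.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Study population: X ~ N(0, 1); hypothetical reference population: X ~ N(0.5, 0.8^2).
n = 5000
x = rng.normal(0.0, 1.0, n)
y = 2.0 * x + rng.normal(0.0, 1.0, n)          # outcome with a linear signal

# Importance weights: reference density over study density, evaluated at the data.
w = stats.norm.pdf(x, loc=0.5, scale=0.8) / stats.norm.pdf(x, loc=0.0, scale=1.0)
w /= w.mean()                                   # normalize for numerical stability

# Weighted least-squares slope of Y on X in the reweighted sample.
x_bar = np.average(x, weights=w)
y_bar = np.average(y, weights=w)
slope = (np.average((x - x_bar) * (y - y_bar), weights=w)
         / np.average((x - x_bar) ** 2, weights=w))
print("weighted slope:", slope)
```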

    Minimizing weighted mean absolute deviation of job completion times from their weighted mean

    We address a single-machine scheduling problem where the objective is to minimize the weighted mean absolute deviation of job completion times from their weighted mean. This problem and its precursors aim to achieve the maximum admissible level of service equity. It has been shown earlier that the unweighted version of this problem is NP-hard in the ordinary sense. For that version, a pseudo-polynomial time dynamic program and a 2-approximate algorithm are available. However, not much (except for an important solution property) exists for the weighted version. In this paper, we establish the relationship between the optimal solution to the weighted problem and a related one in which the deviations are measured from the weighted median (rather than the mean) of the job completion times; this generalizes the 2-approximation result mentioned above. We proceed to give a pseudo-polynomial time dynamic program, establishing the ordinary NP-hardness of the problem in general. We then present a fully polynomial-time approximation scheme as well. Finally, we report the findings from a limited computational study on the heuristic solution of the general problem. Our results specialize easily to the unweighted case; they also lead to an approximation of the set of schedules that are efficient with respect to both the weighted mean absolute deviation and the weighted mean completion time.
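
    As a hedged illustration of the objective only (not the dynamic program or the FPTAS from the paper), the sketch below evaluates one natural reading of the criterion: for a fixed job sequence on a single machine, the weighted mean absolute deviation of the completion times from their weighted mean. Function and variable names are assumptions.

```python
# Hedged sketch of the objective only: for a given job sequence on a single machine,
# compute the weighted mean absolute deviation of completion times from their
# weighted mean. A solver would search over sequences to minimize this value.
from itertools import accumulate

def weighted_mad_of_completions(processing_times, weights):
    """Weighted MAD of completion times from the weighted mean completion time."""
    completions = list(accumulate(processing_times))   # C_j under the given order
    total_w = sum(weights)
    mean_c = sum(w * c for w, c in zip(weights, completions)) / total_w
    return sum(w * abs(c - mean_c) for w, c in zip(weights, completions)) / total_w

if __name__ == "__main__":
    # Three jobs in a fixed order (processing times p, weights w).
    p = [3, 1, 4]
    w = [2, 1, 1]
    print(weighted_mad_of_completions(p, w))
```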