19 research outputs found

    An Examplar Based Video Inpainting using Dictionary Based Method

    Inpainting is a technique for reconstructing lost or selected parts of an image from related or available information. Reconstruction of missing parts in videos is now widely used. This system introduces a method for video inpainting based on exemplar-based inpainting. Exemplar-based inpainting samples and copies best-matching texture patches using texture synthesis; matching patches are extracted from the known parts of the video frames. Input frames are extracted and inpainted using the exemplar-based method, for which a dictionary of legal patches is maintained. The input picture is inpainted several times with different parameters; the results are then combined and details recovered to obtain the final inpainted video.
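    The core step the abstract describes, finding the best-matching "legal" (fully known) patch for a partially missing target patch, can be sketched as an SSD search over known pixels. This is a minimal illustration of the general exemplar-matching idea, not the paper's implementation; the function name, grayscale input, and brute-force scan are all assumptions.

    ```python
    import numpy as np

    def best_matching_patch(img, known, ty, tx, p=3):
        """Return the top-left (y, x) of the fully known p-by-p patch that best
        matches the partially known target patch at (ty, tx), scored by sum of
        squared differences over the target's known pixels only.
        Hypothetical sketch: brute-force scan, grayscale float image `img`,
        boolean mask `known` marking valid pixels."""
        h, w = img.shape
        t_img = img[ty:ty + p, tx:tx + p]
        t_known = known[ty:ty + p, tx:tx + p]
        best, best_yx = np.inf, None
        for y in range(h - p + 1):
            for x in range(w - p + 1):
                # the "dictionary" admits only legal patches: fully known ones
                if not known[y:y + p, x:x + p].all():
                    continue
                d = (((img[y:y + p, x:x + p] - t_img) ** 2) * t_known).sum()
                if d < best:
                    best, best_yx = d, (y, x)
        return best_yx
    ```

    A real system would restrict the search window and copy the winning patch's pixels into the hole, iterating along the fill front until every frame is complete.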

    Unifying Color and Texture Transfer for Predictive Appearance Manipulation

    Recent color transfer methods use local information to learn the transformation from a source to an exemplar image, and then transfer this appearance change to a target image. These solutions achieve very successful results for general mood changes, e.g., changing the appearance of an image from "sunny" to "overcast". However, such methods have a hard time creating new image content, such as leaves on a bare tree. Texture transfer, on the other hand, can synthesize such content but tends to destroy image structure. We propose the first algorithm that unifies color and texture transfer, outperforming both by leveraging their respective strengths. A key novelty in our approach resides in teasing apart appearance changes that can be modeled simply as changes in color versus those that require new image content to be generated. Our method starts with an analysis phase which evaluates the success of color transfer by comparing the exemplar with the source. This analysis then drives a selective, iterative texture transfer algorithm that simultaneously predicts the success of color transfer on the target and synthesizes new content where needed. We demonstrate our unified algorithm by transferring large temporal changes between photographs, such as change of season -- e.g., leaves on bare trees or piles of snow on a street -- and flooding.
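    The analysis phase above judges how well a color transfer reproduces the exemplar. A classical global baseline that such methods build on (and that the paper's richer, local models go beyond) is per-channel mean/variance matching in the style of Reinhard et al. A minimal sketch, assuming float RGB arrays; this is not the paper's algorithm:

    ```python
    import numpy as np

    def color_transfer(source, exemplar):
        """Global color transfer baseline: shift and scale each channel of
        `source` so its mean and standard deviation match `exemplar`'s.
        Assumes float arrays of shape (H, W, C)."""
        out = np.empty_like(source, dtype=float)
        for c in range(source.shape[-1]):
            s, e = source[..., c], exemplar[..., c]
            s_std = s.std() or 1.0  # guard against flat (zero-variance) channels
            out[..., c] = (s - s.mean()) / s_std * e.std() + e.mean()
        return out
    ```

    Comparing `color_transfer(source, exemplar)` against the exemplar itself is one simple way to flag regions where a pure color change cannot explain the appearance difference, which is where texture synthesis would have to generate new content.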