742 research outputs found

    Online Video Stream Abstraction and Stylization

    Perceptually Inspired Real-time Artistic Style Transfer for Video Stream

    This study presents a real-time texture-transfer method for artistic stylization of video streams. We propose a parallel framework using a T-shaped kernel to enhance computational performance. To accelerate the motion estimation required for maintaining temporal coherence, we present a method that uses a downscaled motion field, achieving high real-time performance for texture transfer on video streams. In addition, to enhance artistic quality, we compute a level of abstraction from visual saliency and integrate it into the texture-transfer algorithm, so our method can stylize video with perceptual enhancements.
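    The downscaled-motion-field idea above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the simple exhaustive block matcher, the average-pool downscaler, and all names are assumptions; the point is only that matching on reduced-resolution frames and rescaling the vectors is much cheaper than full-resolution matching.

```python
def downscale(frame, factor):
    """Average-pool a 2D grayscale frame (list of lists) by `factor`."""
    h, w = len(frame), len(frame[0])
    return [[sum(frame[y * factor + dy][x * factor + dx]
                 for dy in range(factor) for dx in range(factor)) / factor ** 2
             for x in range(w // factor)]
            for y in range(h // factor)]

def block_match(prev, curr, radius=1):
    """Return the (dy, dx) shift minimizing squared error between frames."""
    h, w = len(prev), len(prev[0])
    best, best_cost = (0, 0), float("inf")
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            cost = sum((prev[y][x] - curr[y + dy][x + dx]) ** 2
                       for y in range(max(0, -dy), min(h, h - dy))
                       for x in range(max(0, -dx), min(w, w - dx)))
            if cost < best_cost:
                best, best_cost = (dy, dx), cost
    return best

def estimate_motion(prev, curr, factor=2):
    """Match on downscaled copies, then rescale the motion vector."""
    dy, dx = block_match(downscale(prev, factor), downscale(curr, factor))
    return dy * factor, dx * factor
```

    With a downscale factor of f, the matcher visits f^2 times fewer pixels per comparison and can use a search radius f times smaller, which is where the real-time budget comes from.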

    Image abstraction painting of flow-like stylization

    This paper presents a non-photorealistic rendering technique for producing flow-like abstract stylization from a photograph. Based on anisotropic Kuwahara filtering in conjunction with line integral convolution, our method abstracts shapes and colors simultaneously while preserving image features. In particular, we develop an edge detection and dilation method to draw attention to salient features and image boundaries. The proposed algorithm is incremental and iterative, so the degree of flow and abstraction can be controlled. Experimental results demonstrate that our method is effective at producing coherent, flow-like abstract stylizations while preserving the features and directions of the source photographs.
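    To make the filtering step concrete, here is a sketch of the classic (isotropic) Kuwahara filter, the simpler ancestor of the anisotropic variant the paper builds on; this simplified version is an assumption for illustration, not the paper's filter. For each pixel it examines the four quadrant windows around it and outputs the mean of the least-variant one, which smooths flat regions while keeping edges sharp.

```python
def kuwahara(img, r=1):
    """Classic Kuwahara filter on a 2D grayscale image (list of lists).
    Each output pixel is the mean of whichever of the four (r+1)x(r+1)
    quadrant windows around it has the lowest variance."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            best_var, best_mean = float("inf"), float(img[y][x])
            # Quadrant origins: up-left, up-right, down-left, down-right.
            for sy, sx in ((-r, -r), (-r, 0), (0, -r), (0, 0)):
                vals = [img[yy][xx]
                        for yy in range(y + sy, y + sy + r + 1)
                        for xx in range(x + sx, x + sx + r + 1)
                        if 0 <= yy < h and 0 <= xx < w]
                mean = sum(vals) / len(vals)
                var = sum((v - mean) ** 2 for v in vals) / len(vals)
                if var < best_var:
                    best_var, best_mean = var, mean
            out[y][x] = best_mean
    return out
```

    On a hard step edge, every pixel finds at least one quadrant lying entirely on its own side of the edge, so the edge survives filtering intact; the anisotropic variant additionally elongates the windows along the local structure tensor to follow curved features.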

    Painterly rendering techniques: A state-of-the-art review of current approaches

    In this publication we look at the different methods presented over the past few decades that attempt to recreate digital paintings. While previous surveys concentrate on the broader subject of non-photorealistic rendering, the focus of this paper is firmly placed on painterly rendering techniques. We compare methods used to produce different output painting styles such as abstract, colour pencil, watercolour, oriental, oil and pastel. Whereas some methods demand a high level of interaction from a skilled artist, others require simple parameters provided by a user with little or no artistic experience. Many methods attempt to provide more automation through varying forms of reference data, which can range from still photographs and video to 3D polygonal meshes and even 3D point clouds. The techniques presented here endeavour to provide tools and styles that are not traditionally available to an artist. Copyright © 2012 John Wiley & Sons, Ltd.

    A Temporally Coherent Neural Algorithm for Artistic Style Transfer

    Within the fields of visual effects and animation, humans have historically spent countless painstaking hours mastering the skill of drawing frame-by-frame animations. One such technique, widely used in the animation and visual effects industry, is rotoscoping, which has allowed uniquely stylized animations to capture the motion of real-life action sequences; however, it is a very complex and time-consuming process. Automating this arduous technique would free animators from performing frame-by-frame stylization and allow them to concentrate on their own artistic contributions. This thesis introduces a new artificial system, based on an existing neural style transfer method, which creates artistically stylized animations that simultaneously reproduce both the motion of the original videos they are derived from and the unique style of a given artistic work. This system uses a convolutional neural network framework to extract a hierarchy of image features for generating images that appear visually similar to a given artistic style while faithfully preserving temporal content. The use of optical flow allows the combination of style and content to be integrated directly with the apparent motion over the frames of a video, producing smooth and visually appealing transitions. The implementation described in this thesis demonstrates how biologically inspired systems such as convolutional neural networks are rapidly approaching human-level behavior in tasks once thought impossible for computers. Such a complex task elucidates the current and future technical and artistic capabilities of such systems as their horizons expand. Further, this research provides unique insights into the way humans perceive and utilize temporal information in everyday tasks.
    A secondary implementation explored in this thesis seeks to improve existing convolutional neural networks using a biological approach to the way these models adapt to their inputs. It shows how these pattern-recognition systems can be markedly improved by integrating recent neuroscience research into already biologically inspired systems. The resulting hybrid activation function model replicates recent findings in the field of neuroscience and shows significant advantages over existing static activation functions.
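    The optical-flow-based temporal coherence described above can be sketched numerically: warp the previous stylized frame along the flow and penalize where the current stylized frame disagrees with it. This is a minimal sketch under assumptions (integer flow, no occlusion masking, names invented here), not the thesis's loss.

```python
def warp(frame, flow):
    """Warp a 2D frame by a per-pixel integer flow field, where
    flow[y][x] = (dy, dx) points at the source pixel for (y, x).
    Out-of-range sources are clamped to the frame border."""
    h, w = len(frame), len(frame[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dy, dx = flow[y][x]
            sy = min(max(y + dy, 0), h - 1)
            sx = min(max(x + dx, 0), w - 1)
            out[y][x] = frame[sy][sx]
    return out

def temporal_loss(prev_stylized, curr_stylized, flow):
    """Mean squared difference between the current stylized frame and
    the previous stylized frame warped forward along the flow."""
    warped = warp(prev_stylized, flow)
    h, w = len(warped), len(warped[0])
    return sum((curr_stylized[y][x] - warped[y][x]) ** 2
               for y in range(h) for x in range(w)) / (h * w)
```

    Minimizing this term alongside the usual style and content losses is what suppresses the frame-to-frame flicker that per-frame stylization otherwise produces.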

    Can Computers Create Art?

    This essay discusses whether computers, using Artificial Intelligence (AI), could create art. First, it surveys the history of technologies that automated aspects of art, including photography and animation. In each case, initial fears and denial of the technology were followed by a blossoming of new creative and professional opportunities for artists. The essay then discusses the current hype and reality of AI tools for art making, together with predictions about how such tools will be used, and speculates about whether AI systems could ever be credited with authorship of artwork. It theorizes that art is something created by social agents, so computers cannot be credited with authorship of art in our current understanding. A few ways this could change are also hypothesized. Comment: to appear in Arts, special issue on Machine as Artist (21st Century).

    Active Strokes: Coherent Line Stylization for Animated 3D Models

    Paper session 8: Lines, strokes and textures in 3D (international audience). This paper presents a method for creating coherently animated line drawings that include strong abstraction and stylization effects. These effects are achieved with active strokes: 2D contours that approximate and track the lines of an animated 3D scene. Active strokes perform two functions: they connect and smooth unorganized line samples, and they carry a coherent parameterization to support stylized rendering. Line samples are approximated and tracked using active contours ("snakes") that automatically update their arrangement and topology to match the animation. Parameterization is maintained by brush paths that follow the snakes but remain independent, permitting substantial shape abstraction without compromising tracking fidelity. This approach renders complex models in a wide range of styles at interactive rates, making it suitable for applications such as games and interactive illustrations.
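    The snake update at the heart of such trackers can be sketched as a simple relaxation: each contour point moves toward the average of its neighbours (smoothness) and toward its tracked line sample (data attraction). The weights and the explicit per-point update here are assumptions for illustration, not the paper's formulation.

```python
def snake_step(points, targets, alpha=0.5, beta=0.3):
    """One relaxation step of a 2D active contour ('snake').
    points  -- list of (x, y) contour vertices
    targets -- list of (x, y) tracked line samples, one per vertex
    alpha   -- pull toward the neighbour average (internal smoothness)
    beta    -- pull toward the tracked sample (external data term)"""
    n = len(points)
    out = []
    for i, (p, t) in enumerate(zip(points, targets)):
        left = points[max(i - 1, 0)]
        right = points[min(i + 1, n - 1)]
        smooth = ((left[0] + right[0]) / 2, (left[1] + right[1]) / 2)
        out.append((p[0] + alpha * (smooth[0] - p[0]) + beta * (t[0] - p[0]),
                    p[1] + alpha * (smooth[1] - p[1]) + beta * (t[1] - p[1])))
    return out
```

    Iterating this step pulls a noisy polyline onto the tracked samples while keeping it smooth; topology changes (splitting and merging snakes as scene lines appear and vanish) would sit on top of this inner loop.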