
    Painterly rendering techniques: A state-of-the-art review of current approaches

    In this publication we look at the different methods presented over the past few decades that attempt to recreate paintings digitally. While previous surveys concentrate on the broader subject of non-photorealistic rendering, the focus of this paper is firmly placed on painterly rendering techniques. We compare methods used to produce different output painting styles such as abstract, colour pencil, watercolour, oriental, oil and pastel. Whereas some methods demand a high level of interaction from a skilled artist, others require only simple parameters provided by a user with little or no artistic experience. Many methods attempt to provide more automation through varying forms of reference data, which can include still photographs, video, 3D polygonal meshes or even 3D point clouds. The techniques presented here endeavour to provide tools and styles that are not traditionally available to an artist. Copyright © 2012 John Wiley & Sons, Ltd.

    IUPUC Spatial Innovation Lab

    During the summer of 2016 the IUPUC ME Division envisioned the concept of an “Imagineering Lab” based largely on academic makerspace concepts. Important sub-sections of the Imagineering Lab are its “Actualization Lab” (mechatronics, actuators, sensors, DAQ devices, etc.) and a “Spatial Innovation Lab” (SIL) based on developing “dream stations” (computer workstations) equipped with exciting new technology in intuitive 2D and 3D image creation and Virtual Reality (VR) technology. The objective of the SIL is to create a workflow converting intuitively created imagery to animation, engineering simulation and analysis, and computer-driven manufacturing interfaces. This paper discusses the challenges and methods being used to create a sustainable Spatial Innovation Lab.

    Artist-Configurable Node-Based Approach to Generate Procedural Brush Stroke Textures for Digital Painting

    Digital painting is the field of software designed to provide artists with a virtual medium that emulates the experience and results of physical drawing. Several hardware and software components come together to form a whole workflow, ranging from the physical input devices, to the stroking process, to the texture content authorship. This thesis explores an artist-friendly approach to synthesizing the textures that give life to digital brush strokes. Most painting software provides a limited library of predefined brush textures that aim to approximate physical media like paintbrushes, pencils, markers, and airbrushes. Often these are static bitmap textures that are stamped onto the canvas at repeating intervals, causing discernible repetition artifacts. When more variety is desired, artists often download commercially available brush packs that expand the library of styles; however, neither the included nor the supplemental brush packs are easily customizable by artists. In recent years, a separate field of digital art tooling has seen the popular growth of node-based procedural content generation: 3D models, shaders, and materials are commonly authored by artists using functions linked together in a visual programming environment called a node graph. This work tests the feasibility of using a node graph to procedurally generate highly customizable brush textures. The system synthesizes textures that adapt to parameters like pen pressure and stretch along the full length of each brush stroke instead of stamping repetitively. The result is a more flexible and artist-friendly way to define, share, and tweak the brush textures used in digital painting.
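The node-graph idea above can be sketched as a chain of small functions evaluated against the stroke state at each sample. This is a minimal, hypothetical illustration (the node functions, parameter names, and the noise constant are invented, not taken from the thesis):

```python
import math

def noise(state):
    # Hypothetical procedural noise node driven by distance along the stroke.
    return 0.5 + 0.5 * math.sin(state["distance"] * 12.9898)

def pressure_gain(value, state):
    # Hypothetical node that modulates its input by pen pressure.
    return value * state["pressure"]

def evaluate_graph(state):
    """Evaluate a fixed two-node chain: noise -> pressure gain.
    A real system would topologically sort an arbitrary node DAG."""
    return pressure_gain(noise(state), state)

# Sampling along the stroke lets the texture vary continuously with
# distance and pressure, rather than stamping a fixed bitmap.
sample = evaluate_graph({"pressure": 0.8, "distance": 3.0})
```

Because every node is a pure function of the stroke state, the texture adapts along the full stroke length, which is the property the thesis contrasts with repetitive bitmap stamping.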

    Automatic painting with economized strokes

    We present a method that takes a raster image as input and produces a painting-like image composed of strokes rather than pixels. Unlike previous automatic painting methods, we attempt to use very few brush strokes. This is accomplished by first segmenting the image into features, finding the medial axis points of these features, converting the medial axis points into ordered lists of image tokens, and finally rendering these lists as brush strokes. Our process creates images reminiscent of modern realist painters, who often want an abstract or sketchy quality in their work.
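The middle step of the pipeline above, converting unordered medial-axis points into ordered lists, can be sketched as a greedy nearest-neighbour walk. This is a stand-in illustration under assumed inputs (the point-extraction and rendering stages are omitted, and the ordering heuristic is not necessarily the one the paper uses):

```python
import math

def order_medial_points(points):
    """Greedily chain unordered medial-axis points into an ordered
    stroke path by repeatedly walking to the nearest remaining point."""
    if not points:
        return []
    remaining = list(points)
    path = [remaining.pop(0)]
    while remaining:
        last = path[-1]
        nearest = min(remaining, key=lambda p: math.dist(last, p))
        remaining.remove(nearest)
        path.append(nearest)
    return path

# Each ordered point becomes a brush-stroke control point; a renderer
# would then sweep a brush along the resulting path.
stroke = order_medial_points([(0, 0), (5, 0), (2, 0), (1, 0)])
```

Ordering the points first is what lets one long stroke replace many short stamps, which is how the method keeps the stroke count low.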

    Artistic vision: painterly rendering using computer vision techniques

    We present a method that takes a raster image as input and produces a painting-like image composed of strokes rather than pixels. Unlike previous automatic painting methods, we attempt to keep the number of brush strokes small. This is accomplished by first segmenting the image into features, finding the medial axis points of these features, converting the medial axis points into ordered lists of image tokens, and finally rendering these lists as brush strokes. Our process creates images reminiscent of modern realist painters, who often want an abstract or sketchy quality in their work.

    The Simulation of the Brush Stroke Based on Force Feedback Technology

    A novel simulation method for the brush stroke is proposed by applying force feedback technology to the virtual painting process. The relationship between force and brush deformation is analyzed, and a spring-mass model is used to construct the brush model, which realistically simulates the brush's morphological changes according to the force exerted on it. From the deformation of the brush model at a sampling point, the brush footprint between the brush and the paper is calculated in real time. The brush stroke is then obtained by superimposing brush footprints along the sampling points, implementing dynamic painting of the stroke. The proposed method has been successfully applied to a virtual painting system based on force feedback technology. In this system, users can paint in real time with a Phantom Desktop haptic device, which effectively enhances the sense of realism for users.
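The force-to-footprint relationship described above can be sketched with a toy spring model: bristle deflection follows Hooke's law and the contact footprint widens with deflection. All constants and function names here are illustrative assumptions, not values from the paper:

```python
def brush_footprint_radius(force, stiffness=2.0, tip_radius=1.0, max_bend=4.0):
    """Toy spring model: deflection x = F / k (Hooke's law), clamped
    at max_bend; the footprint radius grows with the deflection."""
    deflection = min(force / stiffness, max_bend)
    return tip_radius + deflection

def stroke_outline(samples):
    """Superimpose per-sample footprints along the stroke, analogous
    to evaluating the brush model at each haptic sampling point.
    Each sample is (x, y, force); each result is (x, y, radius)."""
    return [(x, y, brush_footprint_radius(f)) for (x, y, f) in samples]

# Harder presses along the stroke produce progressively wider footprints.
outline = stroke_outline([(0, 0, 0.0), (1, 0, 2.0), (2, 0, 4.0)])
```

A real spring-mass brush would track many coupled masses per bristle cluster; this single-spring version only conveys why footprint size varies with applied force along the stroke.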

    A viscous paint model for interactive applications

    We present a viscous paint model for use in an interactive painting system, based on the well-known Stokes equations for viscous flow. Our method is, to our knowledge, the first unconditionally stable numerical method that treats viscous fluid with a free surface boundary. We have also developed a real-time implementation of the Kubelka-Munk reflectance model for pigment mixing, compositing and rendering entirely on graphics hardware, using programmable fragment shading capabilities. We have integrated our paint model with a prototype painting system, which demonstrates the model's effectiveness in rendering viscous paint and capturing a thick, impasto-like style of painting. Several users have tested our prototype system and were able to start creating original artwork in an intuitive manner not possible with existing techniques in commercial systems.
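The Kubelka-Munk mixing step mentioned above can be sketched on the CPU for a single wavelength band. The reflectance formula is the standard infinite-thickness Kubelka-Munk result; the pigment data structure and function names are assumptions for illustration (the paper's version runs per-band in fragment shaders):

```python
import math

def km_reflectance(k, s):
    """Infinite-thickness Kubelka-Munk reflectance from absorption K
    and scattering S: R = 1 + K/S - sqrt((K/S)^2 + 2*K/S)."""
    q = k / s
    return 1.0 + q - math.sqrt(q * q + 2.0 * q)

def mix_pigments(pigments, weights):
    """Pigments mix linearly in K and S space (not in RGB), which is
    why KM mixing reproduces subtractive effects like blue + yellow
    giving green. Each pigment is a (K, S) pair for one band."""
    k = sum(w * p[0] for p, w in zip(pigments, weights))
    s = sum(w * p[1] for p, w in zip(pigments, weights))
    return km_reflectance(k, s)
```

Running this per wavelength band and per pixel is cheap enough that a fragment-shader implementation, as the paper describes, can composite pigment layers in real time.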