10 research outputs found

    The Simulation of the Brush Stroke Based on Force Feedback Technology

    Get PDF
    A novel simulation method for brush strokes is proposed by applying force feedback technology to the virtual painting process. The relationship between force and brush deformation is analyzed, and a spring-mass model is used to construct the brush model, which realistically simulates the brush's morphological changes according to the force exerted on it. From the deformation of the brush model at each sampling point, the brush footprint between the brush and the paper is calculated in real time. The brush stroke is then obtained by superimposing brush footprints along the sampling points, implementing dynamic painting of the stroke. The proposed method has been successfully applied in a virtual painting system based on force feedback technology, in which users paint in real time with a Phantom Desktop haptic device, effectively enhancing the sense of realism.
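    The footprint-superposition idea in this abstract can be sketched as follows. This is a minimal illustration, not the paper's actual model: the circular footprint shape, the force-to-radius mapping, and all function names are assumptions.

```python
import numpy as np

def stamp_footprint(canvas, cx, cy, radius):
    """Mark one circular brush footprint (here: a simple disk) onto the canvas."""
    h, w = canvas.shape
    ys, xs = np.ogrid[:h, :w]
    mask = (xs - cx) ** 2 + (ys - cy) ** 2 <= radius ** 2
    canvas[mask] = 1.0
    return canvas

def render_stroke(samples, size=64):
    """Superimpose footprints along sampling points to form a stroke.

    samples: list of (x, y, force). A larger force flattens the brush
    more, so force maps to a larger footprint radius (assumed mapping).
    """
    canvas = np.zeros((size, size))
    for x, y, force in samples:
        radius = 1.0 + 3.0 * force  # illustrative force-to-radius mapping
        stamp_footprint(canvas, x, y, radius)
    return canvas

# A short stroke with increasing pen pressure along the path.
samples = [(10 + 4 * i, 32, 0.2 * i) for i in range(8)]
stroke = render_stroke(samples)
```

    The real system would replace the disk with the footprint computed from the deformed spring-mass brush model, but the superposition loop over sampling points works the same way.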

    The implementation of a person tracking mobile robot.

    Get PDF
    Chan Hung-Kwan. Thesis (M.Phil.)--Chinese University of Hong Kong, 2004. Includes bibliographical references (leaves 100-101). Abstracts in English and Chinese. Contents:
    Abstract
    Acknowledgement
    1  Introduction
       1.1  Motivation
       1.2  Analysis of a tracking robot system: Challenges
            1.2.1  Vision approach: Detecting a moving object from a moving background in real-time
            1.2.2  Non-vision sensor approach: The determination of the angle of the target
            1.2.3  Emitter-and-receiver approach
    2  Literature Review
       2.1  People Detection
            2.1.1  Background Subtraction
            2.1.2  Optical Flow
       2.2  Target Tracking Sensors
    3  Hardware and Software Architecture
       3.1  Camera
       3.2  Software
       3.3  Hardware
       3.4  Interface
       3.5  The USB Remote Controller
    4  Vision
       4.1  Vision Challenges
            4.1.1  Detecting a moving object from a moving background
            4.1.2  High-speed in real-time
       4.2  Leg Tracking by Binary Image
       4.3  Algorithm
       4.4  Advantages
       4.5  Limitations
       4.6  Estimation of the distance, d, by vision
            4.6.1  A more accurate version
            4.6.2  Inaccuracies
       4.7  Future Work: Estimation of the distance by both vision sensor and ultrasonic sensor
            4.7.1  Ruler-based Sensor Fusion
            4.7.2  Learning-based Sensor Fusion
    5  Control
       5.1  Control of the Camera
            5.1.1  Estimation of the Angle, Ψ
       5.2  Kinematic Modeling of the Robot
       5.3  The Time Derivatives of d and Ψ
       5.4  Control of the Robot
       5.5  Steering Angle and Overshooting
            5.5.1  Steering Angle Gain
            5.5.2  Small Gain
    6  Obstacle Avoidance
       6.1  Ultrasonic sensor configurations
       6.2  Approach of Control
       6.3  Algorithm
       6.4  Robot Travelling Distance Determination
       6.5  Experimental Result 1
       6.6  Experimental Result 2
       6.7  New ideas on the system
    7  Tracking Sensor
       7.1  Possible Methods
            7.1.1  Magnet and Compass
            7.1.2  LED
            7.1.3  Infra-red: Door Minder
       7.2  Rangefinders
            7.2.1  Configuration
            7.2.2  Algorithm
            7.2.3  Wireless Ultrasonic Emitter-receiver Pair
            7.2.4  Omni-directional Emitter
            7.2.5  Experiments
            7.2.6  Future Work
    8  Experiments and Performance Analysis
       8.1  Experiments
       8.2  Current Performance of the Tracking Robot
       8.3  Considerations on the System Speed and Subsystem Speeds
       8.4  Driving and Steering work at the same time
       8.5  Steering Motor
            8.5.1  Encoders
       8.6  Driving Motor
            8.6.1  Speed
            8.6.2  Speed Range
       8.7  Communication of the Vision Part and Control Part
    9  Conclusion
       9.1  Contributions
       9.2  Future Work
    A  Mobile Robot Construction
    Bibliography

    Dataremix: Aesthetic Experiences of Big Data and Data Abstraction

    Get PDF
    This PhD by published work expands on the contribution to knowledge in two recent large-scale transdisciplinary artistic research projects: ATLAS in silico and INSTRUMENT | One Antarctic Night and their exhibited and published outputs. The thesis reflects upon this practice-based artistic research that interrogates data abstraction: the digitization, datafication and abstraction of culture and nature, as vast and abstract digital data. The research is situated in digital arts practices that engage a combination of big (scientific) data as artistic material, embodied interaction in virtual environments, and poetic recombination. A transdisciplinary and collaborative artistic practice, x-resonance, provides a framework for the hybrid processes, outcomes, and contributions to knowledge from the research. These are purposefully and productively situated at the objective | subjective interface, have potential to convey multiple meanings simultaneously to a variety of audiences and resist disciplinary definition. In the course of the research, a novel methodology emerges, dataremix, which is employed and iteratively evolved through artistic practice to address the research questions: 1) How can a visceral and poetic experience of data abstraction be created? and 2) How would one go about generating an artistically-informed (scientific) discovery? 
Several interconnected contributions to knowledge arise through the first research question: creation of representational elements for artistic visualization of big (scientific) data that includes four new forms (genomic calligraphy, algorithmic objects as natural specimens, scalable auditory data signatures, and signal objects); an aesthetic of slowness that contributes an extension to the operative forces in Jevbratt's inverted sublime of looking down and in to also include looking fast and slow; an extension of Corby's objective and subjective image consisting of "informational and aesthetic components" to novel virtual environments created from big (scientific) data that extend Davies' poetic virtual spatiality to poetic objective | subjective generative virtual spaces; and an extension of Seaman's embodied interactive recombinant poetics through embodied interaction in virtual environments as a recapitulation of scientific (objective) and algorithmic processes through aesthetic (subjective) physical gestures. These contributions holistically combine in the artworks ATLAS in silico and INSTRUMENT | One Antarctic Night to create visceral poetic experiences of big data abstraction. Contributions to knowledge from the first research question develop artworks that are visceral and poetic experiences of data abstraction, and which manifest the objective | subjective through art. Contributions to knowledge from the second research question occur through the process of the artworks functioning as experimental systems in which experiments using analytical tools from the scientific domain are enacted within the process of creation of the artwork. The results are "returned" into the artwork. 
    These contributions are: elucidating differences in DNA helix bending and curvature along regions of gene sequences specified as either introns or exons, revealing nuanced differences in BLAST results in relation to genomics sequence metadata, and cross-correlation of astronomical data to identify putative variable signals from astronomical objects for further scientific evaluation.
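    The cross-correlation step mentioned for the astronomical data can be sketched generically. The light curves, the variability template, and the detection threshold below are hypothetical illustrations, not the project's actual pipeline.

```python
import numpy as np

def normalized_xcorr(a, b):
    """Normalized cross-correlation of two equal-length time series."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full")

def looks_variable(light_curve, template, threshold=0.6):
    """Flag a light curve whose peak correlation against a variability
    template exceeds the (illustrative) threshold, at any time lag."""
    return normalized_xcorr(light_curve, template).max() > threshold

# Hypothetical example: a phase-shifted, noisy copy of a sinusoidal template.
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 200)
template = np.sin(t)
candidate = np.sin(t + 0.3) + 0.1 * rng.standard_normal(200)
flagged = looks_variable(candidate, template)
```

    Scanning all lags lets the same template flag signals regardless of phase; candidates that pass such a screen would then go to scientific evaluation, as the abstract describes.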

    Visual-based decision for iterative quality enhancement in robot drawing.

    Get PDF
    Kwok, Ka Wai. Thesis (M.Phil.)--Chinese University of Hong Kong, 2005. Includes bibliographical references (leaves 113-116). Abstracts in English and Chinese. Contents:
    Abstract
    1  Introduction
       1.1  Artistic robot in western art
       1.2  Chinese calligraphy robot
       1.3  Our robot drawing system
       1.4  Thesis outline
    2  Robot Drawing System
       2.1  Robot drawing manipulation
       2.2  Input modes
       2.3  Visual-feedback system
       2.4  Footprint study setup
       2.5  Chapter summary
    3  Line Stroke Extraction and Order Assignment
       3.1  Skeleton-based line trajectory generation
       3.2  Line stroke vectorization
       3.3  Skeleton tangential slope evaluation using MIC
       3.4  Skeleton-based vectorization using Bezier curve interpolation
       3.5  Line stroke extraction
       3.6  Line stroke order assignment
       3.7  Chapter summary
    4  Projective Rectification and Vision-based Correction
       4.1  Projective rectification
       4.2  Homography transformation by selected correspondences
       4.3  Homography transformation using GA
       4.4  Visual-based iterative correction example
       4.5  Chapter summary
    5  Iterative Enhancement on Offset Effect and Brush Thickness
       5.1  Offset painting effect by Chinese brush pen
       5.2  Iterative robot drawing process
       5.3  Iterative line drawing experimental results
       5.4  Chapter summary
    6  GA-based Brush Stroke Generation
       6.1  Brush trajectory representation
       6.2  Brush stroke modeling
       6.3  Stroke simulation using GA
       6.4  Evolutionary computing results
       6.5  Chapter summary
    7  Brush Stroke Footprint Characterization
       7.1  Footprint video capturing
       7.2  Footprint image property
       7.3  Experimental results
       7.4  Chapter summary
    8  Conclusions and Future Works
    Bibliography

    Finding the Grammar of Generative Craft

    Full text link
    Art and craft design is challenging even with the assistance of computer-aided design tools. Despite the increasing availability and intelligence of software and hardware, artists continue to find gaps between their practices and tools when designing physical craft artifacts. In many craft domains, artists need to acquire domain knowledge and develop skills in design-aid tools separately. Despite their power and versatility, generic design tools pose various challenges, such as requiring workarounds for specific crafts and having steep learning curves. Compared to generic design-aid tools, craft-specific systems can offer reasonable solutions to specific design tasks because they can offer domain-specific support. Nevertheless, craft-specific tools often have limited flexibility. In this dissertation, I introduce Grammar-driven Craft Design Tools (GCDTs), which explicitly embed and utilize craft domain knowledge (i.e., the "grammar" of the craft) as their primary mechanisms and interfaces. Like other types of information, craft knowledge is processable and organizable data. In this dissertation, I develop and examine a framework to document, process, preserve, and utilize craft domain knowledge. GCDTs are craft-specific tools. By explicitly embedding and utilizing craft domain knowledge, GCDTs bridge the gap between design-aid tools and craft domain knowledge. GCDTs also have additional benefits such as supporting generative design, facilitating learning, and preserving domain knowledge. This dissertation gives an overview of how the next generation of design-aid tools can help artists find their creative expressions. It presents the GCDT framework and introduces three GCDTs developed for distinct domains. InfiniteLayer assists the design of multilayer sculpture, which is a form of sculpture made with layers of material. Then, MarkMakerSquare helps designers to invent unconventional and creative mark-making tools using various fabrication strategies. 
    Lastly, ThreadPlotter supports the design and fabrication of plotter-based delicate punch needle embroidery.
    Ph.D. dissertation, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/169800/1/heslicia_1.pd
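    The idea of treating a craft's "grammar" as processable data can be sketched with a toy production-rule system. The rules and domain below are invented purely for illustration and are not from the dissertation.

```python
# Toy "craft grammar": rewrite rules expand an abstract design token
# into concrete fabrication steps (illustrative rules, not GCDT's).
GRAMMAR = {
    "sculpture": ["layer", "layer", "layer"],
    "layer": ["cut_outline", "engrave_detail"],
}

def expand(token, grammar):
    """Recursively expand a design token into terminal fabrication steps."""
    if token not in grammar:
        return [token]  # terminal step: nothing left to rewrite
    steps = []
    for child in grammar[token]:
        steps.extend(expand(child, grammar))
    return steps

# Expanding the top-level token yields a flat, ordered fabrication plan.
plan = expand("sculpture", GRAMMAR)
```

    Because the grammar is plain data, the same representation can drive generative variation (rewriting rules), learning (browsing rules), and preservation (archiving rules), which mirrors the benefits the abstract claims for GCDTs.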

    An efficient brush model for physically-based 3D painting

    No full text
    This paper presents a novel 3D brush model consisting of a skeleton and a surface, which is deformed through constrained energy minimization. The main advantage of our model over existing ones is its ability to mimic brush flattening and bristle spreading due to brush bending and lateral friction exerted by the paper surface during the painting process. The ability to recreate such deformations is essential to realistic 3D digital painting simulations, especially in the case of Chinese brush painting and calligraphy. To further increase realism, we also model the plasticity of wetted brushes and the resistance exerted by pores on the paper surface onto the brush tip. Our implementation runs on a consumer-level PC in real time and produces very realistic results.
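    The deformation-by-constrained-energy-minimization idea can be sketched for a chain of 2-D skeleton nodes. The spring energy, the paper-surface constraint, and the projected-gradient scheme below are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def deform_skeleton(nodes, paper_y=0.0, stiffness=1.0, iters=500, lr=0.01):
    """Deform a chain of 2-D skeleton nodes by minimizing spring energy
    subject to a non-penetration constraint (every node keeps y >= paper_y).
    Projected gradient descent: take a gradient step, then project."""
    x = nodes.copy()
    rest = np.linalg.norm(np.diff(nodes, axis=0), axis=1)  # rest lengths
    for _ in range(iters):
        seg = np.diff(x, axis=0)
        length = np.linalg.norm(seg, axis=1)
        # Spring energy E = 0.5*k*(length - rest)^2 per segment; its
        # gradient pulls stretched segments in and pushes compressed ones out.
        coef = stiffness * (length - rest) / np.maximum(length, 1e-9)
        f = coef[:, None] * seg
        grad = np.zeros_like(x)
        grad[:-1] -= f
        grad[1:] += f
        x[1:] -= lr * grad[1:]                  # root node (index 0) is fixed
        x[:, 1] = np.maximum(x[:, 1], paper_y)  # project onto the constraint
    return x

# A vertical brush whose tip starts below the paper plane relaxes to lie on it.
nodes = np.array([[0.0, 2.0], [0.0, 1.0], [0.0, -0.5]])
deformed = deform_skeleton(nodes)
```

    The paper's actual model also deforms a surface around the skeleton and accounts for lateral friction, wet plasticity, and paper-pore resistance; this sketch only shows the constrained-minimization skeleton step in its simplest form.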

    SKR1BL

    Get PDF