18 research outputs found

    Exploring Co-creative Drawing Workflows

    This article presents the outcomes of a mixed-methods study of drawing practitioners (e.g., professional illustrators, fine artists, and art students) conducted in Autumn 2018 as a preliminary investigation for the development of a physical human-AI co-creative drawing system. The aim of the study was to discover possible roles that technology could play in observing, modeling, and possibly assisting an artist with their drawing. The study had three components: a paper survey of artists' drawing practices, technology usage, and attitudes; video-recorded drawing exercises; and a follow-up semi-structured interview that included a co-design discussion on how AI might contribute to their drawing workflow. Key themes identified from the interviews were (1) drawing with physical mediums is a traditional and primary way of creating; (2) artists' views on AI varied, with co-creative AI preferred to didactic AI; and (3) artists take a critical and skeptical view of the automation of creative work with AI. Participants' input provided the basis for the design and technical specifications of a co-creative drawing prototype, the details of which are presented in this article. In addition, lessons learned from conducting the user study are presented, with a reflection on future studies with drawing practitioners.

    The sensory materials library - AiLoupe Pecha Kucha presentation

    Intelligent Design Systems for Innovation is developing AiLoupe, a mobile app that uses image recognition to identify, characterise, and catalogue materials. It features the Royal College of Art's Sensory Materials Library, a growing database of physical and sensory properties that helps designers with materials selection in the product design process.

    Textile Robotic Interaction for Designer-Robot Collaboration

    This late-breaking report describes lab-based experiments in which two robot arms scan and interact with a set of 12 novel sustainable materials, using programmed handfeel gestures inspired by how designers evaluate textile materials. The aim of gathering this data is to spur research in robot perception of soft materials and to contribute towards human-robot collaborative design systems. The complete dataset, including scanned images and video of the interactions, is made publicly available together with the code used to produce the robot motion paths.
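    As an illustration of what a programmed handfeel gesture might look like in code, the sketch below expresses a simple stroking gesture as a discretised Cartesian motion path over a fabric swatch. It is not the published path-generation code; the coordinate frame, units (metres), contact height, and function name are all assumptions.

```python
# A minimal, hypothetical sketch of a handfeel "stroke" gesture as a robot
# motion path: a straight line across a fabric swatch at a constant light
# contact height, discretised into Cartesian waypoints. Not the dataset's
# actual path-generation code; frame and units (metres) are assumed.
import numpy as np

def stroke_path(start, end, touch_height=0.002, n_waypoints=50):
    """Waypoints (x, y, z) for a straight stroking gesture over a swatch."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    t = np.linspace(0.0, 1.0, n_waypoints)[:, None]
    xy = (1 - t) * start + t * end               # interpolate across the swatch
    z = np.full((n_waypoints, 1), touch_height)  # light, constant contact
    return np.hstack([xy, z])

path = stroke_path(start=(0.00, 0.00), end=(0.10, 0.00))
print(path.shape)  # (50, 3): fifty poses an arm controller could execute
```

    In practice such waypoints would be handed to the arm's motion planner; varying the contact height or speed profile would distinguish gestures such as stroking, pressing, or pinching.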

    Workshop on human-centred AI design methods to understand “Textiles Hand”

    This collaborative workshop aims to co-generate tactile sensorial data for AI design tools. The project team, experienced in AI design methodologies and sensory materials assessment, will deliver a material-centric design workshop to understand the embodied and tacit knowledge of the textiles world. With the contribution of participants, new design methods for integrating the generated data will be discussed, building on current state-of-the-art design tools.

    Human-centred AI design methods to understand intelligent systems design empowered by multisensory experience with textiles

    Conceiving intelligent systems with human-robot interaction is difficult without first-hand knowledge of design tasks. This poster presents ideation results from a workshop on new forms of human-robot collaboration, primed with first-hand multisensory experience of textiles. Twenty-five Intelligent System Design students were presented with traditional and contemporary design methods concerning the human experience of textiles, along with the latest research in AI and robotic evaluation of multisensory textile properties. Putting theory into practice, participants then carried out individual tactile subjective assessments of a selection of fabrics using paper-based bipolar scales, followed by group reflection on the process. This led into the brainstorming portion of the workshop, with the prompt: how can AI collaboration in material assessments advance applications intertwined with material tangibility? To obtain a range of ideations, participants voted on application domains and self-organised into groups with specific application focuses. The domains were, in order of popularity: (1) Well-being and Care, (2) Gaming/Metaverse, (3) Craft, (4) Mobility Design, and (5) Product Design. This poster presents a collection of ideas centred on these domains produced by the groups. The practical implication of this workshop was to demonstrate how a human-centred design process focused on multisensory experiences, presented in theory and then through tactile practice, can contribute towards ideation within intelligent systems design.
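    To make the bipolar-scale assessment concrete, the sketch below shows one plausible way such paper ratings could be digitised: each participant scores each fabric on semantic word pairs using a -3 to +3 scale, and per-fabric mean profiles are computed. The attribute pairs, scale range, and scores are hypothetical, not the workshop's actual instrument.

```python
# A minimal sketch of digitising paper-based bipolar-scale ratings.
# Word pairs, the -3..+3 range, and all scores below are hypothetical.
from statistics import mean

SCALES = ["smooth-rough", "soft-hard", "warm-cool", "light-heavy"]

# ratings[fabric] holds one score vector per participant (-3..+3 per scale)
ratings = {
    "fabric_A": [[-2, -1, 1, -2], [-3, -2, 0, -1]],
    "fabric_B": [[2, 1, -1, 2], [3, 2, 0, 1]],
}

for fabric, scores in ratings.items():
    # transpose so each column groups one attribute across participants
    profile = [mean(col) for col in zip(*scores)]
    print(fabric, dict(zip(SCALES, profile)))
```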

    Predicting Artist Drawing Activity via Multi-Camera Inputs for Co-Creative Drawing

    This paper presents the results of experiments in computer-vision-based perception of an artist drawing with analog media (pen and paper), with the aim of contributing towards a human-robot co-creative drawing framework. Using data gathered from user studies with artists and illustrators, two types of CNN models were designed and evaluated to predict an artist's activity (e.g., are they drawing or not?) and the position of the pen on the canvas, based only on multi-camera input of the drawing surface. Results for different combinations of input sources are presented, with an overall mean accuracy of 95% (std: 7%) for predicting when the artist is present and 68% (std: 15%) for predicting when the artist is drawing, and a mean squared normalised error of 0.0034 (std: 0.0099) for predicting the pen's position on the drawing canvas. These results point toward an autonomous robotic system having an awareness of an artist at work via camera-based input, and contribute toward the development of a more fluid physical-to-digital workflow for creative content creation.
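    As a rough illustration of the prediction setup, the sketch below shows a small CNN over stacked multi-camera frames. The paper describes two separate model types; for brevity this sketch combines both tasks as heads on one shared backbone, and the architecture, layer sizes, and names are assumptions, not the authors' design.

```python
# A minimal sketch (not the paper's actual architecture) of a CNN that takes
# stacked frames from two cameras and predicts both the artist's activity
# and the pen's normalised position. All names and sizes are hypothetical.
import torch
import torch.nn as nn

class DrawingActivityNet(nn.Module):
    def __init__(self, num_cameras: int = 2):
        super().__init__()
        # Frames from all cameras are stacked along the channel axis,
        # so two RGB views yield 3 * num_cameras = 6 input channels.
        self.backbone = nn.Sequential(
            nn.Conv2d(3 * num_cameras, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Head 1: two binary logits (artist present? artist drawing?)
        self.activity_head = nn.Linear(128, 2)
        # Head 2: normalised (x, y) pen position on the canvas, in [0, 1]
        self.position_head = nn.Sequential(nn.Linear(128, 2), nn.Sigmoid())

    def forward(self, frames: torch.Tensor):
        features = self.backbone(frames)
        return self.activity_head(features), self.position_head(features)

model = DrawingActivityNet(num_cameras=2)
batch = torch.randn(4, 6, 224, 224)         # 4 samples, 2 RGB views stacked
activity_logits, pen_xy = model(batch)
print(activity_logits.shape, pen_xy.shape)  # [4, 2] and [4, 2]
```

    The reported mean squared normalised error of 0.0034 for pen position would correspond to predictions of (x, y) in this normalised [0, 1] canvas space.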

    Calibrating with Multiple Criteria: A Demonstration of Dominance

    Pattern-oriented modelling (POM) is an approach to calibration or validation that assesses a model against multiple weak patterns. We extend the concept of POM, using dominance to objectively identify the best parameter candidates. The TELL ME agent-based model is used to demonstrate the approach. This model simulates personal decisions to adopt protective behaviour during an influenza epidemic. Model fit is assessed by the size and timing of maximum behaviour adoption, as well as the more usual criterion of minimising the mean squared error between actual and estimated behaviour. This rigorous approach to calibration supported explicit trade-offs between these criteria, and ultimately demonstrated that there were significant flaws in the model structure.
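    To show what dominance-based selection over multiple criteria means in code, the sketch below filters calibration candidates to a Pareto front across three minimised criteria (mean squared error, peak-size error, peak-timing error). The candidate values are illustrative, not results from the TELL ME study.

```python
# A minimal sketch of dominance filtering over calibration candidates,
# assuming all three criteria are minimised: mean squared error, error in
# the size of peak adoption, and error in the timing of the peak.
# Candidate names and scores are hypothetical, not from the TELL ME model.

def dominates(a, b):
    """True if a is at least as good as b on every criterion and strictly
    better on at least one (all criteria minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep only candidates not dominated by any other candidate."""
    return {
        name: scores
        for name, scores in candidates.items()
        if not any(dominates(other, scores)
                   for other_name, other in candidates.items()
                   if other_name != name)
    }

# (mse, peak-size error, peak-timing error) per hypothetical parameter set
candidates = {
    "theta_1": (0.012, 0.05, 2.0),
    "theta_2": (0.015, 0.02, 1.0),
    "theta_3": (0.020, 0.06, 3.0),  # dominated by theta_1 on all criteria
}
print(pareto_front(candidates))     # theta_1 and theta_2 survive
```

    The surviving non-dominated candidates make the trade-off between the criteria explicit: choosing among them is then a transparent judgment rather than a hidden weighting.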

    AiLoupe at London Design Festival

    AiLoupe was exhibited at the London Design Festival as part of an exhibition bringing together new research and innovation projects from across the Royal College of Art that focus on materials development, climate action, and the integration of artificial intelligence with design to drive innovation and sustainability. AiLoupe allows designers and material developers to discover and assess textile materials, supporting material identification, knowledge, and selection. Identifying a material takes the user to its Material Data Card, which shows sensory subjective data, translating the tactile, physical experience of touching the material into digital form. AiLoupe uses the Sensory Materials Library, an AI research project that supports materials selection in the product design process by enriching conventional materials libraries with sensory and human-experience properties of materials. We demonstrated how AiLoupe can present sustainable alternatives as 'like for likes' to the traditional, less sustainable materials that designers are more familiar with.
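    One plausible reading of the 'like for like' idea is nearest-neighbour matching over sensory-property profiles, sketched below. This is not AiLoupe's actual implementation; the property names, scoring range, and material values are invented for illustration.

```python
# A minimal sketch, not AiLoupe's implementation, of 'like for like'
# matching: rank sustainable materials by similarity of their sensory
# profiles to a familiar reference material. All values are hypothetical.
import math

def similarity(a, b):
    """Euclidean distance between sensory profiles (lower = more alike)."""
    return math.dist(a, b)

# sensory profiles: (smoothness, softness, drape), each scored 0..10
reference = {"name": "conventional cotton twill", "profile": (7.0, 6.5, 5.0)}
sustainable = [
    {"name": "organic hemp blend", "profile": (6.5, 6.0, 5.5)},
    {"name": "recycled PET weave", "profile": (8.5, 4.0, 3.0)},
    {"name": "banana-fibre textile", "profile": (5.0, 5.5, 6.0)},
]

ranked = sorted(sustainable,
                key=lambda m: similarity(m["profile"], reference["profile"]))
for m in ranked:
    d = similarity(m["profile"], reference["profile"])
    print(f'{m["name"]}: distance {d:.2f}')
```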

    AiLoupe at Source Fashion in London and Centre Stage in Hong Kong

    AiLoupe was exhibited at Source Fashion, the UK's new sustainable sourcing show, which connects global manufacturers and suppliers to buyers and brings exciting new ranges to life. AiLoupe appeared alongside other AiDLab projects, AiDA and Wise Eye. The exhibit then travelled to Hong Kong to be displayed at Centre Stage, an annual fashion showcase organised by the Hong Kong Trade Development Council (HKTDC). AiLoupe allows designers and material developers to discover and assess textile materials, supporting material identification, knowledge, and selection. Identifying a material takes the user to its Material Data Card, which shows sensory subjective data, translating the tactile, physical experience of touching the material into digital form. AiLoupe uses the Sensory Materials Library, an AI research project that supports materials selection in the product design process by enriching conventional materials libraries with sensory and human-experience properties of materials. We demonstrated how AiLoupe can present sustainable alternatives as 'like for likes' to the traditional, less sustainable materials that designers are more familiar with.

    AiLoupe at Fashion X AI

    From February 2023, the Fashion X AI: 2022-2023 International Salon programme will showcase diverse AI-empowered design solutions via a touring exhibition with venues in Hong Kong and London. The interactive exhibition will include diverse works by international and local creative innovation practitioners. AiLoupe allows designers and material developers to discover and assess textile materials, supporting material identification, knowledge, and selection. Identifying a material takes the user to its Material Data Card, which shows sensory subjective data, translating the tactile, physical experience of touching the material into digital form. AiLoupe uses the Sensory Materials Library, an AI research project that supports materials selection in the product design process by enriching conventional materials libraries with sensory and human-experience properties of materials. We demonstrated how AiLoupe can present sustainable alternatives as 'like for likes' to the traditional, less sustainable materials that designers are more familiar with.