4 research outputs found

    Factory planning through paper-based computer-aided sketching

    Sketching has long served as a means to quickly express ideas in the early stages of design. Whilst CAD systems offer visualization capabilities that a sketch cannot, such technology is not exploited in the early stages because it does not accept sketches as input. For this reason, Computer Aided Sketching (CAS) technology has been developed to combine the benefits of sketching with those of CAD. Yet, although this technology has been applied in a range of domains (such as architecture, product design and graphical user-interface design), it has not yet been exploited for shop floor planning. In view of this, the research disclosed in this paper concerns the ongoing development of a framework that allows users to quickly obtain a 3D CAD model of a factory directly from paper-based sketches. A visual language was developed that allows factory designers to schematically represent the shop floor whilst at the same time facilitating off-line computer processing of the sketches.
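    The abstract does not give implementation details, but the core idea of turning recognised sketch symbols into a 3D factory layout can be illustrated with a minimal sketch. The names below (Symbol, symbols_to_3d, the symbol kinds and default heights) are assumptions for illustration, not part of the framework described in the paper.

```python
# Illustrative only: a hypothetical mapping from recognised sketch symbols
# to simple 3D placements that a CAD exporter could consume.
from dataclasses import dataclass

@dataclass
class Symbol:
    kind: str        # e.g. "machine", "wall", "conveyor" (recognised from the sketch)
    x: float         # position on the paper sketch (mm)
    y: float
    rotation: float  # orientation in degrees

# Assumed default heights for extruding 2D symbols into 3D blocks (metres).
DEFAULT_HEIGHT = {"machine": 2.0, "wall": 3.0, "conveyor": 1.0}

def symbols_to_3d(symbols, scale=0.05):
    """Convert recognised sketch symbols into 3D placements.

    `scale` maps sketch millimetres to shop-floor metres (assumed value).
    """
    model = []
    for s in symbols:
        model.append({
            "type": s.kind,
            "position": (s.x * scale, s.y * scale, 0.0),
            "rotation_deg": s.rotation,
            "height": DEFAULT_HEIGHT.get(s.kind, 1.0),
        })
    return model

if __name__ == "__main__":
    sketch = [Symbol("machine", 120.0, 80.0, 90.0), Symbol("wall", 0.0, 0.0, 0.0)]
    for obj in symbols_to_3d(sketch):
        print(obj)
```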

    DISco: a Distributed Information Store for network Challenges and their Outcome

    We present DISco, a storage and communication middleware designed to enable distributed and task-centric autonomic control of networks. DISco is designed to enable multi-agent identification of anomalous situations -- so-called "challenges" -- and to assist coordinated remediation that maintains a degraded but acceptable service level, while keeping track of each challenge's evolution so that human-assisted diagnosis of flaws in the network remains possible. We propose to use state-of-the-art peer-to-peer publish/subscribe and distributed storage as the core building blocks for the DISco service.
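    The abstract names publish/subscribe and a distributed store as the building blocks; the toy sketch below shows that pattern in a single process. The topic name, the Challenge record and the in-memory bus are assumptions for illustration, not DISco's actual peer-to-peer components or API.

```python
# Single-process stand-in for a publish/subscribe layer plus a challenge log.
from collections import defaultdict
from dataclasses import dataclass, field
import time

@dataclass
class Challenge:
    source: str       # agent that detected the anomaly
    description: str  # e.g. "packet loss above threshold"
    timestamp: float = field(default_factory=time.time)

class InMemoryBus:
    """Toy pub/sub bus: subscribers register callbacks per topic."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        for cb in self._subs[topic]:
            cb(message)

bus = InMemoryBus()
log = []  # stands in for the distributed store keeping the challenge history

bus.subscribe("challenges", log.append)                              # record for later diagnosis
bus.subscribe("challenges", lambda c: print("remediate:", c.description))

bus.publish("challenges", Challenge("monitor-7", "packet loss above threshold"))
print(len(log), "challenge(s) recorded")
```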

    A Thesis on Sketch-Based Techniques for Mesh Deformation and Editing

    The goal of this research is to develop new and more intuitive ways of editing a mesh from a static camera angle. I present two ways to edit a mesh via a simple sketching system. The first method is a gray-scale editor that allows the user to specify a fall-off function for the region being deformed. The second method is a profile editor in which the user can re-sketch a mesh's profile. Lastly, the types of edits that are possible are discussed and the results are presented.
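    The fall-off idea lends itself to a short numerical sketch: vertices near a dragged handle move fully, and the influence decays with distance. The Gaussian fall-off and the vertex layout below are assumptions; the thesis lets the user paint the gray-scale weights rather than computing them from a fixed formula.

```python
# Minimal numpy sketch of fall-off-weighted mesh deformation.
import numpy as np

def deform(vertices, handle, displacement, radius):
    """Move vertices near `handle` by `displacement`, scaled by a fall-off weight.

    vertices     : (N, 3) array of vertex positions
    handle       : (3,) point the user drags
    displacement : (3,) translation applied at the handle
    radius       : distance at which the influence has largely decayed
    """
    dist = np.linalg.norm(vertices - handle, axis=1)
    weight = np.exp(-(dist / radius) ** 2)   # per-vertex gray-scale value in [0, 1]
    return vertices + weight[:, None] * displacement

if __name__ == "__main__":
    verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [3.0, 0.0, 0.0]])
    moved = deform(verts, handle=np.array([0.0, 0.0, 0.0]),
                   displacement=np.array([0.0, 0.0, 1.0]), radius=1.5)
    print(moved)
```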

    Touch ‘n’ sketch: pen and fingers on a multi-touch sketch application for tablet PC’s

    In many creative and technical areas, professionals make use of paper sketches for developing and expressing concepts and models. Paper offers an almost constraint-free environment in which they have as much freedom to express themselves as they need. However, paper does have some disadvantages, such as its fixed size and the inability to manipulate content (other than removing or scratching it out), which can be overcome by creating systems that offer the same freedom as paper but none of its disadvantages and limitations. Only in recent years has the technology that allows doing precisely that become widely available, with the development of touch-sensitive screens that can also interact with a stylus. In this project a prototype was created with the objective of finding a set of the most useful and usable interactions, composed of combinations of multi-touch and pen. The project selected Computer Aided Software Engineering (CASE) tools as its application domain, because it addresses a solid and well-defined discipline that still leaves sufficient room for new developments. This choice resulted from research conducted to find an application domain, which involved analyzing sketching tools from several possible areas and domains. User studies were conducted using Model Driven Inquiry (MDI) to gain a better understanding of human sketch-creation activities and of the concepts devised. The prototype was then implemented, through which it was possible to run user evaluations of the interaction concepts created. The results validated most of the interactions, although only limited testing was possible at the time. Users had more problems using the pen; however, handwriting and ink recognition were very effective, and users quickly learned the manipulations and gestures of the Natural User Interface (NUI).
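    The central interaction concept, pen for inking and handwriting, fingers for manipulation gestures, can be sketched as a small event dispatcher. The event shape and handler behaviour below are assumptions for illustration, not the prototype's actual API.

```python
# Hypothetical pen-versus-touch dispatcher for a combined stylus/multi-touch surface.
from dataclasses import dataclass

@dataclass
class InputEvent:
    device: str       # "pen" or "touch"
    x: float
    y: float
    contacts: int = 1 # number of simultaneous finger contacts

def handle_event(event):
    if event.device == "pen":
        return f"ink stroke at ({event.x}, {event.y})"        # sketching / handwriting
    if event.contacts >= 2:
        return "multi-touch gesture: pan/zoom the diagram"    # two-finger manipulation
    return f"select element under ({event.x}, {event.y})"     # single-finger tap

if __name__ == "__main__":
    for e in (InputEvent("pen", 10, 20), InputEvent("touch", 5, 5, contacts=2)):
        print(handle_event(e))
```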