
    Estimating View Parameters From Random Projections for Tomography Using Spherical MDS

    Background: During the past decade, computed tomography has been successfully applied to various fields, especially medicine. Estimating the view angles of projections is necessary in some special applications of tomography, for example, determining the structure of viruses using electron microscopy and compensating for patient motion over long scanning periods. Methods: This work introduces a novel approach, based on spherical multidimensional scaling (sMDS), which transforms the angle estimation problem into a sphere-constrained embedding problem. The proposed approach views each projection as a high-dimensional vector whose dimensionality equals the number of sampling points on the projection. Using sMDS, each projection vector is embedded onto a 1D sphere, which parameterizes the projections with respect to view angles in a globally consistent manner. The parameterized projections are then used for the final reconstruction of the image through the inverse Radon transform. The entire reconstruction process is non-iterative and computationally efficient. Results: The effectiveness of sMDS is verified with various experiments, including evaluation of the reconstruction quality for different numbers of projections and resistance to different noise levels. The experimental results demonstrate the efficiency of the proposed method. Conclusion: Our study provides an effective technique for solving 2D tomography with unknown acquisition view angles. The proposed method will be extended to three-dimensional reconstruction in our future work. All materials, including source code and demos, are available at https://engineering.purdue.edu/PRECISE/SMDS
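    The abstract describes embedding projection vectors onto a circle to recover view angles and then reconstructing with the inverse Radon transform. The following is a minimal, illustrative sketch of that idea, not the authors' sMDS implementation: it uses classical MDS projected onto the unit circle, a synthetic phantom, and a uniform-angle rescaling, all of which are assumptions for illustration; scikit-image supplies the Radon transforms.

        # Hedged sketch: circle-constrained MDS for view-angle estimation,
        # followed by filtered back-projection. Not the authors' sMDS code.
        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon

        # Simulate projections at view angles that are hidden from the estimator.
        image = shepp_logan_phantom()
        true_angles = np.sort(np.random.uniform(0.0, 180.0, size=90))
        sinogram = radon(image, theta=true_angles)        # columns are projections
        projections = sinogram.T                          # one row per projection vector

        # Pairwise Euclidean distances between projection vectors.
        diff = projections[:, None, :] - projections[None, :, :]
        D = np.sqrt((diff ** 2).sum(-1))

        # Classical MDS: double-center squared distances, keep the top two eigenvectors.
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n
        B = -0.5 * J @ (D ** 2) @ J
        w, V = np.linalg.eigh(B)
        X = V[:, -2:] * np.sqrt(np.maximum(w[-2:], 0.0))  # 2D embedding

        # Constrain to the circle: keep only the angular coordinate.
        est = np.degrees(np.arctan2(X[:, 1], X[:, 0])) % 360.0
        # The embedding fixes angles only up to rotation/reflection, so this
        # simplification keeps the estimated ordering and assumes uniform spacing.
        order = np.argsort(est)
        est_angles = np.linspace(0.0, 180.0, n, endpoint=False)
        reconstruction = iradon(projections[order].T, theta=est_angles)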

    Natural user interfaces for engineering sketch understanding

    Over the last decade, post-WIMP interfaces have started to gain acceptance in the engineering design community. The increased interest in freehand sketching paved the way for creating 'sketch-based interfaces' with the digital pen as the main tool for interaction. The first part of this thesis focuses on the development of robust sketch understanding techniques that would enable pen-based interfaces to become the preferred computing platform. The goal of providing greater and better affordances for natural human-computer interaction (HCI) has largely motivated computational research and technological advancements. As a result, the development of novel algorithms for better virtual interfaces and innovations in hardware technology have become mutual partners, driving each other towards making HCI more intuitive and accessible. Recent innovations such as the iPad™ and Kinect™ have created a paradigm shift in how people view and interact with computing devices, especially using their fingers and whole body. The second part of this thesis focuses on turning any static physical surface, such as a wall or a table, into a multi-touch interactive surface using a depth-sensing camera, without instrumenting the surface itself. The thesis also develops algorithms for combined pen and multi-touch interaction.
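    As a rough illustration of detecting touches on an uninstrumented surface with a depth camera, the sketch below assumes a precomputed background depth image of the empty surface and incoming depth frames in millimetres (frame acquisition from the sensor SDK is omitted); detect_touches and its thresholds are hypothetical and are not taken from the thesis.

        # Minimal sketch: flag pixels hovering in a thin band just above the surface
        # as touch candidates, then report the centroid of each sufficiently large blob.
        import numpy as np
        from scipy import ndimage

        def detect_touches(depth_frame, background_depth, near_mm=5, far_mm=25, min_area=40):
            """Return (row, col) centroids of blobs within a thin band above the surface."""
            height_above = background_depth.astype(np.float32) - depth_frame.astype(np.float32)
            touch_mask = (height_above > near_mm) & (height_above < far_mm)
            labels, count = ndimage.label(touch_mask)
            centroids = []
            for blob_id in range(1, count + 1):
                blob = labels == blob_id
                if blob.sum() >= min_area:        # ignore small sensor-noise specks
                    centroids.append(ndimage.center_of_mass(blob))
            return centroids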

    APIX: Analysis From Pixellated Inputs in Early Design Using a Pen-Based Interface

    Product development is seeing a paradigm shift toward a simulation-driven approach. Recently, companies and designers have started to realize that simulation has the biggest impact when used as a concept verification tool in the early stages of design. Early-stage simulation tools such as ANSYS™ Design Space and SIMULIA™ DesignSight Structure help overcome the limitations of traditional product development processes, where analyses are carried out by a separate group rather than by the designers. Most of these commercial tools still require well-defined solid models as input and do not support freehand sketches, an integral part of the early design stage of product development. To this end, we present APIX (Analysis from Pixellated Inputs), a tool for quick analysis of two-dimensional mechanical sketches and parts from their static images using a pen-based interface. The input to the system can be offline (paper) sketches and diagrams, including scanned legacy drawings and freehand sketches; images of two-dimensional projections of three-dimensional mechanical parts can also be input. We have developed an approach that extracts a set of boundary contours representing a pixellated image using known image processing algorithms. The idea is to convert the input images to online sketches and use existing stroke-based recognition techniques for further processing. The converted sketch can then be edited, segmented, recognized, merged, solved for geometric constraints, beautified, and used as input for finite element analysis. Finally, we demonstrate the effectiveness of our approach in the early design process with examples. Copyright © 2011 by ASME
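    The contour-extraction step described above could look roughly like the following sketch, which uses standard OpenCV calls to binarize a scanned sketch image and simplify each boundary contour into a polyline "stroke"; image_to_strokes and its parameters are assumptions for illustration, and the recognition, constraint-solving, and FEA stages are not shown.

        # Hedged sketch of the pixels-to-strokes step only, not the full APIX pipeline.
        import cv2

        def image_to_strokes(path, epsilon_frac=0.01):
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            # Otsu binarization; sketches are dark ink on light paper, so invert.
            _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
            # Outer boundary contours of the inked regions.
            contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            strokes = []
            for contour in contours:
                # Simplify each boundary contour into a coarse polyline "stroke".
                eps = epsilon_frac * cv2.arcLength(contour, True)
                approx = cv2.approxPolyDP(contour, eps, True)
                strokes.append(approx.reshape(-1, 2))
            return strokes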

    FEAsy: A Sketch-Based Tool for Finite Element Analysis

    Freehand sketching is an integral part of the early design process. Recent years have seen an increased interest in supporting sketching in computer-based design systems. In this paper, we present finite element analysis made easy (FEAsy), a naturalistic environment for static finite element analysis. This tool allows users to transform, simulate, and analyze their finite element models quickly and easily through freehand sketching. A major challenge here is to beautify freehand sketches; to this end, we present a domain-independent, multi-stroke, multi-primitive method which automatically detects and uses the spatial relationships implied in the sketches for beautification. Further, we have also developed a domain-specific, rule-based algorithm for recognizing commonly used symbols in finite element analysis (FEA) and a method for identifying different contexts in finite element modeling through combined interpretation of text and geometry. The results of the user study suggest that our proposed algorithms are efficient and robust. Pilot users found the interface to be effective and easy to use.
    National Science Foundation (U.S.). Cyber-Human Systems (Award No. 1329979)
    National Science Foundation (U.S.). Cyber-Physical Systems: Synergy (Award No. 1329979)
    Purdue University. School of Mechanical Engineering. Donald W. Feddersen Chaired Professorship
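    To make the idea of beautification from implied spatial relationships concrete, the following minimal sketch shows one such rule, snapping nearly horizontal or vertical segments into exact alignment; snap_segment and its angular tolerance are assumptions, and this single-rule example is not the paper's multi-stroke, multi-primitive method.

        # Illustrative single-rule beautification: align segments that are almost
        # horizontal or vertical; otherwise leave them unchanged.
        import math

        def snap_segment(p0, p1, tolerance_deg=8.0):
            """Snap a segment ((x0, y0), (x1, y1)) to horizontal/vertical if close enough."""
            (x0, y0), (x1, y1) = p0, p1
            angle = math.degrees(math.atan2(y1 - y0, x1 - x0)) % 180.0
            if min(angle, 180.0 - angle) < tolerance_deg:   # nearly horizontal
                y = (y0 + y1) / 2.0
                return (x0, y), (x1, y)
            if abs(angle - 90.0) < tolerance_deg:           # nearly vertical
                x = (x0 + x1) / 2.0
                return (x, y0), (x, y1)
            return p0, p1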