
    A quantitative analysis of parametric CAD model complexity and its relationship to perceived modeling complexity

    Digital product data quality and reusability has been proven a critical aspect of the Model-Based Enterprise to enable the efficient design and redesign of products. The extent to which a history-based parametric CAD model can be edited or reused depends on the geometric complexity of the part and the procedure employed to build it. As a prerequisite for defining metrics that can quantify the quality of the modeling process, it is necessary to have CAD datasets that are sorted and ranked according to the complexity of the modeling process. In this paper, we examine the concept of perceived CAD modeling complexity, defined as the degree to which a parametric CAD model is perceived as difficult to create, use, and/or modify by expert CAD designers. We present a novel method to integrate pair-wise comparisons of CAD modeling complexity made by experts into a single metric that can be used as ground truth. Next, we discuss a comprehensive study of quantitative metrics which are derived primarily from the geometric characteristics of the models and the graph structure that represents the parent/child relationships between features. Our results show that the perceived CAD modeling complexity metric derived from experts' assessment correlates particularly strongly with graph-based metrics. The Spearman coefficients for five of these metrics suggest that they can be effectively used to study the parameters that influence the reusability of models and as a basis to implement effective personalized learning strategies in online CAD training scenarios.
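
    The abstract above describes two computational steps: aggregating experts' pairwise complexity judgments into a single ground-truth score, and correlating that score with graph-based metrics via Spearman coefficients. The paper's own aggregation procedure is not reproduced here; the sketch below is a minimal stand-in that fits a Bradley-Terry model to invented pairwise judgments and correlates the resulting scores with a hypothetical graph metric. All model names, comparison data, and metric values are placeholders.

```python
# Sketch: aggregate pairwise "A is more complex than B" judgments into one
# score per CAD model (Bradley-Terry), then correlate with a graph metric.
# The comparison data and metric values below are invented placeholders.
import numpy as np
from scipy.stats import spearmanr

models = ["bracket", "housing", "impeller", "manifold"]
# (winner, loser): the expert judged the first model as more complex to build.
comparisons = [("impeller", "bracket"), ("manifold", "bracket"),
               ("impeller", "housing"), ("manifold", "housing"),
               ("manifold", "impeller"), ("housing", "bracket")]

idx = {m: i for i, m in enumerate(models)}
wins = np.zeros((len(models), len(models)))        # wins[i, j]: i beat j
for w, l in comparisons:
    wins[idx[w], idx[l]] += 1

# Simple iterative Bradley-Terry estimate of a latent complexity score.
scores = np.ones(len(models))
for _ in range(200):
    for i in range(len(models)):
        num = wins[i].sum()
        den = sum((wins[i, j] + wins[j, i]) / (scores[i] + scores[j])
                  for j in range(len(models)) if j != i)
        scores[i] = num / den if den > 0 else scores[i]
    scores /= scores.sum()

# Hypothetical graph-based metric (e.g. number of parent/child links per model).
graph_metric = np.array([3, 7, 12, 15])
rho, p = spearmanr(scores, graph_metric)
print(dict(zip(models, scores.round(3))), f"Spearman rho = {rho:.2f}")
```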

    Tolerance analysis and synthesis of assemblies subject to loading with process integration and design optimization tools

    Manufacturing variation results in uncertainty in the functionality and performance of mechanical assemblies. Management of this uncertainty is of paramount importance for manufacturing efficiency. Methods focused on the management of uncertainty and variation in the design of mechanical assemblies, such as tolerance analysis and synthesis, have been subject to extensive research and development to date. However, due to the challenges involved, limitations in the capability of these methods remain. These limitations are associated with the following problems:
    - The identification of Key Product Characteristics (KPCs) in mechanical assemblies (which are required for measuring functional performance) without imposing significant modelling demands.
    - Accommodation of the high computational cost of traditional statistical tolerance analysis in early design, where analysis budgets are limited.
    - Efficient identification of feasible regions and optimum performance within the large design spaces associated with early design stages.
    - The ability to comprehensively accommodate tolerance analysis problems in which assembly functionality is dependent on the effects of loading (such as compliance or multi-body dynamics). Current Computer Aided Tolerancing (CAT) is limited by: the ability to accommodate only specific loading effects; reliance on custom simulation codes with limited practical implementation in accessible software tools; and the need for additional expertise in formulating specific assembly tolerance models and interpreting results.
    - Accommodation of the often impractically high computational cost of tolerance synthesis involving demanding assembly models (particularly assemblies under loading). The high computational cost is associated with traditional statistical tolerancing Uncertainty Quantification (UQ) methods reliant on low-efficiency Monte Carlo (MC) sampling.
    This research is focused on addressing these limitations by developing novel methods for enhancing the engineering design of mechanical assemblies involving uncertainty or variation in design parameters. This is achieved by utilising the emerging design analysis and refinement capabilities of Process Integration and Design Optimization (PIDO) tools. The main contributions of this research fall under three themes: design analysis and refinement accommodating uncertainty in early design; tolerancing of assemblies subject to loading; and efficient Uncertainty Quantification (UQ) in tolerance analysis and synthesis. The research outcomes present a number of contributions within each theme, as outlined below.
    Design analysis and refinement accommodating uncertainty in early design:
    - A PIDO tool based visualization method to aid designers in identifying assembly KPCs in early design stages. The developed method integrates CAD software functionality with the process integration, UQ, data logging and statistical analysis capabilities of PIDO tools to simulate manufacturing variation in an assembly and visualise assembly clearances, contacts or interferences. The visualization capability subsequently assists the designer in specifying critical assembly dimensions as KPCs.
    - A computationally efficient method for manufacturing sensitivity analysis of assemblies with linear-compliant elements. Reductions in computational cost are achieved by utilising linear-compliant assembly stiffness measures, reuse of CAD models created in early design stages, and PIDO tool based tolerance analysis. The associated increase in computational efficiency allows an estimate of sensitivity to manufacturing variation to be made earlier in the design process with low effort.
    - Refinement of concept design embodiments through PIDO based DOE analysis and optimization. PIDO tools are utilised to allow CAE tool integration, and efficient reuse of models created in early design stages, to rapidly identify feasible and optimal regions in the design space. A case study focused on the conceptual design of automotive seat kinematics is presented, in which an optimal design is identified and subsequently selected for commercialisation in the Tesla Motors Model S full-sized electric sedan.
    These contributions can be directly applied to improve the design of mechanical assemblies involving uncertainty or variation in design parameters in the early stages of design. The use of native CAD/E models developed as part of an established design modelling procedure imposes low additional modelling effort.
    Tolerancing of assemblies subject to loading:
    - A novel tolerance analysis platform is developed which integrates CAD/E and statistical analysis tools using PIDO tool capabilities to facilitate tolerance analysis of assemblies subject to loading. The proposed platform extends the capabilities of traditional CAT tools and methods by enabling tolerance analysis of assemblies which are dependent on the effects of loads. The ability to accommodate the effects of loading in tolerance analysis allows for an increased level of capability in estimating the effects of variation on functionality.
    - The interdisciplinary integration capabilities of the PIDO based platform allow CAD/E models created as part of the standard design process to be used for tolerance analysis. The need for additional modelling tools and expertise is subsequently reduced.
    - Application of the developed platform resulted in effective solutions to practical, industry based tolerance analysis problems, including: an automotive actuator mechanism assembly consisting of rigid and compliant components subject to external forces; and a rotary switch and spring loaded radial detent assembly in which functionality is defined by external forces and internal multi-body dynamics. In both case studies the tolerance analysis platform was applied to specify nominal dimensions and required tolerances to achieve the desired assembly yield. The computational platform offers an accessible tolerance analysis approach for accommodating assemblies subject to loading with low implementation demands.
    Efficient Uncertainty Quantification (UQ) in tolerance analysis and synthesis:
    - A novel approach is developed for addressing the high computational cost of Monte Carlo (MC) sampling in statistical tolerance analysis and synthesis, using Polynomial Chaos Expansion (PCE) uncertainty quantification. Compared to MC sampling, PCE offers significantly higher efficiency. The feasibility of PCE based UQ in tolerance synthesis is established through: theoretical analysis of the PCE method identifying working principles, implementation requirements, advantages and limitations; identification of a preferred method for determining PCE expansion coefficients in tolerance analysis; and formulation of an approach for the validation of PCE statistical moment estimates.
    - PCE based UQ is subsequently implemented in a PIDO based tolerance synthesis platform for assemblies subject to loading. The resultant platform integrates: highly efficient sparse grid based PCE UQ; parametric CAD/E models accommodating the effects of loading; cost-tolerance modelling; yield quantification with Process Capability Indices (PCI); and optimization of tolerance cost and yield with a multi-objective Genetic Algorithm (GA).
    - To demonstrate the capabilities of the developed platform, two industry based case studies are used for validation: an automotive seat rail assembly consisting of compliant components subject to loading; and an automotive switch assembly in which functionality is defined by external forces and multi-body dynamics. In both case studies, optimal tolerances were identified which satisfied the desired yield and tolerance cost objectives. The addition of PCE to the tolerance synthesis platform resulted in large reductions in computational cost without compromising accuracy compared to traditional MC sampling UQ, whose computational expense is impractically high. The resulting tolerance synthesis platform can be applied to tolerance analysis and synthesis with significantly reduced computation time while maintaining accuracy.
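
    As a point of reference for the Monte Carlo cost that the thesis seeks to avoid, the sketch below shows what a minimal MC statistical tolerance analysis looks like on a one-dimensional stack-up: sample component dimensions from their tolerance distributions, evaluate the assembly response, and estimate yield and a process capability index. The gap function, tolerances, and specification limits are invented placeholders standing in for the loaded CAD/E assembly responses evaluated in the thesis.

```python
# Sketch: Monte Carlo statistical tolerance analysis of a simple 1D stack-up.
# Dimensions, tolerances, and spec limits are invented placeholders for the
# loaded-assembly responses (KPCs) the thesis evaluates through CAD/E models.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000                                    # MC sample size (the costly part)

# Nominal dimensions and +/- tolerances of three stacked components (mm).
nominals = np.array([20.0, 35.0, 14.8])
tols     = np.array([0.10, 0.15, 0.05])
sigmas   = tols / 3.0                          # assume +/-tol covers 3 sigma

# Sample manufactured dimensions and evaluate the assembly KPC (clearance).
dims = rng.normal(nominals, sigmas, size=(N, 3))
gap = 70.0 - dims.sum(axis=1)                  # housing length minus stack

LSL, USL = 0.05, 0.45                          # spec limits on the clearance
yield_est = np.mean((gap >= LSL) & (gap <= USL))
mu, sd = gap.mean(), gap.std(ddof=1)
cpk = min(USL - mu, mu - LSL) / (3 * sd)       # process capability index

print(f"yield ~ {yield_est:.4f}, Cpk ~ {cpk:.2f}")
```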

    Crafting chaos: computational design of contraptions with complex behaviour

    The 2010s saw the democratisation of digital fabrication technologies. Although this phenomenon made fabrication more accessible, physical assemblies displaying a complex behaviour are still difficult to design. While many methods support the creation of complex shapes and assemblies, managing a complex behaviour is often assumed to be a tedious aspect of the design process. As a result, the complex parts of the behaviour are either deemed negligible (when possible) or managed directly by the software, without offering much fine-grained user control. This thesis argues that efficient methods can support designers seeking complex behaviours by increasing their level of control over these behaviours. To demonstrate this, I study two types of artistic devices that are particularly challenging to design: drawing machines, and chain reaction contraptions. These artefacts' complex behaviour can change dramatically even when their components are moved by a small amount. The first case study aims to facilitate the exploration and progressive refinement of complex patterns generated by drawing machines under drawing-level user-defined constraints. The approach was evaluated with a user study, and several machines drawing the expected pattern were fabricated. In the second case study, I propose an algorithm to optimise the layout of complex chain reaction contraptions described by a causal graph of events in order to make them robust to uncertainty. Several machines optimised with this method were successfully assembled and run. This thesis makes the following contributions: (1) supporting complex behaviour specifications; (2) enabling users to easily explore design variations that respect these specifications; and (3) optimising the layout of a physical assembly to maximise the probability of real-life success.
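
    The chain reaction case study optimises a layout so that the sequence of events remains likely to succeed under placement uncertainty. The sketch below illustrates that general idea on a toy one-dimensional chain (a row of dominoes), not the thesis's causal-graph formulation: success probability is estimated by sampling perturbed layouts, and the spacing with the highest estimate is selected. The trigger rule and all numbers are invented.

```python
# Toy sketch: choose a domino spacing that maximises the probability the whole
# chain topples when each placement is perturbed by Gaussian noise.
import numpy as np

rng = np.random.default_rng(1)

def success_probability(spacing, n_dominoes=10, sigma=0.5, samples=2000):
    """Estimate P(every domino reaches the next one) under placement noise."""
    # A falling domino of height 10 reaches the next one if the perturbed gap
    # stays within (0, 10): too far and it misses, overlap and it jams.
    gaps = spacing + rng.normal(0.0, sigma, size=(samples, n_dominoes - 1))
    ok = (gaps > 0.0) & (gaps < 10.0)
    return ok.all(axis=1).mean()

candidates = np.linspace(1.0, 9.5, 18)
probs = [success_probability(s) for s in candidates]
best = candidates[int(np.argmax(probs))]
print(f"best spacing ~ {best:.2f}, estimated success ~ {max(probs):.3f}")
```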

    Approach for Visualization of Uncertainty in CAD-Systems Based on Ontologies


    A novel haptic model and environment for maxillofacial surgical operation planning and manipulation

    This paper presents a practical method and a new haptic model to support manipulations of bones and their segments during the planning of a surgical operation in a virtual environment using a haptic interface. To perform an effective dental surgery, it is important to have all the operation-related information of the patient available beforehand in order to plan the operation and avoid any complications. A haptic interface with a virtual and accurate patient model to support the planning of bone cuts is therefore critical, useful and necessary for the surgeons. The proposed system uses DICOM images taken from a digital tomography scanner and creates a mesh model of the filtered skull, from which the jaw bone can be isolated for further use. A novel solution for cutting the bones has been developed: it uses the haptic tool to determine and define the bone-cutting plane, and this new approach creates three new meshes from the original model. Using this approach, the computational power is optimized and real-time feedback can be achieved during all bone manipulations. During the mesh cutting movement, a novel friction profile is predefined in the haptic system to simulate the force feedback feel of different densities in the bone.
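
    The cutting method above defines a bone-cutting plane from the haptic tool and splits the mesh accordingly. A minimal sketch of the underlying geometric step, classifying triangles by which side of a cutting plane they lie on, is given below under simplifying assumptions; the paper's method goes further, producing three new meshes and applying a predefined friction profile for force feedback, neither of which is shown here.

```python
# Sketch: classify mesh triangles relative to a cutting plane defined by a
# point and normal (e.g. taken from the haptic tool pose). Triangles that
# straddle the plane are collected separately; a real method would re-mesh
# them to produce watertight cut surfaces. All data here are placeholders.
import numpy as np

def split_by_plane(vertices, faces, plane_point, plane_normal):
    n = plane_normal / np.linalg.norm(plane_normal)
    signed = (vertices - plane_point) @ n           # signed distance per vertex
    side = signed[faces]                            # (num_faces, 3)
    above = (side > 0).all(axis=1)
    below = (side < 0).all(axis=1)
    straddle = ~(above | below)
    return faces[above], faces[below], faces[straddle]

# Tiny placeholder "bone" mesh: two triangles forming a quad.
verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=float)
faces = np.array([[0, 1, 2], [0, 2, 3]])
up, down, cut = split_by_plane(verts, faces,
                               plane_point=np.array([0.5, 0.5, 0.0]),
                               plane_normal=np.array([1.0, 0.0, 0.0]))
print(len(up), len(down), len(cut))
```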

    A survey on 3D CAD model quality assurance and testing

    A new taxonomy of issues related to CAD model quality is presented, which distinguishes between explicit and procedural models. For each type of model, morphologic, syntactic, and semantic errors are characterized. The taxonomy was validated successfully when used to classify quality testing tools, which are aimed at detecting and repairing data errors that may affect the simplification, interoperability, and reusability of CAD models. The study shows that low semantic level errors that hamper simplification are reasonably covered in explicit representations, although many CAD quality testers are still unaffordable for Small and Medium Enterprises, both in terms of cost and training time. Interoperability has been reasonably solved by standards like STEP AP203 and AP214, but model reusability is not feasible in explicit representations. Procedural representations are promising, as interactive modeling editors automatically prevent most morphologic errors derived from unsuitable modeling strategies. Interoperability problems between procedural representations are expected to decrease dramatically with STEP AP242. Higher semantic aspects of quality, such as assurance of design intent, however, are hardly supported by current CAD quality testers. (C) 2016 Elsevier Ltd. All rights reserved. This work was supported by the Spanish Ministry of Economy and Competitiveness and the European Regional Development Fund, through the ANNOTA project (Ref. TIN2013-46036-C3-1-R).
    González-Lluch, C.; Company, P.; Contero, M.; Camba, J.; Plumed, R. (2017). A survey on 3D CAD model quality assurance and testing. Computer-Aided Design, 83, 64-79. https://doi.org/10.1016/j.cad.2016.10.003
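
    The quality testers surveyed above detect and repair data errors in CAD models. As a purely hypothetical illustration of what a low-level (syntactic) check on a procedural model might look like, the sketch below scans a feature list for dangling parent references; the feature records are invented and real checkers cover a far wider range of error types.

```python
# Minimal sketch of a syntactic-level quality check on a procedural CAD model:
# flag features whose parent reference points at a feature that does not exist.
# Feature records are invented placeholders.
def dangling_references(features):
    names = {f["name"] for f in features}
    return [(f["name"], p) for f in features
            for p in f.get("parents", []) if p not in names]

model = [
    {"name": "base_extrude", "parents": []},
    {"name": "mount_hole",   "parents": ["base_extrude"]},
    {"name": "edge_fillet",  "parents": ["chamfer_1"]},   # chamfer_1 was deleted
]
print(dangling_references(model))   # -> [('edge_fillet', 'chamfer_1')]
```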

    Construction safety and digital design: a review

    As digital technologies become widely used in designing buildings and infrastructure, questions arise about their impacts on construction safety. This review explores relationships between construction safety and digital design practices with the aim of fostering and directing further research. It surveys state-of-the-art research on databases, virtual reality, geographic information systems, 4D CAD, building information modeling and sensing technologies, finding various digital tools for addressing safety issues in the construction phase, but few tools to support design for construction safety. It also considers a literature on safety-critical, digital and design practices that raises a general concern about 'mindlessness' in the use of technologies, and has implications for the emerging research agenda around construction safety and digital design. Bringing these strands of literature together suggests new kinds of interventions, such as the development of tools and processes for using digital models to promote mindfulness through multi-party collaboration on safety.

    Parametric CAD modeling: An analysis of strategies for design reusability

    CAD model quality in parametric design scenarios largely determines the level of flexibility and adaptability of a 3D model (how easy it is to alter the geometry) as well as its reusability (the ability to use existing geometry in other contexts and applications). In the context of mechanical CAD systems, the nature of the feature-based parametric modeling paradigm, which is based on parent-child interdependencies between features, allows a wide selection of approaches for creating a specific model. Despite the virtually unlimited range of possible strategies for modeling a part, only a small number of them can guarantee an appropriate internal structure which results in a truly reusable CAD model. In this paper, we present an analysis of formal CAD modeling strategies and best practices for history-based parametric design: Delphi's horizontal modeling, explicit reference modeling, and resilient modeling. Aspects considered in our study include the rationale to avoid the creation of unnecessary feature interdependencies, the sequence and selection criteria for those features, and the effects of parent/child relations on model alteration. We provide a comparative evaluation of these strategies in the form of a series of experiments using three industrial CAD models with different levels of complexity. We analyze the internal structure of the models and compare their robustness and flexibility when the geometry is modified. The results reveal significant advantages of formal modeling methodologies, particularly resilient techniques, over non-structured approaches, as well as unexpected problems of the horizontal strategy in numerous modeling situations. (C) 2016 Elsevier Ltd. All rights reserved.
    Camba, J.D.; Contero, M.; Company, P. (2016). Parametric CAD modeling: An analysis of strategies for design reusability. Computer-Aided Design, 74, 18-31. https://doi.org/10.1016/j.cad.2016.01.003
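
    The comparison above turns on the parent/child dependency structure that each modeling strategy produces. As a toy illustration of why that structure matters, the sketch below builds two hypothetical feature dependency graphs for the same part, one deeply chained and one resilient-style with a few stable parents, and reports the maximum dependency depth, a rough proxy for how far an edit can propagate on regeneration. The feature names and graph shapes are invented and are not taken from the paper's test models.

```python
# Toy sketch: compare parent/child dependency depth of two hypothetical
# modeling strategies for the same part. Deeper chains mean an edit to an
# early feature can invalidate more downstream features on regeneration.
from collections import defaultdict

def max_depth(children, roots):
    """Longest parent-to-child chain in a feature dependency graph."""
    def depth(f):
        return 1 + max((depth(c) for c in children[f]), default=0)
    return max(depth(r) for r in roots)

# Strategy A: each feature references the one created just before it.
chained = defaultdict(list, {
    "base": ["boss"], "boss": ["hole"], "hole": ["pattern"], "pattern": ["fillet"],
})

# Strategy B (resilient-style): most features reference stable datum geometry.
resilient = defaultdict(list, {
    "datums": ["base", "boss", "hole", "pattern"], "pattern": ["fillet"],
})

print("chained depth:  ", max_depth(chained, ["base"]))      # -> 5
print("resilient depth:", max_depth(resilient, ["datums"]))  # -> 3
```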