
    A survey on 3D CAD model quality assurance and testing

    A new taxonomy of issues related to CAD model quality is presented, which distinguishes between explicit and procedural models. For each type of model, morphologic, syntactic, and semantic errors are characterized. The taxonomy was validated successfully when used to classify quality testing tools, which are aimed at detecting and repairing data errors that may affect the simplification, interoperability, and reusability of CAD models. The study shows that low semantic level errors that hamper simplification are reasonably covered in explicit representations, although many CAD quality testers are still unaffordable for Small and Medium Enterprises, both in terms of cost and training time. Interoperability has been reasonably solved by standards like STEP AP203 and AP214, but model reusability is not feasible in explicit representations. Procedural representations are promising, as interactive modeling editors automatically prevent most morphologic errors derived from unsuitable modeling strategies. Interoperability problems between procedural representations are expected to decrease dramatically with STEP AP242. Higher semantic aspects of quality, such as assurance of design intent, are however hardly supported by current CAD quality testers.

    This work was supported by the Spanish Ministry of Economy and Competitiveness and the European Regional Development Fund, through the ANNOTA project (Ref. TIN2013-46036-C3-1-R).

    González-Lluch, C.; Company, P.; Contero, M.; Camba, J.; Plumed, R. (2017). A survey on 3D CAD model quality assurance and testing. Computer-Aided Design, 83:64-79. https://doi.org/10.1016/j.cad.2016.10.003
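    As a rough illustration of how such a taxonomy might be encoded inside a quality-testing tool, the sketch below models the two representation types and three error classes described in the abstract. All names and the reuse heuristic are hypothetical, not taken from the paper:

        from dataclasses import dataclass
        from enum import Enum, auto

        class Representation(Enum):
            EXPLICIT = auto()    # B-rep / mesh data
            PROCEDURAL = auto()  # feature- / history-based model

        class ErrorClass(Enum):
            MORPHOLOGIC = auto()  # bad shape data: gaps, slivers, self-intersections
            SYNTACTIC = auto()    # malformed model structure or file encoding
            SEMANTIC = auto()     # lost meaning: design intent, annotations

        @dataclass
        class QualityIssue:
            representation: Representation
            error_class: ErrorClass
            description: str

        def affects_reuse(issue: QualityIssue) -> bool:
            # Per the survey, semantic errors are the main barrier to reusability.
            return issue.error_class is ErrorClass.SEMANTIC

        issue = QualityIssue(Representation.EXPLICIT, ErrorClass.SEMANTIC,
                             "design intent lost on STEP export")
        print(affects_reuse(issue))  # True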

    PHYSICS-AWARE MODEL SIMPLIFICATION FOR INTERACTIVE VIRTUAL ENVIRONMENTS

    Rigid body simulation is an integral part of Virtual Environments (VEs) for autonomous planning, training, and design tasks. The underlying physics-based simulation of a VE must be accurate and computationally fast enough for the intended application, which are unfortunately conflicting requirements. Two ways to perform fast and high-fidelity physics-based simulation are: (1) model simplification, and (2) parallel computation. Model simplification can be used to allow simulation at an interactive rate while introducing an acceptable level of error. Currently, manual model simplification is the most common way of achieving simulation speedup, but it is time consuming. Hence, in order to reduce the development time of VEs, automated model simplification is needed. The dissertation presents an automated model simplification approach based on geometric reasoning, spatial decomposition, and temporal coherence. Geometric reasoning is used to develop an accessibility-based algorithm for removing portions of geometric models that play no role in rigid body to rigid body interaction simulation. Removing such inaccessible portions of the interacting rigid body models has no influence on simulation accuracy but reduces computation time significantly. Spatial decomposition is used to develop a clustering algorithm that reduces the number of fluid pressure computations, resulting in significant speedup of rigid body and fluid interaction simulation. The temporal coherence algorithm reuses computed force values from rigid body to fluid interaction based on the coherence of the fluid surrounding the rigid body, as sketched below. The simulations are further sped up by performing the computation on the graphics processing unit (GPU). The dissertation also presents the issues pertaining to the development of parallel algorithms for rigid body simulation on both multi-core processors and GPUs. The developed algorithms have enabled real-time, high-fidelity, six degrees of freedom, time-domain simulation of unmanned sea surface vehicles (USSVs), and can be used for autonomous motion planning, tele-operation, and learning from demonstration applications.
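    A minimal sketch of the temporal coherence idea: cache the last computed fluid force per body and reuse it while the surrounding fluid has changed little. The state fields, tolerance, and toy pressure model are illustrative assumptions, not the dissertation's actual formulation:

        from dataclasses import dataclass

        @dataclass
        class FluidState:
            height: float    # e.g. local wave height around the hull
            velocity: float  # e.g. local flow speed

        def coherent(a: FluidState, b: FluidState, tol: float = 0.05) -> bool:
            # Fluid is "coherent" if it changed little since the last solve.
            return (abs(a.height - b.height) < tol
                    and abs(a.velocity - b.velocity) < tol)

        def expensive_pressure_integral(state: FluidState) -> float:
            # Placeholder for the per-panel pressure integration over the hull.
            return 1025.0 * 9.81 * state.height  # hydrostatic-only toy model

        _cache = {}  # body id -> (FluidState, force)

        def fluid_force(body_id: int, state: FluidState) -> float:
            cached = _cache.get(body_id)
            if cached and coherent(cached[0], state):
                return cached[1]                        # reuse: skip the solve
            force = expensive_pressure_integral(state)  # full recompute
            _cache[body_id] = (state, force)
            return force

        print(fluid_force(1, FluidState(0.50, 2.0)))  # full solve
        print(fluid_force(1, FluidState(0.51, 2.0)))  # within tolerance: reused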

    On systematic approaches for interpreted information transfer of inspection data from bridge models to structural analysis

    In conjunction with improved methods of monitoring damage and degradation processes, interest in the reliability assessment of reinforced concrete bridges has been increasing in recent years. Automated image-based inspections of the structural surface provide valuable data from which to extract quantitative information about deteriorations, such as crack patterns. However, the knowledge gain results from processing this information in a structural context, i.e. relating the damage artifacts to building components; this way, transformation to structural analysis is enabled. This approach sets two further requirements: availability of structural bridge information and standardized storage for interoperability with subsequent analysis tools. Since the large datasets involved can only be processed efficiently in an automated manner, the implementation of the complete workflow from damage and building data to structural analysis is targeted in this work. First, domain concepts are derived from the back-end tasks: structural analysis, damage modeling, and life-cycle assessment. The common interoperability format, the Industry Foundation Classes (IFC), and the processes in these domains are further assessed. The need for user-controlled interpretation steps is identified, and the developed prototype thus allows interaction at subsequent model stages. The latter has the advantage that interpretation steps can be separated individually into a structural analysis model, a damage information model, or a combination of both. This approach to damage information processing from the perspective of structural analysis is then validated in different case studies.
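    A rough sketch of the kind of interpreted information transfer described here: image-derived crack artifacts are linked to building components via their IFC identifiers so that later analysis tools can consume them. The types, fields, and the pre-computed localization step are illustrative assumptions, not the paper's schema:

        from dataclasses import dataclass, field

        @dataclass
        class CrackArtifact:
            width_mm: float
            length_mm: float
            polyline: list          # image-space coordinates of the crack trace

        @dataclass
        class BridgeComponent:
            ifc_global_id: str      # GlobalId of the corresponding IfcElement
            name: str
            cracks: list = field(default_factory=list)

        def assign(artifact: CrackArtifact, components: dict, located_id: str):
            """Attach a crack to the component it was located on; a prior
            image-to-component localization step is assumed to yield located_id."""
            components[located_id].cracks.append(artifact)

        deck = BridgeComponent("3vB2YO$MX4xv5uCqZZG05x", "Deck slab")
        comps = {deck.ifc_global_id: deck}
        assign(CrackArtifact(0.3, 420.0, [(0, 0), (10, 42)]), comps,
               deck.ifc_global_id)
        print(len(deck.cracks))  # 1 crack now linked to the deck for analysis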

    Management of spatial data for visualization on mobile devices

    Vector-based mapping is emerging as a preferred format in Location-based Services (LBS), because it can deliver an up-to-date and interactive map visualization. The Progressive Transmission (PT) technique has been developed to enable the efficient transmission of vector data over the internet by delivering incremental levels of detail (LoD). However, it is still challenging to apply this technique in a mobile context due to the inherent limitations of mobile devices, such as small screen size, slow processors, and limited memory. Taking account of these limitations, PT has been extended by developing a framework of efficient data management for the visualization of spatial data on mobile devices. A data generalization framework is proposed and implemented in a software application. This application can significantly reduce the volume of data for transmission and enable quick access to a simplified version of the data while preserving appropriate visualization quality. Using volunteered geographic information as a case study, the framework shows flexibility in delivering up-to-date spatial information from dynamic data sources. Three models of PT are designed and implemented to transmit the additional LoD refinements: a full-scale PT as an inverse of generalisation, a view-dependent PT, and a heuristically optimised view-dependent PT; a sketch of the first follows below. These models are evaluated with user trials and application examples. The heuristically optimised view-dependent PT has shown a significant enhancement over the traditional PT in terms of bandwidth saving and smoothness of transitions. A parallel data management strategy, with three corresponding algorithms, has been developed to handle LoD spatial data on mobile clients. This strategy enables map rendering to be performed in parallel with a process that retrieves the data for the next map location the user will require. A view-dependent approach has been integrated to monitor the volume of each LoD for the visible area. The demonstration of a flexible rendering style shows its potential use in visualizing dynamic geoprocessed data. Future work may extend this to integrate topological constraints and semantic constraints for enhancing the vector map visualization.
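    A toy sketch of PT as the inverse of line generalisation: the server sends a coarse base polyline first, then LoD refinement batches that reinsert vertices in order of importance. The importance measure here (unnormalized offset from the endpoint chord) is a stand-in for whatever generalisation operator, e.g. Douglas-Peucker, is actually used:

        import math

        def build_lods(line, n_levels):
            """Split a polyline into a coarse base plus ordered LoD batches."""
            (x0, y0), (x1, y1) = line[0], line[-1]

            def offset(p):  # unnormalized distance from p to the endpoint chord
                x, y = p
                return abs((y1 - y0) * x - (x1 - x0) * y + x1 * y0 - y1 * x0)

            interior = sorted(line[1:-1], key=offset, reverse=True)
            base = [line[0], line[-1]]
            size = max(1, math.ceil(len(interior) / n_levels))
            refinements = [interior[i:i + size]
                           for i in range(0, len(interior), size)]
            return base, refinements

        # Client side: apply refinement batches progressively as they arrive.
        base, refs = build_lods([(0, 0), (1, 3), (2, 1), (3, 4), (4, 0)], 2)
        current = list(base)
        for batch in refs:
            current.extend(batch)  # a real client also re-inserts in sequence order
            print(len(current), "vertices drawn")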

    Toolpath verification using set-theoretic solid modelling


    Feature-based validation reasoning for intent-driven engineering design

    Feature-based modelling represents the future of CAD systems. However, operations such as modelling and editing can corrupt the validity of a feature-based model representation. Feature interactions are a consequence of feature operations and of the coexistence of a number of features in the same model. Feature interaction affects not only the solid representation of the part, but also the functional intentions embedded within features. A technique is thus required to assess the integrity of a feature-based model from various perspectives, including the functional-intention one, and this technique must take into account the problems brought about by feature interactions and operations. The understanding, reasoning about, and resolution of invalid feature-based models require an understanding of the feature interaction phenomena, as well as the characterisation of these functional intentions. A system capable of such assessment is called a feature-based representation validation system. This research studies feature interaction phenomena and feature-based designers' intents as a medium for achieving a feature-based representation validation system. [Continues.]
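    One way to picture such a validation check, with entirely hypothetical structures: each feature records a functional intent on the faces it creates, and a validator flags interactions where a later feature consumes geometry that another feature's intent depends on:

        from dataclasses import dataclass, field

        @dataclass
        class Feature:
            name: str
            creates: set                                # face ids this feature creates
            removes: set = field(default_factory=set)   # face ids it consumes
            intent: str = ""                            # intent attached to created faces

        def validate(history):
            """Flag features whose intent-bearing faces are removed later on."""
            issues = []
            for i, f in enumerate(history):
                consumed = set().union(*(g.removes for g in history[i + 1:]))
                lost = f.creates & consumed
                if lost and f.intent:
                    issues.append(
                        f"'{f.name}' intent '{f.intent}' lost on faces {lost}")
            return issues

        hole = Feature("mount_hole", creates={"f7"}, intent="fastener clearance")
        cut = Feature("side_cut", creates={"f9"}, removes={"f7"})
        print(validate([hole, cut]))  # reports the clearance intent destroyed by the cut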

    A new modelling framework for statistical cumulus dynamics

    We propose a new modelling framework suitable for the description of atmospheric convective systems as a collection of distinct plumes. The literature contains many examples of models for collections of plumes in which strong simplifying assumptions are made, with a diagnostic dependence of convection on the large-scale environment and the limit of many plumes often being imposed from the outset. Some recent studies have sought to remove one or the other of those assumptions. The proposed framework removes both, and is explicitly time-dependent and stochastic in its basic character. The statistical dynamics of the plume collection are defined through simple probabilistic rules applied at the level of individual plumes, and van Kampen's system size expansion (sketched below) is then used to construct the macroscopic limit of the microscopic model. Through suitable choices of the microscopic rules, the model is shown to encompass previous studies in the appropriate limits, and to allow their natural extensions beyond those limits.
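    For orientation, here is the system size expansion in its standard textbook form, applied to a generic one-step birth-death process for a plume count n; the paper's microscopic rules are richer than this. The master equation with birth and death rates T^{\pm}(n) is

        \frac{\partial P(n,t)}{\partial t}
          = T^{+}(n-1)\,P(n-1,t) + T^{-}(n+1)\,P(n+1,t)
            - \bigl[T^{+}(n) + T^{-}(n)\bigr] P(n,t).

    van Kampen's ansatz splits n into a macroscopic part and fluctuations of relative size \Omega^{-1/2}, where \Omega is the system size:

        n = \Omega\,\phi(t) + \Omega^{1/2}\,\xi,
        \qquad T^{\pm}(n) = \Omega\, t^{\pm}(n/\Omega).

    Expanding in powers of \Omega^{-1/2}, the leading order recovers the deterministic macroscopic limit,

        \frac{d\phi}{dt} = t^{+}(\phi) - t^{-}(\phi),

    and the next order yields a linear Fokker-Planck equation for the distribution \Pi(\xi,t) of the fluctuations about it:

        \frac{\partial \Pi}{\partial t}
          = -\bigl[t^{+\prime}(\phi) - t^{-\prime}(\phi)\bigr]
              \frac{\partial (\xi\,\Pi)}{\partial \xi}
            + \tfrac{1}{2}\bigl[t^{+}(\phi) + t^{-}(\phi)\bigr]
              \frac{\partial^{2} \Pi}{\partial \xi^{2}}.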