9 research outputs found

    Towards the automation of product geometric verification: An overview

    The paper provides an overview of the current level of automation of the geometric verification process, with reference to aspects that are crucial for achieving greater efficiency, accuracy, and repeatability of inspection. Although this process is still far from fully automatic, several research efforts have been made in recent years to support and speed up geometric error evaluation and to make it less labor-intensive. In particular, the paper surveys: (1) specification models developed for an integrated approach to tolerancing; (2) the state of the art of Computer-Aided Inspection Planning (CAIP); (3) recent research efforts to limit or eliminate human involvement in the data processing used for geometric error evaluation. Possible future directions for research on the automation of the geometric verification process are also described.

    Feature Cluster Algebra and Its Application for Geometric Tolerancing

    The goal of this research project is to develop a DOF (degree of freedom) algebra for entity clusters to support tolerance specification, validation, and tolerance automation. This representation is required to capture the relations between geometric entities, metric constraints, and tolerance specifications. The project is part of an ongoing effort to create a bi-level model of GD&T (Geometric Dimensioning and Tolerancing). This thesis presents the systematic derivation of the degrees of freedom of entity clusters corresponding to tolerance classes; the clusters can be datum reference frames (DRFs) or targets. A binary vector representation of degrees of freedom and operations for combining them are proposed, and an algebraic method is developed from this representation. The ASME Y14.5.1 companion to the GD&T standard gives an exhaustive tabulation of active and invariant degrees of freedom (DOF) for datum reference frames (DRFs); the algebra is validated by checking it against all cases in the Y14.5.1 tabulation. The algebra allows the derivation of general rules for tolerance specification and validation. A computer tool is implemented to support GD&T specification and validation; it outputs the geometric and tolerance information as a CTF (Constraint-Tolerance-Feature) file, which can be used for tolerance stack analysis. (M.S. thesis, Mechanical Engineering)
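The binary DOF vector and the combination operation described above can be sketched as follows. This is an illustrative guess at the flavor of the algebra, not the thesis's actual definition: the entry order (Tx, Ty, Tz, Rx, Ry, Rz), the specific feature encodings, and the use of bitwise AND for cluster combination are all assumptions.

```python
# Hypothetical 6-entry binary DOF vectors: (Tx, Ty, Tz, Rx, Ry, Rz).
# A 1 marks a degree of freedom under which the feature is invariant.
plane_xy = (1, 1, 0, 0, 0, 1)  # plane with normal z: invariant under Tx, Ty, Rz
axis_z   = (0, 0, 1, 0, 0, 1)  # line along z: invariant under Tz, Rz

def combine(a, b):
    """A cluster is invariant only under DOFs invariant for every member,
    so combining two feature vectors is an element-wise AND."""
    return tuple(x & y for x, y in zip(a, b))

cluster = combine(plane_xy, axis_z)  # plane + perpendicular axis DRF
# only the rotation about the axis survives as an invariant DOF
```

Combining a plane with a perpendicular axis leaves only Rz invariant, which matches the familiar plane-plus-axis datum reference frame that constrains everything except rotation about the axis.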

    Optimization of the metrological inspection process for mechanical parts within a 3D tolerance analysis

    In the design and manufacture of industrial products, the main stages are design, manufacturing, and inspection. The designer chooses functional tolerance values and specifications for a part in close connection with the manufacturing processes. The inspection of parts is then the essential stage that validates the manufactured product against the designer's requirements. This research aims to optimize the part-inspection process. The optimization must be carried out at several levels. - At the level of measurement costs: when an operator defines an inspection plan, he validates the entire part, where possible, on a single measuring device, even if that device is not the most appropriate. This part of the study concerns the capability of measuring devices, both hardware and software, and the associated calculation methods; it defines the limits of each machine. Analyzing the match between the tolerance specifications and the available measuring equipment also allows the inspector to define an economical measurement strategy (time, equipment, ...). - At the level of technical innovations to be taken into account: a finite-element package, Abaqus, is used to model the deformation of parts as they are clamped in the inspection fixture, together with a tolerance-analysis package, 3DCS, to validate the effect of measurement-device dispersions on the validity of the measurement result (measurement uncertainty).
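The capability analysis mentioned above weighs a gauge's own dispersion against the tolerance it must verify. A minimal sketch using one common (but not universal) convention for a gauge capability index Cg; the 20 % tolerance share and the 4-sigma spread factor are assumptions, as the exact factors vary between company standards:

```python
import statistics

def gauge_capability(readings, tolerance, share=0.2, spread=4.0):
    """Cg compares the fraction of the tolerance band granted to the gauge
    (share * tolerance) with the gauge's own dispersion (spread * s),
    estimated from repeated readings of one reference part.
    share=0.2 and spread=4.0 are one common convention, not a universal rule."""
    s = statistics.stdev(readings)
    return (share * tolerance) / (spread * s)

# Repeated readings of a reference part on one device, tolerance 0.1 mm:
cg = gauge_capability([10.00, 10.01, 9.99, 10.00, 10.01, 9.99], tolerance=0.1)
```

A Cg below the acceptance threshold (often 1.33) would flag the device as unsuitable for that tolerance, which is exactly the kind of adequacy check between tolerances and measuring equipment the abstract describes.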

    Regeneration of NC code using 3D identification and analysis of geometric deviations

    The production of parts on numerically controlled machine tools is one of the most widely used techniques in modern production processes. Even with precise cold and hot forming or casting of parts, much of the finish machining is performed by cutting on machine tools, and the tools for these machines are themselves produced with these technologies. CAD/CAM systems can generate NC programs from part geometry, but they do not help programmers select appropriate cutting strategies, tool geometry, or other process parameters. Commercial CAM software develops CNC programs mostly from a mathematical point of view: it commonly applies constant cutting parameters to a given tool path, does not simulate the expected quality and accuracy of the machined surfaces, and does not take into account mechanical aspects of the milling process such as cutting forces and cutting-tool deflection. Modern measurement equipment is increasingly present in the manufacturing industry: coordinate measuring machines have become a de facto standard, and production and documentation are largely adapted to them. At the same time, non-contact (optical and laser) measuring systems are being developed even more intensively. These systems produce a point cloud as the measurement result, which must be processed and converted into surfaces for the purposes of measurement and inspection.
    The dissertation reviews modern measuring equipment and presents surface-generation and deviation-measurement procedures using software such as CATIA, GOM Inspect, PC-DMIS, etc. The experimental work comprises a very large number of experiments. The first experiment analyzed the influence of cutting width on the accuracy of the machined part. The next experiment covered the influence of cutting speed and feed motion on the roughness of the machined part, as well as on dimensional and geometric deviations: flatness, perpendicularity, and parallelism. Subsequent experiments investigated the influence of cutter setup errors (parallel axis offset and axis inclination) and of the radial run-out of the cutter teeth on surface topography. The final experiments determined the influence of changes in cutting depth, i.e., the cutter path, on the resulting dimensional and geometric measures.
    The experiments showed a correlative dependence between the machining parameters and the resulting quality parameters of the machined part. Deviations of the machined parts were identified and measured using a coordinate measuring machine and various optical measuring systems (ATOS) with the appropriate measurement software, and the results obtained from the different measuring devices were compared. An algorithm was developed for regenerating an existing NC code in order to correct machined-part deviations caused by machining errors. The algorithm assumes the existence of a database populated with experimental results, data from machining processes, and results of modeling and simulation of the dependence of machining accuracy on machining parameters. By regenerating the NC code, machining errors are corrected, i.e., their causes are preempted.
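The NC-code regeneration idea can be illustrated with a toy correction pass. The regex-based Z shift below is a hypothetical simplification: the dissertation's algorithm draws per-feature corrections from an experimental database, whereas this sketch applies a single global offset taken from one measured deviation.

```python
import re

def correct_z(nc_lines, z_error):
    """Shift every Z coordinate in a G-code program by -z_error so the next
    part lands on the nominal surface. Illustrative only: a real NC
    regeneration would apply region-specific corrections, not one offset."""
    def shift(match):
        return "Z%.3f" % (float(match.group(1)) - z_error)
    return [re.sub(r"Z(-?\d+\.?\d*)", shift, line) for line in nc_lines]

# A CMM scan found the machined floor sits 0.015 mm above nominal,
# so the regenerated code sends the tool 0.015 mm deeper:
corrected = correct_z(["G01 X10.0 Y5.0 Z-2.000 F200"], z_error=0.015)
# corrected[0] == "G01 X10.0 Y5.0 Z-2.015 F200"
```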

    Tolerance analysis and synthesis of assemblies subject to loading with process integration and design optimization tools

    Manufacturing variation results in uncertainty in the functionality and performance of mechanical assemblies. Managing this uncertainty is of paramount importance for manufacturing efficiency. Methods for managing uncertainty and variation in the design of mechanical assemblies, such as tolerance analysis and synthesis, have been the subject of extensive research and development. However, due to the challenges involved, these methods remain limited in capability. The limitations are associated with the following problems:
    - Identification of Key Product Characteristics (KPCs) in mechanical assemblies (required for measuring functional performance) without imposing significant modelling demands.
    - Accommodating the high computational cost of traditional statistical tolerance analysis in early design, where analysis budgets are limited.
    - Efficient identification of feasible regions and optimum performance within the large design spaces associated with early design stages.
    - Comprehensively accommodating tolerance analysis problems in which assembly functionality depends on the effects of loading (such as compliance or multi-body dynamics). Current Computer Aided Tolerancing (CAT) is limited by: the ability to accommodate only specific loading effects; reliance on custom simulation codes with limited practical implementation in accessible software tools; and the need for additional expertise in formulating specific assembly tolerance models and interpreting results.
    - Accommodating the often impractically high computational cost of tolerance synthesis involving demanding assembly models (particularly assemblies under loading); this cost is associated with traditional statistical tolerancing Uncertainty Quantification (UQ) methods that rely on low-efficiency Monte Carlo (MC) sampling.
    This research addresses these limitations by developing novel methods for enhancing the engineering design of mechanical assemblies involving uncertainty or variation in design parameters. This is achieved by utilising the emerging design analysis and refinement capabilities of Process Integration and Design Optimization (PIDO) tools. The main contributions fall into three themes: design analysis and refinement accommodating uncertainty in early design; tolerancing of assemblies subject to loading; and efficient Uncertainty Quantification (UQ) in tolerance analysis and synthesis. The contributions within each theme are outlined below. Design analysis and refinement accommodating uncertainty in early design:
    - A PIDO-based visualization method to aid designers in identifying assembly KPCs in early design stages. The method integrates CAD software functionality with the process integration, UQ, data logging and statistical analysis capabilities of PIDO tools to simulate manufacturing variation in an assembly and visualise assembly clearances, contacts or interferences. This visualization assists the designer in specifying critical assembly dimensions as KPCs.
    - A computationally efficient method for manufacturing sensitivity analysis of assemblies with linear-compliant elements. Reductions in computational cost are achieved by utilising linear-compliant assembly stiffness measures, reuse of CAD models created in early design stages, and PIDO-based tolerance analysis. The resulting increase in computational efficiency allows an estimate of sensitivity to manufacturing variation to be made earlier in the design process with low effort.
    - Refinement of concept design embodiments through PIDO-based DOE analysis and optimization.
    PIDO tools are utilised to integrate CAE tools and efficiently reuse models created in early design stages, rapidly identifying feasible and optimal regions in the design space. A case study on the conceptual design of automotive seat kinematics is presented, in which an optimal design is identified and subsequently selected for commercialisation in the Tesla Motors Model S full-sized electric sedan. These contributions can be directly applied to improve the design of mechanical assemblies involving uncertainty or variation in design parameters in the early stages of design. The use of native CAD/E models developed as part of an established design modelling procedure imposes little additional modelling effort. Tolerancing of assemblies subject to loading:
    - A novel tolerance analysis platform is developed that integrates CAD/E and statistical analysis tools using PIDO tool capabilities to facilitate tolerance analysis of assemblies subject to loading. The platform extends the capabilities of traditional CAT tools and methods by enabling tolerance analysis of assemblies that depend on the effects of loads, allowing an increased level of capability in estimating the effects of variation on functionality.
    - The interdisciplinary integration capabilities of the PIDO-based platform allow CAD/E models created as part of the standard design process to be used for tolerance analysis, reducing the need for additional modelling tools and expertise.
    - Application of the developed platform produced effective solutions to practical, industry-based tolerance analysis problems, including: an automotive actuator mechanism assembly consisting of rigid and compliant components subject to external forces; and a rotary switch and spring-loaded radial detent assembly in which functionality is defined by external forces and internal multi-body dynamics.
    In both case studies the tolerance analysis platform was applied to specify nominal dimensions and the tolerances required to achieve the desired assembly yield. The computational platform offers an accessible tolerance analysis approach for accommodating assemblies subject to loading with low implementation demands. Efficient Uncertainty Quantification (UQ) in tolerance analysis and synthesis:
    - A novel approach is developed to address the high computational cost of Monte Carlo (MC) sampling in statistical tolerance analysis and synthesis, using Polynomial Chaos Expansion (PCE) uncertainty quantification. Compared to MC sampling, PCE offers significantly higher efficiency. The feasibility of PCE-based UQ in tolerance synthesis is established through: theoretical analysis of the PCE method identifying working principles, implementation requirements, advantages and limitations; identification of a preferred method for determining PCE expansion coefficients in tolerance analysis; and formulation of an approach for validating PCE statistical moment estimates.
    - PCE-based UQ is subsequently implemented in a PIDO-based tolerance synthesis platform for assemblies subject to loading. The platform integrates highly efficient sparse-grid-based PCE UQ, parametric CAD/E models accommodating the effects of loading, cost-tolerance modelling, yield quantification with Process Capability Indices (PCI), and optimization of tolerance cost and yield with a multiobjective Genetic Algorithm (GA).
    - To demonstrate the capabilities of the developed platform, two industry-based case studies are used for validation: an automotive seat rail assembly consisting of compliant components subject to loading; and an automotive switch assembly in which functionality is defined by external forces and multi-body dynamics. In both case studies optimal tolerances were identified that satisfied the desired yield and tolerance-cost objectives.
    The addition of PCE to the tolerance synthesis platform resulted in large reductions in computational cost, without compromising accuracy, compared to traditional MC sampling, whose required computational expense is impractically high. The resulting tolerance synthesis platform can be applied to tolerance analysis and synthesis with significantly reduced computation time while maintaining accuracy.
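For contrast with the PCE approach, the Monte Carlo baseline it replaces can be sketched for a trivial one-dimensional stack. The gap model, the 3-sigma interpretation of tolerances, and all numeric values are illustrative assumptions, not the thesis's assembly models:

```python
import random

def mc_yield(n_samples, tolerances, gap_min=0.0, nominal_gap=0.4, seed=1):
    """Monte Carlo yield estimate for a linear 1D stack:
    gap = nominal_gap - sum(deviations), each deviation ~ N(0, tol/3)
    (a +/-tol band read as 3 sigma). The cost is one full assembly
    evaluation per sample, which is what makes MC expensive when the
    'assembly function' is an FEA or multi-body simulation."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n_samples):
        gap = nominal_gap - sum(rng.gauss(0.0, t / 3.0) for t in tolerances)
        ok += gap > gap_min
    return ok / n_samples

# Three +/-0.1 mm contributors against a 0.1 mm nominal clearance:
y = mc_yield(20000, [0.1, 0.1, 0.1], nominal_gap=0.1)
```

Tens of thousands of samples are needed for a stable yield estimate here; when each sample is a loaded-assembly simulation rather than one subtraction, that cost is exactly what motivates replacing MC with a PCE surrogate.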

    Modelling and controlling variation propagation in mechanical assembly of high speed rotating machines

    Assembly plays a vital role in the quality of a final product and has a great impact on manufacturing cost. Mechanical assemblies consist of parts that inevitably deviate from their ideal dimensions. These variations propagate and accumulate as parts are assembled together, and an excessive amount of variation in an assembly may cause improper functionality of the product being assembled. Improving assembly quality and reducing assembly time and cost are the main objectives of this thesis. The quality of an assembly is determined in terms of variations in critical assembly dimensions, also known as Key Characteristics (KCs). Key Characteristics indicate where excess variation will affect product quality and which product features and tolerances require special attention. In order to improve assembly quality and reduce assembly time and cost, it is necessary to: (1) model non-ideal parts based on tolerances defined in design standards or in current industrial practice of component inspection; (2) model assemblies and their associated assembly processes to analyse tolerance stack-up in the assembly; (3) develop a probabilistic model to predict assembly variation after product assembly; and (4) implement control strategies for minimising assembly variation propagation to find the optimum configuration of the assembly. Two assembly models have been developed: a linear model and a fully non-linear model for calculating assembly variation propagation. The assembly models presented in this thesis also allow for the inclusion of geometric feature variation of each assembly component. Methods of incorporating geometric feature variations into an assembly variation model are described and analysis techniques are explained. The assembly variation model and the geometric variation models have been developed for 2D and 3D assemblies.
    Modelling techniques for incorporating process and measurement noise are also developed for the non-linear assembly model, and results are given to demonstrate the calculation of assembly variations while considering part, process and measurement errors. Two assembly case studies originating in sub-assemblies of aero-engines have been studied: Case Study 1, representing the rotating part (rotor) of an aero-engine, and Case Study 2, representing the non-rotating part (stator). A probabilistic method based on the linear model is presented as a general analytical method for the analysis of 3D mechanical assemblies. Probability density functions are derived for assembly position errors to analyse a general mechanical assembly, and separate probability functions are derived for the Key Characteristics (KCs) of the assemblies in Case Studies 1 and 2. The derived probability functions are validated against Monte Carlo simulation based on the exact (fully non-linear) model. Results showed that the proposed probabilistic method of estimating tolerance accumulation in mechanical assemblies is very efficient and accurate compared to Monte Carlo simulation, particularly when large variations at the tails of the distributions are considered. Separate control strategies have been implemented for each case study: four methods are proposed to minimise assembly variations for Case Study 1, and one error-minimisation method is suggested for the assemblies of Case Study 2. Based on the developed methods, the two case studies were investigated, and it was found that the proposed optimisation methods can significantly improve assembly quality. The developed optimisation methods do not require any special tooling (such as fixtures) and can easily be implemented in practice.
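The linear assembly model's first-order variation propagation can be illustrated with a classic root-sum-square stack-up. The sensitivity values and sigmas below are hypothetical, and the thesis's actual 3D formulation (with geometric feature variation and noise terms) is considerably richer:

```python
import math

def rss_stack(sensitivities, sigmas):
    """Linearised propagation of independent part variations to a KC:
    sigma_KC = sqrt( sum_i (dKC/dx_i * sigma_i)^2 ).
    This is the first-order model that a full non-linear Monte Carlo
    simulation would be used to validate."""
    return math.sqrt(sum((a * s) ** 2 for a, s in zip(sensitivities, sigmas)))

# Three contributing dimensions with hypothetical sensitivities and sigmas (mm):
sigma_kc = rss_stack([1.0, -1.0, 0.5], [0.02, 0.03, 0.01])
```

Because the propagation is closed-form, the KC distribution is obtained essentially for free, which is why such probabilistic linear models are so much cheaper than Monte Carlo on the exact model, at the price of accuracy when non-linearities or distribution tails matter.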