2D tunable all-solid-state random laser in the visible
A two-dimensional (2D) solid-state random laser emitting in the visible is
demonstrated, in which optical feedback is provided by a controlled disordered
arrangement of air-holes in a dye-doped polymer film. We find an optimal
scatterer density for which threshold is minimum and scattering is the
strongest. We show that the laser emission can be red-shifted by either
decreasing scatterer density or increasing pump area. We show that spatial
coherence is easily controlled by varying pump area. Such a 2D random laser
provides a compact on-chip tunable laser source and a unique platform to
explore non-Hermitian photonics in the visible.
Comment: 5 pages, 4 figures
Stratégies de mise en oeuvre des polytopes en analyse de tolérance (Implementation strategies for polytopes in tolerance analysis)
In the field of geometric tolerance analysis, a classical approach consists in handling polyhedra derived from sets of linear constraints. The relative position between any two surfaces of a mechanism is determined by operations (Minkowski sum and intersection) on these polyhedra. The polyhedra are generally unbounded because the degrees of invariance of surfaces and the degrees of freedom of joints define theoretically unlimited displacements.
A first part introduces cap half-spaces that limit these displacements in order to transform the polyhedra into polytopes. This method requires controlling the influence of these additional half-spaces on the topology of the computed polytopes, which is necessary to ensure the traceability of these half-spaces throughout the tolerance analysis process.
A second part provides an inventory of the issues related to the numerical implementation of polytopes. One of them concerns the choice of a computation configuration (expression point and basis, homogenization coefficients) used to define a polytope. After proving that changing the computation configuration is an affine transformation, several simulation strategies are presented in order to assess the problems of numerical precision and computation time.
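The affine-transformation result mentioned above can be illustrated with a small sketch. This is not the thesis's code: the 2D case, the function name, and the vertex data are all assumptions for illustration. In the small-displacement model, moving the expression point by a vector d shifts each vertex's translational part by the cross product of its rotational part with d, which in 2D reduces to omega * (-d_y, d_x); this map is affine in the vertex coordinates.

```python
# Illustrative sketch (hypothetical names and data, not the thesis's code):
# changing the expression point of a small-displacement polytope acts as an
# affine map on its vertices.
import numpy as np

def change_expression_point(trans, rot, d):
    """Shift the translational components of torsor vertices when the
    expression point moves by vector d (2D case, rotation about z).
    trans: (n, 2) translational parts; rot: (n,) rotational parts."""
    # Small-displacement relation: t_B = t_A + omega x AB.
    # In 2D, omega x d = omega * (-d_y, d_x).
    shift = np.outer(rot, [-d[1], d[0]])
    return trans + shift

# A vertex with no rotation is unchanged; a unit rotation moved by d = (2, 0)
# picks up a translation of (0, 2).
out = change_expression_point(np.array([[0.0, 0.0], [1.0, 0.0]]),
                              np.array([0.0, 1.0]), (2.0, 0.0))
print(out)
```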
Tolerance Analysis by Polytopes
To determine the relative position of any two surfaces in a system, one
approach is to use operations (Minkowski sum and intersection) on sets of
constraints. These constraints are made compliant with half-spaces of R^n,
where each set of half-spaces defines an operand polyhedron. These operands
are generally unbounded due to the inclusion of degrees of invariance for
surfaces and degrees of freedom for joints, defining theoretically unlimited
displacements. To solve operations on operands, Minkowski sums in particular,
"cap" half-spaces are added to each polyhedron to make it compliant with a
polytope, which is by definition a bounded polyhedron. The difficulty of this
method lies in controlling the influence of these additional half-spaces on
the topology of polytopes calculated by sum or intersection. This is necessary
to validate the geometric tolerances that ensure the compliance of a
mechanical system in terms of functional requirements.
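The Minkowski sum operation central to this approach can be sketched in a few lines. This is an illustrative example under simplifying assumptions, not the paper's implementation: for convex polytopes given by their vertices, the Minkowski sum is the convex hull of all pairwise vertex sums, which scipy's ConvexHull can recover.

```python
# Minimal sketch (not the paper's implementation): Minkowski sum of two
# convex polytopes given by their vertices, as the convex hull of all
# pairwise vertex sums.
import numpy as np
from scipy.spatial import ConvexHull

def minkowski_sum(P, Q):
    """Extreme vertices of the Minkowski sum of two convex vertex sets."""
    sums = np.array([p + q for p in P for q in Q])
    hull = ConvexHull(sums)           # qhull keeps only extreme points
    return sums[hull.vertices]

# Two unit squares centred at the origin: their sum is a 2x2 square.
square = np.array([[-1.0, -1.0], [1.0, -1.0], [1.0, 1.0], [-1.0, 1.0]]) * 0.5
S = minkowski_sum(square, square)
print(len(S))  # 4 extreme vertices
```

In the tolerancing context the operands live in R^6 (three rotations, three translations), which is why the topology of the computed polytopes is much harder to track than in this 2D sketch.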
A Framework for Integration of Resource Allocation and Reworking Concept into Design Optimisation Problem
The life cycle of an assembled product faces various uncertainties arising from the current state of the manufacturing line. A variety of activities are integrated into the manufacturing line, including processing, inspection, reworking, and assembly. Therefore, any decision taken concerning each activity will affect the end-product of the manufacturing line. At an early stage, designers define tolerances on parts to ensure the functionality of the end-product. In this regard, this paper integrates resource allocation (as a decision to assign practical resources to parts) and reworking decisions (as decisions to improve the conformity rate of parts) into the tolerance allocation problem. A modular cost-modelling approach is proposed, aiming at the minimisation of manufacturing cost with respect to resource allocation and reworking decisions. Finally, a genetic algorithm and Monte-Carlo simulation are adapted to analyse the applicability of the model.
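The coupling between tolerance choice, Monte-Carlo conformity estimation, and cost can be sketched as below. All cost coefficients, the process sigma, and the candidate tolerances are made-up illustrative values, not the paper's model; a coarse grid search stands in for the genetic algorithm.

```python
# Hypothetical sketch of the evaluation loop described above: Monte-Carlo
# simulation estimates the non-conformity rate of a candidate tolerance, and
# a simple cost model (tighter tolerance = higher machining cost, plus a
# rework cost per non-conforming part) scores it. All numbers are illustrative.
import random

def monte_carlo_cost(tolerance, process_sigma=0.05, n=20_000, rework_cost=4.0):
    rng = random.Random(0)  # fixed seed for a reproducible estimate
    nonconform = sum(abs(rng.gauss(0, process_sigma)) > tolerance
                     for _ in range(n)) / n
    machining = 1.0 / tolerance          # tighter tolerance costs more
    return machining + rework_cost * nonconform

# Coarse search over candidate tolerances (a GA would evolve these instead).
cost, best_tol = min((monte_carlo_cost(t), t) for t in [0.02, 0.05, 0.1, 0.2])
print(best_tol)
```

The trade-off is visible in the two cost terms: loosening the tolerance lowers machining cost but raises the expected rework cost, so an intermediate optimum generally exists.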
Dynamic Pricing Model for Batch-Specific Tolerance Allocation in Collaborative Production Networks
Review of data mining applications for quality assessment in manufacturing industry: Support Vector Machines
In many modern manufacturing industries, data that characterize the manufacturing process are electronically collected and stored in databases. Due to advances in data collection systems and analysis tools, data mining (DM) has been widely applied for quality assessment (QA) in manufacturing industries. In DM, the choice of technique used to analyze a dataset and assess quality depends on the understanding of the analyst. On the other hand, with the advent of improved and efficient prediction techniques, an analyst needs to know which tool performs best for a particular type of dataset. Although a few review papers have recently been published discussing DM applications in manufacturing for QA, this paper provides an extensive review investigating the application of a particular DM technique, namely the support vector machine (SVM), to QA problems. The review provides a comprehensive analysis of the literature from various points of view: DM preliminaries, data preprocessing, DM applications for each quality task, SVM preliminaries, and application results. Summary tables and figures are provided alongside the analyses. Finally, conclusions and future research directions are given.
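The kind of SVM quality-classification task surveyed above can be sketched with scikit-learn. The features, the pass/fail rule, and all data here are synthetic and purely illustrative; real applications would use measured process variables.

```python
# Minimal sketch of an SVM quality-classification task on synthetic process
# data (features and labels are fabricated for illustration).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3))                  # e.g. temperature, pressure, speed
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic pass/fail rule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))  # scaling matters for SVMs
clf.fit(X_tr, y_tr)
print(round(clf.score(X_te, y_te), 2))
```

Feature scaling is included because SVM decision boundaries are sensitive to feature magnitudes, one of the preprocessing points such reviews typically emphasize.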
Comparison of optimization techniques in a tolerance analysis approach considering form defects
In the tolerance analysis area, most existing approaches do not take the form defects of parts into consideration. As high-precision assemblies cannot be analyzed under the assumption that form defects are negligible, this paper focuses on the impact of form defects on assembly simulation by comparing two optimization algorithms (iHLRF and Quapro). The study is initially limited to cylinders. For the optimization, two main types of surface modelling are considered: the difference-surface-based method and the real model. The compared models allow assessing non-interference between cylinders with form defects that are potentially in contact. The main issue is to validate a tolerance analysis approach.
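The non-interference check at the heart of this comparison can be sketched as an optimization over a defect-perturbed profile. This is not the paper's iHLRF or Quapro code: the Fourier defect model, the radii, and the general-purpose optimizer are all assumptions for illustration.

```python
# Hedged sketch (not the paper's iHLRF/Quapro implementation): a form defect
# on a nominally circular shaft modelled as a small Fourier perturbation, and
# the minimum shaft/bore clearance found by numerical optimization. A negative
# minimum clearance would indicate interference.
import numpy as np
from scipy.optimize import minimize_scalar

def radius(theta, r0=10.0, amp=0.02, lobes=3):
    return r0 + amp * np.cos(lobes * theta)    # 3-lobed form defect

def clearance(theta, bore_r=10.05):
    return bore_r - radius(theta)              # local gap at angle theta

res = minimize_scalar(clearance, bounds=(0, 2 * np.pi), method="bounded")
print(res.fun > 0)  # True: no interference for this defect amplitude
```

With the nominal gap of 0.05 and a defect amplitude of 0.02, the worst-case clearance stays positive; increasing `amp` beyond 0.05 would flip the result.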