24 research outputs found

    Rotational placement of irregular polygons over containers with fixed dimensions using simulated annealing and no-fit polygons

    This work deals with the problem of minimizing the waste of space that occurs in the rotational placement of a set of irregular two-dimensional small items inside a two-dimensional large object. The problem is approached with a heuristic based on simulated annealing. Traditional "external penalization" techniques are avoided through the application of the no-fit polygon, which determines the collision-free region for each small item before its placement. The simulated annealing controls the rotation applied to each small item and its placement. For each unplaced small item, a limited-depth binary search finds a scale factor that, when applied to the small item, would allow it to fit in the large object. Three ways of defining the sequence in which the small items are placed are studied: larger-first, random permutation, and weight-sorted. The proposed algorithm handles non-convex small items and large objects.
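    A minimal sketch of the simulated-annealing placement loop described in this abstract, using axis-aligned rectangles and a direct overlap test in place of the paper's no-fit polygons and rotations. All names and parameters here are illustrative assumptions, not the authors' implementation:

```python
import math
import random

def overlaps(a, b):
    # axis-aligned rectangle overlap test; rectangles are (x, y, w, h)
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def inside(r, W, H):
    # is the rectangle fully inside the W x H container?
    x, y, w, h = r
    return 0 <= x and 0 <= y and x + w <= W and y + h <= H

def placement_height(rects):
    # objective: height of the tallest placed point (lower = tighter packing)
    return max((y + h for _, y, _, h in rects), default=0.0)

def anneal(sizes, W, H, steps=2000, t0=1.0, cooling=0.999, seed=0):
    rng = random.Random(seed)
    # naive initial placement: stack items bottom-left (assumes they fit)
    rects, y = [], 0.0
    for w, h in sizes:
        rects.append((0.0, y, w, h))
        y += h
    best, t = list(rects), t0
    for _ in range(steps):
        i = rng.randrange(len(rects))
        old_rect = rects[i]
        _, _, w, h = old_rect
        # propose a new position; a full implementation would also rotate
        cand = (rng.uniform(0, W - w), rng.uniform(0, H - h), w, h)
        if not inside(cand, W, H):
            continue
        # only collision-free placements are considered, mirroring the
        # role of the no-fit polygon (which precomputes this region)
        if any(overlaps(cand, r) for j, r in enumerate(rects) if j != i):
            continue
        old = placement_height(rects)
        rects[i] = cand
        new = placement_height(rects)
        # Metropolis criterion: always accept improvements, sometimes accept
        # uphill moves depending on the temperature
        if new > old and rng.random() >= math.exp((old - new) / t):
            rects[i] = old_rect  # reject the uphill move
        if placement_height(rects) < placement_height(best):
            best = list(rects)
        t *= cooling
    return best
```

The sketch keeps the essential structure (random perturbation, Metropolis acceptance, geometric feasibility check before evaluation) while replacing the polygon geometry with rectangles for brevity.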

    Finishing the euchromatic sequence of the human genome

    The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ≈99% of the euchromatic genome and is accurate to an error rate of ≈1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome, including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.

    Molecular and functional properties of P2X receptors—recent progress and persisting challenges

    Data Analytics for Noise Reduction in Optical Metrology of Reflective Planar Surfaces

    On-line data collection from manufactured parts is an essential element of Industry 4.0 for monitoring production health, and it requires strong data analytics. Optical-metrology-based inspection of highly reflective parts in a production line, such as parts with metallic surfaces, is a difficult challenge. As many on-line inspection paradigms rely on optical sensors, this reflectivity can introduce large amounts of noise, rendering the scan inaccurate. This paper presents a method for reducing and removing noise in data points obtained from scanning reflective planar surfaces. Using a global, statistics-based iterative approach, noise is gradually removed from the dataset at increasing percentages. The change in the standard deviation of point-plane distances is examined, and an optimal amount of noisy data is removed to reduce uncertainty in representing the workpiece. The algorithm provides a fast and efficient method for noise reduction in optical coordinate metrology and scanning.
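    The iterative trimming this abstract describes can be sketched roughly as follows. The least-squares plane fit, the distance metric, and the candidate percentages are assumptions chosen for illustration, not the authors' exact procedure:

```python
import numpy as np

def fit_plane(points):
    # least-squares plane z = a*x + b*y + c for an (n, 3) point array
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coef, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return coef

def plane_distances(points, coef):
    # orthogonal point-plane distances for the plane z = a*x + b*y + c
    a, b, c = coef
    num = np.abs(points[:, 2] - (a * points[:, 0] + b * points[:, 1] + c))
    return num / np.sqrt(a * a + b * b + 1.0)

def trim_noise(points, fractions=(0.01, 0.02, 0.05, 0.10, 0.20)):
    """Try removing increasing fractions of the farthest-from-plane points
    and keep the trimmed set whose distance spread (standard deviation)
    is smallest -- the 'optimal amount of noisy data' idea."""
    best_points = points
    best_std = plane_distances(points, fit_plane(points)).std()
    for f in fractions:
        keep = int(len(points) * (1 - f))
        d = plane_distances(points, fit_plane(points))
        trimmed = points[np.argsort(d)[:keep]]   # drop the farthest f of points
        s = plane_distances(trimmed, fit_plane(trimmed)).std()
        if s < best_std:
            best_points, best_std = trimmed, s
    return best_points, best_std
```

On a synthetic plane with a few glare-like outliers, the trimmed set's standard deviation of point-plane distances drops sharply once the outliers fall inside the removed fraction.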

    Smart Topology Optimization Using Adaptive Neighborhood Simulated Annealing

    Topology optimization (TO) of engineering products is an important design task for maximizing performance and efficiency; its methods fall into two main categories, gradient-based and non-gradient-based. In recent years, significant attention has been paid to non-gradient-based methods, mainly because they do not require access to the derivatives of the objective function. This property makes them well suited to the structure of knowledge in the digital design and simulation domains, particularly in Computer Aided Design and Engineering (CAD/CAE) environments. These methods generate and evaluate new evolutionary solutions without using sensitivity information. In this work, a new non-gradient TO methodology using a variation of Simulated Annealing (SA) is presented. The methodology adaptively adjusts newly generated candidates based on the history of the current solutions and uses a crystallization heuristic to control the convergence of the TO problem. If the changes to an element and its neighborhood in previous solutions improve the results, the crystallization factor increases the changes in the newly generated random solutions; otherwise, it decreases them. This improves both the random exploration and the convergence of the solutions. To study the role of the various parameters in the algorithm, a variety of experiments are conducted and their results analyzed. In multiple case studies, the final results are shown to be comparable to those obtained with classic gradient-based methods. As an additional feature, a density filter is added to the algorithm to remove discontinuities and gray areas from the final solution, yielding robust outcomes at adjustable resolutions.
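    A toy sketch of the adaptive-neighborhood idea on a generic continuous objective: each variable's move size grows while its moves keep improving the objective and shrinks once they stop. The update rule and constants below are illustrative assumptions, not the paper's TO formulation:

```python
import math
import random

def adaptive_anneal(f, x0, steps=4000, t0=1.0, cooling=0.999, seed=0):
    """Simulated annealing with a per-variable 'crystallization'-style
    factor controlling the neighborhood size of each variable."""
    rng = random.Random(seed)
    x = list(x0)
    step = [1.0] * len(x)          # per-variable neighborhood size
    fx = f(x)
    best_x, best_f = list(x), fx
    t = t0
    for _ in range(steps):
        i = rng.randrange(len(x))
        cand = list(x)
        cand[i] += rng.uniform(-step[i], step[i])
        fc = f(cand)
        # Metropolis acceptance: improvements always, uphill moves sometimes
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            improved = fc < fx
            x, fx = cand, fc
            # crystallization-style update: widen the neighborhood after an
            # improving move, narrow it otherwise
            step[i] = min(step[i] * 1.1, 10.0) if improved else max(step[i] * 0.9, 1e-6)
            if fx < best_f:
                best_x, best_f = list(x), fx
        else:
            step[i] = max(step[i] * 0.9, 1e-6)  # rejected move: narrow
        t *= cooling
    return best_x, best_f
```

On a simple quadratic objective this converges near the optimum; in the paper the candidates are element densities in a discretized design domain rather than free continuous variables.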