72 research outputs found

    Random Adjustment-Based Chaotic Metaheuristic Algorithms for Image Contrast Enhancement

    A metaheuristic algorithm is a powerful optimization method that can solve problems believed to be hard in general by exploring their ordinarily large solution search spaces. However, the performance of these algorithms depends significantly on their parameter settings, and it is not easy to set the parameters accurately because they rely heavily on the characteristics of the problem. Many methods have been proposed to tune the parameters automatically, including fuzzy logic, chaos, random adjustment, and others. For many years these methods have been developed independently for the automatic setting of metaheuristic parameters, and the integration of two or more of them has not yet been widely explored. Thus, a method that combines the advantages of chaos and random adjustment is proposed. Several popular metaheuristic algorithms are used to test the performance of the proposed method, i.e. simulated annealing, particle swarm optimization, differential evolution, and harmony search. The case study of this research is contrast enhancement for the Cameraman, Lena, Boat, and Rice images. In general, the simulation results show that the proposed methods are better than the original metaheuristics, the chaotic metaheuristics, and the metaheuristics with random adjustment.
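    A minimal sketch of the general idea of chaos plus random adjustment for parameter tuning, assuming a logistic map as the chaos generator and a PSO-style inertia weight as the tuned parameter; the weight bounds and noise level are illustrative, not taken from the paper.

```python
import random

def logistic_map(x, r=4.0):
    """One step of the logistic map, a common chaos generator on (0, 1)."""
    return r * x * (1.0 - x)

def chaotic_random_weight(state, w_min=0.4, w_max=0.9, noise=0.05):
    """Map the current chaotic value into [w_min, w_max], then add a bounded random tweak."""
    state["x"] = logistic_map(state["x"])        # advance the chaotic trajectory
    w = w_min + (w_max - w_min) * state["x"]     # chaos-driven base value
    w += random.uniform(-noise, noise)           # random-adjustment component
    return min(max(w, w_min), w_max)

state = {"x": 0.7}                               # any seed in (0, 1) away from fixed points
for it in range(5):
    print(f"iteration {it}: inertia weight = {chaotic_random_weight(state):.3f}")
```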

    Simulated Annealing Algorithm Combined with Chaos for Task Allocation in Real-Time Distributed Systems

    This paper addresses the problem of task allocation in real-time distributed systems with the goal of maximizing system reliability, which has been shown to be NP-hard. We take the deadline constraint into account to formulate this problem and then propose an algorithm called chaotic adaptive simulated annealing (XASA) to solve it. First, XASA begins with chaotic optimization, which takes a chaotic walk in the solution space and generates several local minima; second, XASA improves the SA algorithm via several adaptive schemes and continues to search for the optimum based on the results of the chaotic optimization. The effectiveness of XASA is evaluated by comparison with the traditional SA algorithm and an improved SA algorithm. The results show that XASA can achieve a satisfactory speedup without loss of solution quality.
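    A sketch of the two-phase structure described above (chaotic walk, then SA refinement). The 1-D objective and all tuning parameters below are hypothetical placeholders; the paper's task-allocation encoding and adaptive schemes are not reproduced.

```python
import math, random

def objective(x):
    """Hypothetical 1-D objective to minimize; stands in for the task-allocation cost."""
    return x * x + 10 * math.sin(3 * x)

def chaotic_walk(steps=200, lo=-5.0, hi=5.0):
    """Phase 1: logistic-map walk over the search interval; keep the best point found."""
    z, best_x, best_f = 0.31, None, float("inf")
    for _ in range(steps):
        z = 4.0 * z * (1.0 - z)
        x = lo + (hi - lo) * z
        f = objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x

def simulated_annealing(x, temp=10.0, cooling=0.95, iters=500):
    """Phase 2: standard SA refinement starting from the chaotic-phase result."""
    best = x
    for _ in range(iters):
        cand = x + random.gauss(0, 0.5)
        delta = objective(cand) - objective(x)
        if delta < 0 or random.random() < math.exp(-delta / max(temp, 1e-9)):
            x = cand
            if objective(x) < objective(best):
                best = x
        temp *= cooling
    return best

seed = chaotic_walk()
print("chaotic seed:", round(seed, 3), "-> refined:", round(simulated_annealing(seed), 3))
```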

    The Use of Parallel Processing in VLSI Computer-Aided Design Application

    Coordinated Science Laboratory was formerly known as Control Systems Laboratory. Semiconductor Research Corporation / 87-DP-10

    ProperCAD II: A Run-Time Library for Portable, Parallel, Object-Oriented Programming with Applications to VLSI CAD

    Coordinated Science Laboratory was formerly known as Control Systems Laboratory. Semiconductor Research Corporation / grant 93-DP-10

    Methodology for modified whale optimization algorithm for solving appliances scheduling problem

    The Whale Optimization Algorithm (WOA) is considered one of the newest metaheuristic algorithms used for solving NP-hard problems. WOA is known for its slow convergence, and at the same time its computational cost increases exponentially with multiple objectives and a large number of requests from n users. These constraints limit its ability to solve and optimize Demand Side Management (DSM) cases, such as the energy consumption of indoor comfort index parameters consisting of thermal comfort, air quality, humidity, and visual comfort. To address these issues, the proposed work will first justify and validate the constraints related to the appliance scheduling problem, and then propose a new Cluster-based Multi-Objective WOA model with a multiple-restart strategy. To achieve these objectives, different initialization strategies and cluster-based approaches will be used for tuning the main parameter of WOA, which controls exploration and exploitation, under different MapReduce applications. The proposed model will be tested on a set of well-known test functions and finally applied to a real case project, i.e. the appliance scheduling problem. It is anticipated that the approach can expedite the convergence of the metaheuristic technique while maintaining solution quality.
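    For context, a compact sketch of the standard single-objective WOA position update (encircling, spiral, and exploration phases) that the proposed cluster-based multi-objective variant builds on. The objective, bounds, and population settings below are placeholders, not the appliance-scheduling model.

```python
import math, random

def woa_minimize(obj, dim=2, pop=20, iters=100, lo=-10.0, hi=10.0):
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    best = min(X, key=obj)[:]
    for t in range(iters):
        a = 2.0 - 2.0 * t / iters                      # a decreases linearly from 2 to 0
        for i in range(pop):
            r, p = random.random(), random.random()
            A, C = 2 * a * r - a, 2 * random.random()
            for d in range(dim):
                if p < 0.5:
                    if abs(A) < 1:                     # exploitation: encircle the best whale
                        X[i][d] = best[d] - A * abs(C * best[d] - X[i][d])
                    else:                              # exploration: move toward a random whale
                        rand = X[random.randrange(pop)]
                        X[i][d] = rand[d] - A * abs(C * rand[d] - X[i][d])
                else:                                  # spiral update around the best whale
                    l = random.uniform(-1, 1)
                    X[i][d] = abs(best[d] - X[i][d]) * math.exp(l) * math.cos(2 * math.pi * l) + best[d]
                X[i][d] = min(max(X[i][d], lo), hi)    # keep the position inside the bounds
            if obj(X[i]) < obj(best):
                best = X[i][:]
    return best

sphere = lambda v: sum(x * x for x in v)               # classic benchmark test function
print("WOA best on sphere:", [round(x, 3) for x in woa_minimize(sphere)])
```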

    Evolutionary Algorithms in Engineering Design Optimization

    Evolutionary algorithms (EAs) are population-based global optimizers which, due to their characteristics, have allowed us to solve many real-world optimization problems in a straightforward way over the last three decades, particularly in engineering fields. Their main advantages are the following: they do not impose any requirement on the objective/fitness evaluation function (continuity, differentiability, convexity, etc.), and they are not limited by the appearance of discrete and/or mixed variables or by the requirement of uncertainty quantification in the search. Moreover, they can deal with more than one objective function simultaneously through the use of evolutionary multi-objective optimization algorithms. This set of advantages, together with the continuously increasing computing capability of modern computers, has broadened their application in research and industry. From the application point of view, all engineering fields are welcome in this Special Issue, such as aerospace and aeronautical, biomedical, civil, chemical and materials science, electronic and telecommunications, energy and electrical, manufacturing, logistics and transportation, mechanical, naval architecture, reliability, robotics, structural, etc. Within the EA field, the integration of innovative and improved aspects into the algorithms for solving real-world engineering design problems in the abovementioned application fields is welcomed and encouraged, such as the following: parallel EAs, surrogate modelling, hybridization with other optimization techniques, multi-objective and many-objective optimization, etc.
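    To illustrate the derivative-free, population-based character mentioned above, here is a minimal (mu + lambda) evolution strategy; it is only a generic sketch, and all parameters and the step-function objective are illustrative.

```python
import random

def evolve(objective, dim=3, mu=5, lam=20, sigma=0.3, generations=50):
    """Minimal (mu + lambda) evolution strategy: mutate, evaluate, select the best."""
    parents = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(mu)]
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            parent = random.choice(parents)
            child = [x + random.gauss(0, sigma) for x in parent]   # Gaussian mutation only
            offspring.append(child)
        pool = parents + offspring
        pool.sort(key=objective)                                   # the objective is a black box
        parents = pool[:mu]
    return parents[0]

# Works even on a non-differentiable, discontinuous objective such as a step function.
step = lambda v: sum(abs(round(x)) for x in v)
print("best found:", [round(x, 2) for x in evolve(step)])
```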

    New Concepts for Virtual Testbeds: Data Mining Algorithms for Blackbox Optimization based on Wait-Free Concurrency and Generative Simulation

    Virtual testbeds have emerged as a key technology for improving and streamlining complex engineering processes by delivering long-term simulation and assessment of complex designs in virtual environments. In contrast to existing simulation technology, virtual testbeds focus on long-term, physically-based simulation of the overall design in its (virtual) environment instead of focusing only on isolated, specific parts for short periods of time. This technology has the major advantage that costly testing, prototyping, and assessment in real-life environments are replaced by cost-efficient simulation in virtual worlds for comprehensive and long-term analysis of designs. For this purpose, engineering models and their requirements are abstracted into software simulation models and objectives which are executed in virtual assessments. Simulation models are used to predict complex, real systems which may further be subject to random influences. These predictions are used to examine the effects of individual configuration alternatives without actually realizing them and causing possible negative effects on the real system. Virtual testbeds further offer engineers the opportunity to immersively and naturally interact with their simulation model in these virtual assessments. This gives engineers a greater and more comprehensive understanding of possible design flaws early on in the design process, because they can directly assess their design in the virtual environment based on the simulation objectives. The fact that virtual testbeds enable these realtime interactive virtual assessments makes their underlying software infrastructure very complex. One major challenge is to minimize the development time of virtual testbeds in order to efficiently integrate them into the overall engineering process. Usually, this can be achieved by minimizing the underlying concurrency of the testbed and by simplifying its software architecture. However, this may degrade the highly concurrent and asynchronous behavior that is usually required for immersive and natural virtual interaction. A major goal of virtual testbeds in the engineering process is to find a set of optimal configurations of the simulation model which maximizes all simulation objectives for the specified virtual assessments. Once such a set has been computed, engineers can interactively explore it in the virtual environment. The main challenge is that sophisticated simulation models and their configurations are subject to a multiobjective optimization problem, which usually cannot be solved manually by engineers or simulation analysts in feasible time. This is further aggravated because the relationships between simulation model configurations and simulation objectives are mostly unknown, leading to what are known as blackbox simulations. In this thesis, I propose novel data mining algorithms for computing Pareto optimal simulation model configurations, based on an approximation of the feasible design space, for deterministic and stochastic blackbox simulations in virtual testbeds, in order to achieve the goal stated above. These novel data mining algorithms lead to an automatic knowledge discovery process that needs no supervision for its data analysis and assessment of multiobjective optimization problems over simulation model configurations. This achieves the previously stated goal of computing optimal configurations of simulation models for long-term simulations and assessments.
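    For reference, a small sketch of the Pareto-optimality notion used above: the standard non-dominated filter over blackbox evaluation results, not the thesis' approximation-based data mining algorithms. The configuration names and objective values are hypothetical.

```python
def dominates(a, b):
    """a dominates b if a is at least as good in every objective and strictly better in one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(evaluated):
    """Keep only configurations whose objective vectors are not dominated by any other."""
    front = []
    for cfg, objs in evaluated:
        if not any(dominates(other_objs, objs) for _, other_objs in evaluated):
            front.append((cfg, objs))
    return front

# Hypothetical blackbox results: (configuration name, (objective 1, objective 2)), maximized.
evaluated = [("cfg_a", (0.9, 0.2)), ("cfg_b", (0.6, 0.6)), ("cfg_c", (0.5, 0.5))]
print(pareto_front(evaluated))   # cfg_c is dominated by cfg_b and drops out
```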
Furthermore, I propose two complementary solutions for efficiently integrating massively-parallel virtual testbeds into engineering processes. First, I propose a novel multiversion wait-free data and concurrency management based on hash maps. These wait-free hash maps do not require any standard locking mechanisms and enable low-latency data generation, management, and distribution for massively-parallel applications. Second, I propose novel concepts for efficiently generating the code of the above wait-free data and concurrency management for arbitrary massively-parallel simulation applications of virtual testbeds. My generative simulation concept combines a state-of-the-art realtime interactive system design pattern, for high maintainability, with template code generation based on domain-specific modelling. This concept is able to generate massively-parallel simulations and, at the same time, to model-check their internal dataflow for possible interface errors. This generative concept overcomes the challenge of efficiently integrating virtual testbeds into engineering processes. These contributions enable, for the first time, a powerful collaboration between simulation, optimization, visualization, and data analysis for novel virtual testbed applications, and they address the challenges and goals presented above.
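    A deliberately simplified Python illustration of the multiversion idea behind such lock-free reads: writers publish a complete, immutable new version of the map, and readers always dereference some consistent version without taking a lock. This is a conceptual sketch only, not the thesis' wait-free hash-map implementation, and Python's runtime does not give true wait-free guarantees.

```python
import threading
from types import MappingProxyType

class MultiVersionMap:
    def __init__(self):
        self._current = MappingProxyType({})     # immutable, always-consistent snapshot
        self._write_lock = threading.Lock()      # writers coordinate; readers never lock

    def read(self, key, default=None):
        # Readers grab the current snapshot reference; a concurrent writer can only
        # swap in a newer snapshot, never mutate the one being read.
        return self._current.get(key, default)

    def snapshot(self):
        return self._current

    def write(self, key, value):
        with self._write_lock:
            new_version = dict(self._current)    # copy-on-write: build the next version
            new_version[key] = value
            self._current = MappingProxyType(new_version)   # publish by reference swap

store = MultiVersionMap()
store.write("pose", (1.0, 2.0, 0.5))             # hypothetical simulation datum
print(store.read("pose"))
```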