
    Cell Production System Design: A Literature Review

    Purpose: In a cellular production system, machines that differ in function are grouped into cells, each of which completes operations on a family of similar parts. Determining the part families and machine cells is one of the central design problems of cellular manufacturing. Design methods include clustering, graph theory, artificial intelligence, meta-heuristics, simulation, and mathematical programming. This article surveys these methods and the research on cell production system design. Methodology: From 187 articles published by authoritative scientific sources, papers were selected based on year of publication and on the number and realism of the constraints considered, identified by searching on keywords for those constraints; the selected articles simultaneously address multiple aspects of the design problem, such as machine costs, cell size, and process routing. Findings: A distribution chart of the methods used and the constraints considered by researchers shows the usage and efficiency of each method; examining them reveals more efficient directions for designing this type of production system. Originality/Value: This article reviews the literature on cellular production systems from 1972 to 2021.
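    As a toy illustration of the clustering family of methods the review covers (not drawn from any surveyed paper), the sketch below groups machines into cells using a Jaccard similarity coefficient on their part sets; the incidence data and the 0.5 merge threshold are invented for demonstration.

```python
# Minimal sketch: cell formation via a similarity-coefficient clustering method.
# The incidence matrix and threshold below are illustrative assumptions.
from itertools import combinations

# Rows = machines; each machine maps to the set of parts that visit it.
incidence = {
    "M1": {"P1", "P3"},
    "M2": {"P2", "P4"},
    "M3": {"P1", "P3"},
    "M4": {"P2", "P4"},
}

def jaccard(a, b):
    """Similarity coefficient between two machines' part sets."""
    return len(a & b) / len(a | b)

# Greedy single-linkage grouping: merge cells whose best machine-pair
# similarity exceeds the (arbitrary) 0.5 threshold.
cells = [{m} for m in incidence]
merged = True
while merged:
    merged = False
    for c1, c2 in combinations(cells, 2):
        sim = max(jaccard(incidence[i], incidence[j]) for i in c1 for j in c2)
        if sim >= 0.5:
            cells.remove(c1); cells.remove(c2); cells.append(c1 | c2)
            merged = True
            break

print(cells)  # e.g. [{'M1', 'M3'}, {'M2', 'M4'}] -> two machine cells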

    Capuchin Search Particle Swarm Optimization (CS-PSO) based Optimized Approach to Improve the QoS Provisioning in Cloud Computing Environment

    This paper introduces a method for improving resource allocation in cloud computing environments under QoS constraints. Because resource allocation directly affects the quality of service (QoS) of cloud deployments, QoS constraints such as response time, throughput, waiting time, and makespan are key factors to take into account. The approach uses the Capuchin Search Particle Swarm Optimization (CS-PSO) algorithm to optimize resource allocation subject to these constraints, targeting objectives including throughput, response time, makespan, waiting time, and resource utilization. Resources are partitioned optimally using a K-medoids clustering scheme: during clustering, tasks are divided into batches, and the resource allocation procedure is refined to obtain the optimal configuration. The experimental setup uses a JAVA tool and the GWA-T-12 Bitbrains dataset for simulation, and the extreme-value optimization problem of the multivariable objective function is solved with the improved algorithm. The simulation findings show that the baseline Cloud Particle Swarm Optimization (CPSO) algorithm repeatedly fails to converge within 500 generations, and the comparative analysis reveals that the developed model outperforms state-of-the-art approaches. Overall, this approach provides a strong and effective procedure for improving resource allocation in cloud computing environments and can be applied to a variety of resource assignment challenges, such as virtual machine placement, job scheduling, and resource allocation. The CS-PSO algorithm thereby offers desirable optimization properties, including simple update equations, rapid convergence, high efficiency, and a diverse population.
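    A minimal sketch of the K-medoids batching step the abstract describes, under assumptions: the task lengths, k, and the one-dimensional distance measure are invented, and in the real system the resulting batches would feed the CS-PSO allocator.

```python
# Illustrative K-medoids (PAM-style) batching of cloud tasks by demand.
import random

tasks = [3, 5, 4, 20, 22, 19, 40, 42]   # task lengths, e.g. in MI (made up)
k = 3

random.seed(0)
medoids = random.sample(tasks, k)
for _ in range(10):  # a few refinement passes
    # Assign each task to its nearest medoid.
    clusters = {m: [] for m in medoids}
    for t in tasks:
        clusters[min(medoids, key=lambda m: abs(t - m))].append(t)
    # Re-pick each medoid as the cluster member minimizing total distance.
    medoids = [min(c, key=lambda cand: sum(abs(cand - t) for t in c))
               for c in clusters.values()]

print(sorted(clusters.values(), key=min))  # task batches fed to the allocator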

    Assembly and Disassembly Planning by using Fuzzy Logic & Genetic Algorithms

    The authors propose a hybrid Fuzzy Logic-Genetic Algorithm (FL-GA) methodology to plan the automatic assembly and disassembly sequence of products. The GA-Fuzzy Logic approach is implemented on two levels. The first level of hybridization is a fuzzy controller for the parameters of a GA-based assembly or disassembly planner: the controller acts on the mutation probability and crossover rate, adapting their values dynamically while the algorithm runs. The second level identifies the optimal assembly or disassembly sequence through a fuzzy function, giving closer control over the technological knowledge of the assembly/disassembly process. Two case studies were analyzed to test the efficiency of the Fuzzy-GA methodology.
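    A rough sketch of the first hybridization level, under assumptions the abstract does not spell out: a fuzzy-style rule maps population diversity to mutation rate inside a simple GA. The one-max fitness, rule base, and constants are all illustrative stand-ins for the paper's planner.

```python
# GA with a crude fuzzy controller adapting the mutation rate from diversity.
import random

random.seed(1)
POP, BITS = 20, 16
pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
fitness = lambda ind: sum(ind)          # one-max stands in for a sequence score

def fuzzy_mutation_rate(pop):
    """Rule: LOW diversity -> HIGH mutation; HIGH diversity -> LOW mutation."""
    diversity = len({tuple(i) for i in pop}) / len(pop)   # in [0, 1]
    return 0.15 * (1 - diversity) + 0.01 * diversity      # blended consequents

for gen in range(50):
    rate = fuzzy_mutation_rate(pop)     # controller runs every generation
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]
    children = []
    while len(children) < POP - len(parents):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, BITS)                   # one-point crossover
        child = a[:cut] + b[cut:]
        children.append([g ^ (random.random() < rate) for g in child])
    pop = parents + children

print(max(fitness(i) for i in pop))  # approaches 16 as the GA converges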

    Linking Customer Retention to Intelligent Technology: An Optimization Approach

    Marketing managers in the telecommunication sector are confronted with considerable complexity. They must decide on the optimal combination of products or offerings, customer groups, and the means of interacting with potential customers. Further, in saturated markets such as mobile telephony, it is increasingly important to retain customers who are likely to churn. This research describes how a customer survey of potentially churning customers was conducted and used as the basis for optimal campaign planning. It engages with customer retention from the perspective of a major mobile operator in Taiwan. Customers' preferences regarding C&C (campaign offer and communication channel) were predicted and used as input for target selection optimization. The models proved novel in an organizational prototype project, suggesting that a hybrid of data mining and optimization approaches can be effective for target selection.
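    Purely as an illustration of the optimization half of such a hybrid (the customer IDs, uplift scores, costs, and budget below are invented, and the paper's actual formulation may differ): once a data-mining step has scored each customer, target selection can be framed as a budgeted knapsack-style problem.

```python
# Greedy target selection by predicted-uplift-per-cost ratio (hypothetical data).
customers = [
    # (customer_id, predicted retention uplift if targeted, campaign cost)
    ("C1", 0.30, 5.0), ("C2", 0.10, 1.0), ("C3", 0.25, 4.0),
    ("C4", 0.08, 1.0), ("C5", 0.40, 8.0),
]
budget = 10.0

selected, spent = [], 0.0
for cid, uplift, cost in sorted(customers, key=lambda c: c[1] / c[2], reverse=True):
    if spent + cost <= budget:          # take the best-ratio customers that fit
        selected.append(cid)
        spent += cost

print(selected, spent)  # ['C2', 'C4', 'C3'], 6.0 -- within the budget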

    A Data Mining Practical Approach to Inventory Management and Logistics Optimization

    The latent demand to optimize costs and customer service has grown in the current economic situation, characterized by high competitiveness and disruption in supply chains, making inventories a vital area with significant potential for improvement in firms. Inventory management done correctly has a favorable impact on logistics performance indexes, and warehousing operations account for around 15% of logistics expenditures in dollar terms. This article employs a method based on the Partitioning Around Medoids (PAM) algorithm that incorporates, in a novel way, a strategy for locating the optimal picking point based on cluster classification, taking into account the qualitative and quantitative factors with the greatest impact or priority on the company's inventory management. The resulting model improves the routing of distributed materials based on characteristics such as collection frequency and material handling, allowing the storage capacity for the various SKUs to be reorganized and expanded by moving from a classification by families to a cluster classification. The article presents a suggested warehouse distribution design using data mining techniques, built on indicators and qualities key to operational success for a corporate case study, as well as an approach to improve inventory management decision-making.
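    A minimal sketch in the spirit of the PAM-based strategy described above, with the layout coordinates and pick frequencies invented: like PAM, it restricts the "picking point" to actual SKU locations (medoids) and chooses the one minimizing frequency-weighted travel.

```python
# Locating an optimal picking point as a frequency-weighted medoid (toy data).
skus = [
    # (sku, x_aisle, y_bay, weekly_pick_frequency)
    ("A", 0, 0, 50), ("B", 4, 1, 10), ("C", 1, 3, 30), ("D", 5, 5, 5),
]

def weighted_cost(candidate):
    """Total frequency-weighted Manhattan travel from a candidate point."""
    cx, cy = candidate
    return sum(f * (abs(x - cx) + abs(y - cy)) for _, x, y, f in skus)

# PAM restricts candidates to actual data points (medoids), unlike k-means.
best = min(((x, y) for _, x, y, _ in skus), key=weighted_cost)
print(best, weighted_cost(best))  # picking point lands near high-frequency SKUs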

    Novel Load Balancing Optimization Algorithm to Improve Quality-of-Service in Cloud Environment

    Scheduling cloud resources calls for allocating cloud assets to cloud tasks. Scheduling outcomes can be improved by treating Quality of Service (QoS) factors as essential constraints; however, efficient scheduling requires improved optimization of QoS parameters, and only a few resource scheduling algorithms in the literature do so. The primary objective of this paper is to provide an effective method for deploying workloads on cloud infrastructure. To ensure that workloads execute efficiently on available resources, a resource scheduling method based on particle swarm optimization was developed, and its performance was measured in the cloud. The experimental results prove the efficiency of the proposed approach in reducing the aforementioned QoS parameters, as gauged by several metrics of algorithm performance.
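    A minimal sketch, not the paper's implementation: PSO over task-to-VM assignments with makespan as the fitness. The task lengths, VM speeds, and PSO constants are illustrative assumptions, and a real scheduler would fold more QoS terms into the objective.

```python
# Discrete-by-rounding PSO for task-to-VM scheduling (toy instance).
import random

random.seed(2)
task_len = [8, 3, 7, 2, 5, 9]      # instructions per task (made up)
vm_speed = [1.0, 2.0]              # VM processing speeds (made up)
N_VM, N_TASK, SWARM, ITERS = len(vm_speed), len(task_len), 10, 60
W, C1, C2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients

def makespan(pos):
    """Finish time of the busiest VM under assignment pos[task] -> vm index."""
    load = [0.0] * N_VM
    for t, vm in enumerate(pos):
        load[vm] += task_len[t] / vm_speed[vm]
    return max(load)

def decode(x):
    """Clamp a continuous position to a valid VM index per task."""
    return [min(N_VM - 1, max(0, int(round(v)))) for v in x]

X = [[random.uniform(0, N_VM - 1) for _ in range(N_TASK)] for _ in range(SWARM)]
V = [[0.0] * N_TASK for _ in range(SWARM)]
pbest = [x[:] for x in X]
gbest = min(pbest, key=lambda x: makespan(decode(x)))[:]

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(N_TASK):
            V[i][d] = (W * V[i][d]
                       + C1 * random.random() * (pbest[i][d] - X[i][d])
                       + C2 * random.random() * (gbest[d] - X[i][d]))
            X[i][d] += V[i][d]
        if makespan(decode(X[i])) < makespan(decode(pbest[i])):
            pbest[i] = X[i][:]
    gbest = min(pbest, key=lambda x: makespan(decode(x)))[:]

print(decode(gbest), makespan(decode(gbest)))  # near-balanced assignment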

    Search based software engineering: Trends, techniques and applications

    In the past five years there has been a dramatic increase in work on Search-Based Software Engineering (SBSE), an approach to Software Engineering (SE) in which Search-Based Optimization (SBO) algorithms are used to address problems in SE. SBSE has been applied to problems throughout the SE lifecycle, from requirements and project planning to maintenance and reengineering. The approach is attractive because it offers a suite of adaptive, automated, and semi-automated solutions in situations typified by large, complex problem spaces with multiple competing and conflicting objectives. This article provides a review and classification of the SBSE literature. It identifies research trends and relationships between the techniques applied and the applications to which they have been applied, and highlights gaps in the literature and avenues for further research.
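    Not from the survey itself, but a tiny illustration of the kind of SBO instance SBSE studies: hill-climbing a test-case ordering so that faults are revealed early. The coverage data and scoring are assumptions for demonstration.

```python
# Hill climbing over test orderings to maximize early fault coverage (toy data).
import random

random.seed(3)
covers = {"t1": {1, 2}, "t2": {3}, "t3": {1, 3, 4}, "t4": {2, 4}}

def earliness(order):
    """Reward orderings that reveal each fault as early as possible."""
    seen, score = set(), 0
    for rank, t in enumerate(order):
        new = covers[t] - seen
        score += len(new) * (len(order) - rank)   # earlier tests weigh more
        seen |= covers[t]
    return score

order = list(covers)
best = earliness(order)
for _ in range(200):
    i, j = random.sample(range(len(order)), 2)
    order[i], order[j] = order[j], order[i]      # try swapping two tests
    s = earliness(order)
    if s >= best:                                # accept non-worsening moves
        best = s
    else:
        order[i], order[j] = order[j], order[i]  # revert a worsening move

print(order, best)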

    Decision support system for vendor managed inventory supply chain: a case study

    Vendor-managed inventory (VMI) is a widely used collaborative inventory management policy in which the manufacturer manages the inventory of retailers and takes responsibility for decisions related to the timing and extent of inventory replenishment. VMI partnerships help organisations reduce demand variability, inventory holding, and distribution costs. This study provides empirical evidence that significant economic benefits can be achieved with the use of a genetic algorithm (GA)-based decision support system (DSS) in a VMI supply chain. A two-stage serial supply chain in which retailers and their supplier operate VMI in an uncertain demand environment is studied, with performance measured in terms of cost, profit, stockouts, and service levels. The results generated by the GA-based model were compared with traditional alternatives. The study found that the GA-based approach outperformed traditional methods and that its use can be economically justified in small- and medium-sized enterprises (SMEs).
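    A minimal sketch of a GA for such a problem, assuming details the abstract omits: chromosomes encode order-up-to levels for two retailers, and fitness is simulated holding plus stockout cost under uncertain demand. All constants and the demand model are invented.

```python
# Mutation-only GA searching replenishment (order-up-to) levels (toy model).
import random

random.seed(4)
HOLD, STOCKOUT, DEMAND_MAX = 1.0, 10.0, 20

def cost(levels):
    """Average one-period cost of (retailer1, retailer2) order-up-to levels."""
    total = 0.0
    for _ in range(200):                       # Monte Carlo demand scenarios
        for s in levels:
            d = random.randint(0, DEMAND_MAX)
            total += HOLD * max(s - d, 0) + STOCKOUT * max(d - s, 0)
    return total / 200

pop = [[random.randint(0, 30), random.randint(0, 30)] for _ in range(20)]
for _ in range(40):
    pop.sort(key=cost)                         # keep the cheapest half
    elite = pop[:10]
    pop = elite + [
        [max(0, gene + random.randint(-2, 2))  # mutate a parent's genes
         for gene in random.choice(elite)]
        for _ in range(10)
    ]

print(min(pop, key=cost))  # near the newsvendor-optimal level per retailer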

    Disaster Recovery Services in Intercloud using Genetic Algorithm Load Balancer

    The paradigm needs to shift from cloud computing to the intercloud for disaster recovery, since disasters can break out anytime and anywhere. Natural disaster response involves radically voluminous, impatient job requests demanding immediate attention, and under such disequilibrium the intercloud is the more practical and functional option. Protocols such as quality-of-service guarantees, service level agreements, and disaster recovery pacts need to be discussed and clarified during the initial setup to fast-track the distress scenario. Orchestrating resources in a large-scale distributed system while simultaneously pursuing multi-objective optimization of resources, minimum energy consumption, maximum throughput, load balancing, and minimum carbon footprint is quite challenging. The intercloud, where the resources of different clouds are aligned, plays a crucial role in resource mapping. The objective of this paper is to improve and fast-track mapping procedures on the cloud platform and to address impatient job requests in a balanced and efficient manner. A genetic algorithm-based resource allocation is proposed that uses Pareto-optimal mapping of resources to keep processor utilization high, throughput high, and the carbon footprint low. Decision variables include processor utilization, throughput, locality cost, and real-time deadlines. Simulation results of a first-in-first-out load balancer and the genetic algorithm load balancer are compared under similar circumstances.
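    An illustrative sketch of the comparison (all numbers assumed): a FIFO round-robin baseline against a GA-evolved mapping, scalarizing two of the paper's decision variables, makespan and locality cost, into one fitness. A full Pareto treatment, as the paper proposes, would keep the objectives separate.

```python
# FIFO vs. GA job-to-node mapping under a weighted-sum fitness (toy data).
import random

random.seed(5)
jobs = [6, 2, 8, 4, 5, 3]                  # job lengths (made up)
locality = [[0, 2], [1, 0], [2, 1],        # locality cost of job j on node n
            [0, 1], [2, 0], [1, 1]]        # (made up)
N_NODES = 2

def fitness(assign):
    load = [0.0] * N_NODES
    loc = 0
    for j, n in enumerate(assign):
        load[n] += jobs[j]
        loc += locality[j][n]
    return max(load) + 0.5 * loc           # weighted-sum scalarization

fifo = [j % N_NODES for j in range(len(jobs))]   # FIFO round-robin baseline

pop = [[random.randrange(N_NODES) for _ in jobs] for _ in range(20)]
for _ in range(60):                        # simple mutation-only GA
    pop.sort(key=fitness)
    pop = pop[:10] + [
        [n if random.random() > 0.2 else random.randrange(N_NODES)
         for n in random.choice(pop[:10])]
        for _ in range(10)
    ]

print("FIFO:", fitness(fifo), "GA:", fitness(min(pop, key=fitness)))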

    Risk-Aware Planning for Sensor Data Collection

    With the emergence of low-cost unmanned air vehicles, civilian and military organizations are quickly identifying new applications for affordable, large-scale collectives that support and augment human efforts via sensor data collection. To be viable, these collectives must be resilient to the risk and uncertainty of operating in real-world environments. Previous work in multi-agent planning has avoided planning for the loss of agents in risky environments. In contrast, this dissertation presents a problem formulation that includes the risk of losing agents and the effect of those losses on the mission being executed, and provides anticipatory planning algorithms that consider risk. We conduct a thorough analysis of the effects of risk on path-based planning, motivating new solution methods. We then use hierarchical clustering to generate risk-aware plans for a variable number of agents, outperforming traditional planning methods. Next, we provide a mechanism for distributed negotiation of stable plans, utilizing coalitional game theory to provide cost allocation methods that we prove to be fair and stable. Centralized planning with redundancy is then explored, planning for parallel task completion to mitigate risk and further increase expected value. Finally, we explore the role of cost uncertainty as an additional source of risk, using bi-objective optimization to generate sets of alternative plans. We demonstrate the capability of our algorithms on randomly generated problem instances, showing an improvement over traditional multi-agent planning methods as high as 500% on very large problem instances.
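    A minimal sketch of the core risk-aware idea, with the sites, rewards, and loss probabilities invented: a plan is valued by expected reward given that the agent may be lost on any leg, rather than by assuming agents always survive. Risk-aware orderings consequently visit safe, valuable sites before risky ones.

```python
# Expected value of visit orderings under per-leg agent-loss risk (toy data).
from itertools import permutations

tasks = {"A": 10.0, "B": 6.0, "C": 8.0}     # reward for visiting each site
loss_p = {"A": 0.3, "B": 0.05, "C": 0.1}    # chance the agent is lost en route

def expected_value(order):
    """Reward accrues only while the agent survives each successive leg."""
    alive, ev = 1.0, 0.0
    for site in order:
        alive *= 1.0 - loss_p[site]         # survive the leg to this site
        ev += alive * tasks[site]           # collect reward if still alive
    return ev

best = max(permutations(tasks), key=expected_value)
print(best, round(expected_value(best), 2))  # ('B', 'C', 'A'): risky site last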