18 research outputs found

    Multicriteria pathfinding in uncertain simulated environments

    Get PDF
    Dr. James Keller, Dissertation Supervisor. Includes vita. Field of study: Electrical and computer engineering. "May 2018." Includes bibliographical references (pages 294-301).
    Multicriteria decision-making problems arise in all aspects of daily life and form the basis upon which high-level models of thought and behavior are built. These problems present various alternatives to a decision-maker, who must evaluate the trade-offs between each one and choose a course of action. In a sequential decision-making problem, each choice can influence which alternatives are available for subsequent actions, requiring the decision-maker to plan ahead in order to satisfy a set of objectives. These problems become more difficult, but more realistic, when information is restricted, either through partial observability or by approximate representations. Pathfinding in partially observable environments is one significant context in which a decision-making agent must develop a plan of action that satisfies multiple criteria. In general, the partially observable multiobjective pathfinding problem requires an agent to navigate to certain goal locations in an environment with various attributes that may be partially hidden, while minimizing a set of objective functions. To solve these types of problems, we create agent models based on the concept of a mental map that represents the agent's most recent spatial knowledge of the environment, using fuzzy numbers to represent uncertainty. We develop a simulation framework that facilitates the creation and deployment of a wide variety of environment types, problem definitions, and agent models. This computational mental map (CMM) framework is shown to be suitable for studying various types of sequential multicriteria decision-making problems, such as the shortest path problem, the traveling salesman problem, and the traveling purchaser problem in multiobjective and partially observable configurations.
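The mental-map idea above can be illustrated with a minimal sketch (not the thesis's CMM framework; the graph, the triangular fuzzy costs, and the centroid-ranking choice are all assumptions made here for illustration): edge costs are stored as triangular fuzzy numbers and a standard Dijkstra search ranks them by their centroids, one common fuzzy-ranking choice.

```python
import heapq

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number (lo, mode, hi)."""
    lo, mode, hi = tfn
    return (lo + mode + hi) / 3.0

def fuzzy_shortest_path(graph, start, goal):
    """Dijkstra over edges whose costs are triangular fuzzy numbers,
    compared via their centroids."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, cost in graph.get(u, []):
            nd = d + centroid(cost)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], goal          # reconstruct the route backwards
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[goal]

# Edge costs (lo, mode, hi) stand in for the agent's uncertain estimates.
graph = {
    "A": [("B", (1, 2, 3)), ("C", (4, 5, 6))],
    "B": [("D", (8, 9, 10))],
    "C": [("D", (1, 1, 1))],
}
path, cost = fuzzy_shortest_path(graph, "A", "D")
print(path, cost)
```

Centroid ranking collapses the uncertainty early; richer fuzzy-ranking schemes keep more of it, at the price of a more involved comparison.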

    NBI-EQMM method with multivariate constraints for optimization of the hard turning process.

    Get PDF
    This thesis presents the development and evaluation of the NBI-EQMM method with multivariate constraints for large-scale nonlinear multiobjective optimization problems with correlated objective functions and constraints. In the method, the selection of the functions that make up each group is performed by applying Hierarchical Cluster Analysis (AHC) assisted by a distance matrix. To test the suitability of the proposal, a central composite design (CCD) with 3 input variables (x) and 22 responses (Y) was developed to optimize the turning process of hardened ABNT H13 steel, machined with Wiper PCBN 7025AWG, CC 6050WG and CC 650WG tools. The 22 response surfaces were defined so that the problem could cover five dimensions of a real, industrial-scale process: quality, cost, productivity, economic and financial viability, and sustainability. The results obtained indicate that the NBI-EQMM method with multivariate equality and inequality constraints contributed to the formation of equispaced frontiers without inversion of the correlation signs of the original responses, driving all responses toward values close to their targets without violating the pre-established multivariate constraints. It was observed that including the multivariate constraints in the computation of the payoff matrix allows the Pareto frontier to be rescaled, bringing the optimal solutions obtained closer to their individual optima and preventing Pareto-optimal solutions outside the feasible region from being obtained. It was also observed that when the axes of the Pareto frontier are formed by responses that are positively correlated with the same optimization direction, or negatively correlated with opposite optimization directions, the bivariate NBI method fails, corroborating the need for the proposed NBI-EQMM method.
Considering that an important factor for the competitiveness of organizations is large-scale manufacturing at minimum cost, combined with quality standards compatible with those demanded by customers, the CC 6050WG tool was able to meet all of these characteristics simultaneously and is therefore considered the most efficient among the tools analyzed in this thesis.
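For readers unfamiliar with NBI, the sketch below shows the plain bivariate method that the thesis extends, on an assumed toy problem and with a grid-search surrogate in place of a constrained solver (the test functions, grid, and quasi-normal are illustrative assumptions, not the thesis's 22-response model): a payoff matrix is built from the individual-objective minimizers, evenly spaced points are placed on the utopia (CHIM) line, and each point is pushed along the quasi-normal toward the Pareto front.

```python
import numpy as np

def F(x):
    """Toy bi-objective problem: both objectives minimized over x in [0, 2]."""
    return np.array([x**2, (x - 2.0)**2])

xs = np.linspace(0.0, 2.0, 2001)
# Payoff matrix: each column is F evaluated at one objective's minimizer.
x1_star = xs[np.argmin([F(x)[0] for x in xs])]    # minimizes f1 (x = 0)
x2_star = xs[np.argmin([F(x)[1] for x in xs])]    # minimizes f2 (x = 2)
Phi = np.column_stack([F(x1_star), F(x2_star)])   # utopia point is (0, 0) here

n_hat = np.array([-1.0, -1.0]) / np.sqrt(2.0)     # quasi-normal to the CHIM

front = []
for b in np.linspace(0.0, 1.0, 11):
    p = Phi @ np.array([b, 1.0 - b])              # point on the utopia line
    # Grid-search surrogate for the NBI subproblem: keep the candidate
    # whose image lies closest to the ray {p + t * n_hat}.
    best = min(xs, key=lambda x: np.linalg.norm(
        (F(x) - p) - ((F(x) - p) @ n_hat) * n_hat))
    front.append(F(best))

for f1, f2 in front:
    print(f"({f1:.3f}, {f2:.3f})")
```

The even spacing of the CHIM points is what gives NBI its characteristically well-distributed frontier; the thesis's contribution is adding multivariate equality and inequality constraints to this construction.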

    The bi-objective travelling salesman problem with profits and its connection to computer networks.

    Get PDF
    This is an interdisciplinary work in Computer Science and Operational Research. As is well known, these two important research fields are closely connected, and one of the main areas where this interplay is most evident is networking. As network connections of every kind have grown constantly over recent decades, the need for advanced algorithms that help optimize network performance has become extremely relevant. Classical optimization-based approaches have long been studied and applied; however, technological evolution calls for more flexible and advanced algorithmic approaches to model increasingly complex network configurations. In this thesis we study an extension of the well-known Traveling Salesman Problem (TSP): the Traveling Salesman Problem with Profits (TSPP). In this generalization, a profit is associated with each vertex and it is not necessary to visit all vertices. The goal is to determine a route through a subset of nodes that simultaneously minimizes the travel cost and maximizes the collected profit. The TSPP models the problem of sending a piece of information through a network where, in addition to the sending costs, it is also important to consider what "profit" this information can collect during its routing. Because of this formulation, the natural way to tackle the TSPP is with multiobjective optimization algorithms. Within this context, the aim of this work is to study new ways to solve the problem in both the exact and the approximate settings, to provide practical instruments that help solve it, and to offer experimental insights into feasible networking instances.
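The bi-objective trade-off described above can be made concrete with a brute-force sketch (the 4-node instance is an assumption made for illustration; realistic instances need the exact and approximate methods the thesis develops): enumerate every closed tour over every subset of vertices and keep the (cost, profit) Pareto set.

```python
from itertools import combinations, permutations

def pareto_tsp_with_profits(dist, profit, depot=0):
    """Enumerate closed tours over every subset of vertices (depot fixed)
    and keep the Pareto set for (minimize cost, maximize profit)."""
    n = len(dist)
    others = [v for v in range(n) if v != depot]
    candidates = []
    for r in range(len(others) + 1):
        for subset in combinations(others, r):
            for order in permutations(subset):
                tour = (depot,) + order
                cost = sum(dist[tour[i]][tour[i + 1]]
                           for i in range(len(tour) - 1))
                cost += dist[tour[-1]][depot]        # close the tour
                gain = sum(profit[v] for v in tour)
                candidates.append((cost, gain, tour))
    # Pareto filter: drop any tour beaten on one objective and
    # matched on the other.
    pareto = [c for c in candidates
              if not any(o[0] <= c[0] and o[1] >= c[1] and
                         (o[0] < c[0] or o[1] > c[1]) for o in candidates)]
    return sorted(pareto)

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 8],
        [10, 4, 8, 0]]
profit = [0, 5, 8, 3]
pareto = pareto_tsp_with_profits(dist, profit)
for cost, gain, tour in pareto:
    print(cost, gain, tour)
```

Even this toy instance shows why no single tour is "the" answer: the empty tour is cheapest, the full tour is most profitable, and several trade-offs lie between.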

    User-Oriented Methodology and Techniques of Decision Analysis and Support

    Get PDF
    This volume contains 26 papers selected from Workshop presentations. The book is divided into two sections: the first is devoted to the methodology of decision analysis and support and related theoretical developments, and the second reports on the development of tools -- algorithms, software packages -- for decision support, as well as on their applications. Several major contributions -- on constructing user interfaces, on organizing intelligent DSS, and on modifying theory and tools in response to user needs -- are included in this volume.

    Journal of Telecommunications and Information Technology, 2003, nr 3

    Get PDF
    Quarterly journal.

    Modeling and optimization of turn-milling processes for cutting parameter selection

    Get PDF
    Turn-milling is a relatively new machining process technology offering important advantages such as increased productivity, reduced tool wear, and better surface finish. Because two conventional cutting processes, turning and milling, are combined in turn-milling, many parameters affect the process, making their optimal selection challenging. Optimization studies on turn-milling processes are very limited and consider one objective at a time. In this work, orthogonal turn-milling is considered, where spindle and work rotational speeds, cutter (tool-work axes) offset, depth of cut, and feed per revolution are selected as process parameters. The effects of each parameter on tool wear, surface roughness, circularity, cusp height, material removal rate (MRR), and cutting forces were investigated through process-model-based simulations and experiments carried out on a multi-tasking CNC machine tool. Tool life and surface roughness are formulated including cutter offset for the first time in this work. Also for the first time, the turn-milling process is defined as a multi-objective problem, and an effective method is proposed to handle this optimization problem. Minimum surface error, minimum production cost, and minimum production time are targeted simultaneously, and results are generated for the selection of optimal cutting process parameters. After the optimal parameter sets are found, they are compared in machining tests with the parameters proposed by tool suppliers. In addition, the orthogonal turn-milling process is compared comprehensively with the conventional turning process in order to demonstrate the process advantages.
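The cost/time conflict underlying such studies can be illustrated with classic single-pass turning economics (a conventional-turning baseline of the kind the thesis compares against, not its turn-milling models; the Taylor constants and cost rates below are illustrative assumptions): raising the cutting speed shortens machining time but also shortens tool life, so tool-related cost rises.

```python
# Classic single-pass turning economics; every constant below is an
# illustrative assumption, not a value from this work.
C, n = 400.0, 0.25        # Taylor tool-life law: Vc * T**n = C
machine_rate = 1.0        # machine + operator cost per minute
tool_cost = 15.0          # cost per cutting edge
tool_change_time = 2.0    # minutes per tool change
k = 5000.0                # part-geometry constant: t_m = k / Vc

def per_part(Vc):
    """Production time and cost per part at cutting speed Vc (m/min)."""
    t_m = k / Vc                      # machining time falls with speed
    T = (C / Vc) ** (1.0 / n)         # Taylor tool life (minutes)
    changes = t_m / T                 # tool changes needed per part
    time = t_m + tool_change_time * changes
    cost = machine_rate * time + tool_cost * changes
    return time, cost

for Vc in (100, 150, 200, 250, 300):
    t, c = per_part(Vc)
    print(f"Vc={Vc}: time={t:.2f} min, cost={c:.2f}")
```

With these numbers, cost is minimized near the middle of the speed range while time keeps falling as speed rises, so the two objectives genuinely conflict, which is exactly why a multi-objective formulation is needed.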

    Enhanced Harris's Hawk algorithm for continuous multi-objective optimization problems

    Get PDF
    Multi-objective swarm intelligence-based (MOSI-based) metaheuristics were proposed to solve multi-objective optimization problems (MOPs) with conflicting objectives. The Harris's hawk multi-objective optimizer (HHMO) algorithm is a MOSI-based algorithm that was developed based on the reference point approach. The reference point is determined by the decision maker to guide the search process to a particular region of the true Pareto front. However, the HHMO algorithm produces a poor approximation to the Pareto front because of a lack of information sharing in its population update strategy, the equal division of its convergence parameter, and its randomly generated initial population. A two-step enhanced non-dominated sorting HHMO (2S-ENDSHHMO) algorithm has been proposed to solve this problem. The algorithm includes (i) a population update strategy that improves the movement of hawks in the search space, (ii) a parameter adjusting strategy to control the transition between exploration and exploitation, and (iii) a population generating method for producing the initial candidate solutions. The population update strategy calculates a new position of the hawks based on the flush-and-ambush technique of Harris's hawks, and selects the best hawks based on the non-dominated sorting approach. The adjustment strategy enables the parameter to change adaptively based on the state of the search space. The initial population is produced by generating quasi-random numbers using the R-sequence, followed by adapting the partial opposition-based learning concept to improve the diversity of the worst half of the population of hawks. The performance of the 2S-ENDSHHMO has been evaluated using 12 MOPs and three engineering MOPs. The obtained results were compared with the results of eight state-of-the-art multi-objective optimization algorithms.
The 2S-ENDSHHMO algorithm was able to generate non-dominated solutions with greater convergence and diversity in solving most MOPs and showed a great ability in jumping out of local optima. This indicates the capability of the algorithm in exploring the search space. The 2S-ENDSHHMO algorithm can be used to improve the search process of other MOSI-based algorithms and can be applied to solve MOPs in applications such as structural design and signal processing
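The non-dominated sorting step named above can be sketched as follows (a generic NSGA-II-style implementation, not the authors' code; the sample points are illustrative): objective vectors are partitioned into fronts, where front 1 contains the solutions dominated by nothing, front 2 those dominated only by front 1, and so on.

```python
def non_dominated_sort(points):
    """Fast non-dominated sorting: partition objective vectors
    (all objectives minimized) into fronts F1, F2, ..."""
    n = len(points)
    dominates = lambda a, b: all(x <= y for x, y in zip(a, b)) and a != b
    S = [set() for _ in range(n)]     # indices each solution dominates
    counts = [0] * n                  # how many solutions dominate i
    for i in range(n):
        for j in range(n):
            if dominates(points[i], points[j]):
                S[i].add(j)
            elif dominates(points[j], points[i]):
                counts[i] += 1
    fronts, current = [], [i for i in range(n) if counts[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:             # peel off the current front
            for j in S[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        current = nxt
    return fronts

pts = [(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]
fronts = non_dominated_sort(pts)
print(fronts)
```

In a hawk-selection step of the kind the abstract describes, survivors would be drawn front by front until the population is filled.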

    Numerical and Evolutionary Optimization 2020

    Get PDF
    This book was established after the 8th International Workshop on Numerical and Evolutionary Optimization (NEO), representing a collection of papers on the intersection of the two research areas covered at this workshop: numerical optimization and evolutionary search techniques. While the focus is on the design of fast and reliable methods lying across these two paradigms, the resulting techniques are strongly applicable to a broad class of real-world problems, such as pattern recognition, routing, energy, production lines, prediction, and modeling, among others. This volume is intended to serve as a useful reference for mathematicians, engineers, and computer scientists exploring current issues and solutions emerging from these mathematical and computational methods and their applications.

    Optimization Models Using Fuzzy Sets and Possibility Theory

    Get PDF
    Optimization is of central concern to a number of disciplines. Operations Research and Decision Theory are often considered to be identical with optimization. But also in other areas, such as engineering design, regional policy, and logistics, the search for optimal solutions is one of the prime goals. The methods and models used over the last decades in these areas have primarily been "hard" or "crisp", i.e. solutions were considered to be either feasible or infeasible, either above a certain aspiration level or below it. This dichotomous structure very often forced the modeler to approximate real problem situations of the more-or-less type by yes-or-no-type models, the solutions of which might turn out not to be solutions to the real problems. This is particularly true if the problem under consideration includes vaguely defined relationships, human evaluations, or uncertainty due to inconsistent or incomplete evidence, if natural language has to be modeled, or if state variables can only be described approximately. Until recently, everything which was not known with certainty, i.e. which was not known to be either true or false, or which was not known to either happen with certainty or to be impossible, was modeled by means of probabilities. This holds in particular for uncertainties concerning the occurrence of events. Probability theory was used irrespective of whether its axioms (such as, for instance, the law of large numbers) were satisfied, or whether the "events" could really be described unequivocally and crisply. In the meantime, it has become apparent that uncertainties concerning the occurrence as well as the description of events ought to be modeled in a much more differentiated way.
New concepts and theories have been developed to do this: the theory of evidence, possibility theory, and the theory of fuzzy sets have advanced to a stage of remarkable maturity and have already been applied successfully in numerous cases and in many areas. Unfortunately, progress in these areas has been so fast in recent years that it has not been documented in a way that makes the results easily accessible and understandable for newcomers: textbooks have not been able to keep up with the speed of new developments, and edited volumes have been published that are very useful for specialists but of little use to nonspecialists, because they assume too much background in fuzzy set theory. To a certain degree the same is true of the existing professional journals in the area of fuzzy set theory. Altogether, this volume is an important and appreciable contribution to the literature on fuzzy set theory.
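A minimal example of the "soft" modeling this review describes, in the Bellman-Zadeh style of fuzzy decision making (the membership functions below are illustrative assumptions): the fuzzy decision is the intersection (min) of the fuzzy goal and fuzzy constraint memberships, and the best alternative maximizes that minimum, replacing a crisp feasible/infeasible split with graded satisfaction.

```python
def mu_goal(x):
    """Fuzzy goal: 'x should be substantially larger than 10'."""
    return 0.0 if x <= 10 else min(1.0, (x - 10) / 10.0)

def mu_constraint(x):
    """Fuzzy constraint: 'x should be roughly at most 15'."""
    return 1.0 if x <= 15 else max(0.0, 1.0 - (x - 15) / 5.0)

candidates = range(0, 26)
# Bellman-Zadeh: decision membership = min(goal, constraint).
decision = {x: min(mu_goal(x), mu_constraint(x)) for x in candidates}
best = max(decision, key=decision.get)
print(best, decision[best])
```

A crisp model would either forbid x > 15 outright or ignore the preference for large x; the fuzzy decision instead lands on a compromise where both memberships are reasonably high.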