322 research outputs found
Sustainable resource allocation for power generation: The role of big data in enabling interindustry architectural innovation
Economic, social and environmental requirements make planning for a sustainable electricity generation mix a demanding endeavour. Technological innovation offers a range of renewable generation and energy management options, but these require fine tuning and accurate control to be successful, which calls for the use of large-scale, detailed datasets. In this paper, we focus on the UK and use Multi-Criteria Decision Making (MCDM) to evaluate electricity generation options against technical, environmental and social criteria. Data incompleteness and redundancy, common in large-scale datasets, as well as ambiguity in expert opinion, are dealt with using a comprehensive grey TOPSIS model. We then used the evaluation scores to develop a multi-objective optimization model that maximizes the technical, environmental and social utility of the electricity generation mix and enables a larger role for innovative technologies. Demand uncertainty was handled with an interval range, and the problem was formulated as a multi-objective grey linear programming (MOGLP) model. Solving the mathematical model provided the electricity generation mix for every 5 min of the period under study. Our results indicate that nuclear and renewable energy options, specifically wind, solar and hydro, but not biomass, perform better against all criteria, indicating that interindustry architectural innovation in the power generation mix is key to sustainable UK electricity production and supply
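The grey TOPSIS model itself is not given in the abstract; as a rough illustration of the ranking step, a minimal crisp TOPSIS (no grey numbers, with invented criteria, weights and scores) can be sketched as:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution.

    matrix  : rows = alternatives, columns = criteria
    weights : criterion weights summing to 1
    benefit : True if larger is better for that criterion
    """
    n = len(weights)
    # vector-normalise each criterion column, then apply weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n)] for row in matrix]
    # ideal (best) and anti-ideal (worst) value per criterion
    ideal = [max(r[j] for r in v) if benefit[j] else min(r[j] for r in v) for j in range(n)]
    anti = [min(r[j] for r in v) if benefit[j] else max(r[j] for r in v) for j in range(n)]
    # closeness coefficient in [0, 1]; higher means closer to the ideal
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# hypothetical generation options scored on capacity factor (benefit)
# and lifecycle CO2 intensity (cost); illustrative numbers only
options = ["wind", "solar", "biomass"]
scores = topsis([[0.9, 0.1], [0.5, 0.5], [0.2, 0.9]],
                weights=[0.5, 0.5], benefit=[True, False])
```

An option that is best on every criterion coincides with the ideal point and receives a closeness score of 1; the grey variant used in the paper would replace the crisp entries with interval numbers.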
Optimization Models Using Fuzzy Sets and Possibility Theory
Optimization is of central concern to a number of disciplines. Operations Research and Decision Theory are often considered to be identical with optimization. But in other areas too, such as engineering design, regional policy and logistics, the search for optimal solutions is one of the prime goals. The methods and models which have been used in these areas over the last decades have primarily been "hard" or "crisp": solutions were considered to be either feasible or infeasible, either above a certain aspiration level or below it. This dichotomous structure very often forced the modeler to approximate real problem situations of the more-or-less type by yes-or-no-type models, whose solutions might turn out not to be solutions to the real problems. This is particularly true if the problem under consideration includes vaguely defined relationships, human evaluations, or uncertainty due to inconsistent or incomplete evidence, if natural language has to be modeled, or if state variables can only be described approximately.
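The crisp-versus-fuzzy distinction can be made concrete with a toy budget constraint; the numbers and the linear decay below are illustrative assumptions, not taken from the book:

```python
def crisp_feasible(cost, budget):
    # dichotomous view: a solution is either feasible or infeasible
    return cost <= budget

def fuzzy_feasible(cost, budget, tolerance):
    # degree of feasibility in [0, 1]: fully feasible within budget,
    # decaying linearly to 0 across an acceptable tolerance interval
    if cost <= budget:
        return 1.0
    if cost >= budget + tolerance:
        return 0.0
    return (budget + tolerance - cost) / tolerance
```

A plan costing 105 against a budget of 100 with a tolerance of 20 is simply infeasible in the crisp model, but feasible to degree 0.75 in the fuzzy one, capturing the "more-or-less" character of the real constraint.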
Until recently, everything which was not known with certainty, i.e. which was not known to be either true or false, or which was not known either to happen with certainty or to be impossible, was modeled by means of probabilities. This holds in particular for uncertainties concerning the occurrence of events. Probability theory was used irrespective of whether its axioms (such as, for instance, the law of large numbers) were satisfied, or whether the "events" could really be described unequivocally and crisply.
In the meantime it has become apparent that uncertainties concerning both the occurrence and the description of events ought to be modeled in a much more differentiated way. New concepts and theories have been developed for this purpose: the theory of evidence, possibility theory and the theory of fuzzy sets have been advanced to a stage of remarkable maturity and have already been applied successfully in numerous cases and in many areas. Unfortunately, progress in these areas has been so fast in recent years that it has not been documented in a way which makes the results easily accessible and understandable for newcomers: textbooks have not been able to keep up with the speed of new developments, and edited volumes have been published which are very useful for specialists but of very little use to nonspecialists, because they assume too much background in fuzzy set theory. To a certain degree the same is true of the existing professional journals in the area of fuzzy set theory.
Altogether, this volume is a very important and valuable contribution to the literature on fuzzy set theory
Computational Intelligence in Healthcare
This book is a printed edition of the Special Issue Computational Intelligence in Healthcare that was published in Electronic
Computational Intelligence in Healthcare
The volume of patient health data was estimated to reach 2,314 exabytes by 2020. Traditional data analysis techniques are unsuitable for extracting useful information from such a vast quantity of data, so intelligent data analysis methods that combine human expertise and computational models for accurate, in-depth analysis are necessary. The technological revolution and medical advances made possible by combining vast quantities of available data, cloud computing services and AI-based solutions can provide expert insight and analysis on a mass scale and at relatively low cost. Computational intelligence (CI) methods, such as fuzzy models, artificial neural networks, evolutionary algorithms and probabilistic methods, have recently emerged as promising tools for the development and application of intelligent systems in healthcare practice. CI-based systems can learn from data and evolve with changes in their environment, taking into account the uncertainty characterizing health data, including omics, clinical, sensor and imaging data. The use of CI in healthcare can improve the processing of such data to develop intelligent solutions for prevention, diagnosis, treatment and follow-up, as well as for the analysis of administrative processes. The present Special Issue on computational intelligence for healthcare is intended to show the potential and the practical impact of CI techniques in challenging healthcare applications
Application of Optimization in Production, Logistics, Inventory, Supply Chain Management and Block Chain
Industrial development, evolving since the 18th century, is now experiencing its fourth revolution, whose effects have propagated into almost every sector of industry. From inventory management to the circular economy, technology has proved fruitful for industry. This book covers recent research trends, with new ideas and methodologies. Several new ideas and business strategies are developed in the areas of supply chain management, logistics, optimization and forecasting, for the benefit of the economy, society and the environment. The proposed technologies and ideas are either novel in themselves or help refine other ideas. Real-life problems of different kinds and dimensions are discussed so that readers may connect with current issues in society and industry. The collection of articles provides a glimpse into new research trends in technology, business and the environment
Decision Maps for Distributed Scenario-Based Multi-Criteria Decision Support
This thesis presents the Decision Map approach to support decision-makers facing complex, uncertain problems that defy standardised solutions. First, scenarios are generated in a distributed manner: the reasoning processes can be adapted to the problem at hand whilst respecting constraints on the time and availability of experts. Second, by integrating scenarios and MCDA, the approach facilitates robust decision-making that respects multiple criteria in a transparent, well-structured manner
Fuzzy logic based approach for object feature tracking
This thesis introduces a novel technique for feature tracking in sequences of
greyscale images based on fuzzy logic. A versatile and modular methodology
for feature tracking using fuzzy sets and inference engines is presented.
Moreover, an extension of this methodology to perform the correct tracking
of multiple features is also presented.
To perform feature tracking, three membership functions are initially defined: one related to the distinctive property of the feature to be tracked, one expressing the assumption that the feature moves smoothly between consecutive images of the sequence, and one concerning the feature's expected future location. Applying these functions to the image pixels yields the corresponding fuzzy sets, which are then mathematically manipulated to serve as input to an inference engine.
Situations such as occlusion or failure to detect a feature are overcome using estimated positions calculated from a motion model and a state vector of the feature.
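The exact membership functions are not specified in the abstract; a minimal sketch of the idea, with assumed triangular memberships over the three criteria and a min operator as the fuzzy AND, might look like:

```python
def tri(x, a, b, c):
    # triangular membership function peaking at b, zero outside (a, c)
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def track_score(grey_level, displacement, dist_to_prediction):
    # three memberships mirroring the thesis's three criteria;
    # all breakpoints below are assumed for illustration
    mu_feature = tri(grey_level, 100, 128, 156)        # distinctive grey value near 128
    mu_smooth = tri(displacement, -1, 0, 6)            # small frame-to-frame motion
    mu_predicted = tri(dist_to_prediction, -1, 0, 10)  # close to the predicted location
    return min(mu_feature, mu_smooth, mu_predicted)    # fuzzy AND of the three sets

# the candidate pixel maximising the score is taken as the feature's new position
candidates = [(128, 0.0, 0.0), (114, 3.0, 5.0), (60, 1.0, 1.0)]
best = max(candidates, key=lambda p: track_score(*p))
```

A real inference engine would aggregate such degrees through a rule base rather than a single min, but the principle of turning pixel evidence into fuzzy set memberships is the same.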
This methodology was first applied to track a single feature identified by the user. Several performance tests were conducted on sequences of both synthetic and real images, and the experimental results are presented, analysed and discussed. Although this methodology could be applied directly to multiple-feature tracking, an extension of it has been developed for that purpose. In this new method, the processing sequence of the features is dynamic and hierarchical: dynamic because the sequence can change over time, and hierarchical because features with higher priority are processed first. The process thus gives preference to features whose location is easier to predict over features whose behaviour is less predictable. When a feature's priority value becomes too low, it is no longer tracked by the algorithm. To assess the performance of this new approach, sequences of images in which several user-specified features are to be tracked were used.
In the final part of this work, the conclusions drawn from it are presented, together with some guidelines for future research
Models and Algorithms for the Optimisation of Replenishment, Production and Distribution Plans in Industrial Enterprises
Optimisation in manufacturing companies is especially important, due to the large investments they make, as sometimes these investments do not obtain the expected return because the profit margins of products are very tight. Therefore, companies seek to maximise the use of productive and financial resources by minimising lost time and, at the same time, improving process flows while meeting market needs.
The planning process is a critical activity for companies. This task involves great challenges due to market changes, alterations in production processes within the company and in the supply chain, and changes in legislation, among others.
Planning of replenishment, production and distribution plays a critical role in the performance of manufacturing companies, because ineffective planning of suppliers, production processes and distribution systems leads to higher product costs, longer lead times and lower profits. Effective planning is a complex process that encompasses a wide range of activities to ensure that equipment, materials and human resources are available at the right time and in the right place.
Motivated by the complexity of planning in manufacturing companies, this thesis studies and develops quantitative tools to help planners in the replenishment, production and delivery planning processes. From this perspective, realistic models and efficient methods are proposed to support decision making in industrial companies, mainly in small- and medium-sized enterprises (SMEs).
The contributions of this thesis represent a scientific breakthrough based on a comprehensive literature review of replenishment, production and distribution planning that helps to understand the main models and algorithms used to solve these plans, and highlights trends and future research directions. It also provides a holistic framework to characterise models and algorithms by focusing on production planning, scheduling and sequencing. This thesis also proposes a decision-support tool for selecting an algorithm or solution method for concrete replenishment, production and distribution planning problems according to their complexity, which saves planners from duplicating modelling or programming effort. Finally, new state-of-the-art mathematical models and solution approaches are developed, such as matheuristic algorithms, which combine mathematical programming and metaheuristic techniques.
The new models and algorithms offer improved computational performance and include realistic features of the real-world problems faced by manufacturing companies. The mathematical models have been validated on a case from an important company in the Spanish automotive sector, which allowed us to evaluate their practical relevance using large instances, similar to those existing in the company under study. In addition, the matheuristic algorithms have been tested using free and open-source tools. This also contributes to the practice of operations research, providing insight into how to deploy these solution methods and into the computational time and gap performance that can be obtained using free or open-source software. This work would not have been possible without the following funding sources: Conselleria de Educación, Investigación, Cultura y Deporte, Generalitat Valenciana, for hiring predoctoral research staff (Grant ACIF/2018/170), together with the European Social Fund under the Operational Programme of FSE 2014-2020; and Conselleria de Educación, Investigación, Cultura y Deporte, Generalitat Valenciana, for predoctoral students' research stays at centres outside the Valencian Community (BEFPI/2021/040), together with the European Social Fund. Guzmán Ortiz, BE. (2022). Models and Algorithms for the Optimisation of Replenishment, Production and Distribution Plans in Industrial Enterprises [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/187461
Soft computing for tool life prediction a manufacturing application of neural - fuzzy systems
Tooling technology is recognised as an element of vital importance within the manufacturing industry. Critical tooling decisions related to tool selection, tool life management, optimal determination of cutting conditions and on-line machining process monitoring and control are based on the existence of reliable, detailed process models. Among the decisive factors of process planning and control activities, tool wear and tool life considerations hold a dominant role, yet both off-line tool life prediction and real-time tool wear identification and prediction are still open research issues. The main reason lies in the large number of factors influencing tool wear, some of them stochastic in nature. The inherent variability of workpiece materials, cutting tools and machine characteristics further increases the uncertainty of the machining optimisation problem. In machining practice, tool life prediction is based on data provided by tool manufacturers, machining data handbooks or the shop floor. This thesis recognises the need for a data-driven, flexible and yet simple approach to predicting tool life. Model building from sample data depends on the availability of a sufficiently rich cutting data set; flexibility requires a tool-life model with high adaptation capacity; simplicity calls for a solution of low complexity that is easily interpretable by the user. A neural-fuzzy systems approach is adopted, which meets these targets and predicts tool life for a wide range of turning operations. A literature review has been carried out, covering areas such as tool wear and tool life, neural networks, fuzzy set theory and neural-fuzzy systems integration. Various sources of tool life data have been examined, and it is concluded that a combined use of simulated data from existing tool life models and real-life data is the best policy to follow.
The neurofuzzy tool life model developed is constructed by employing neural network-like learning algorithms. The trained model stores the learned knowledge on its structure in the form of fuzzy IF-THEN rules, thus featuring the desired transparency. Low model complexity is ensured by employing an algorithm which constructs a rule base of reduced size from the available data. In addition, the flexibility of the developed model is demonstrated by the ease, speed and efficiency of its adaptation on the basis of new tool life data. The development of the neurofuzzy tool life model is based on the Fuzzy Logic Toolbox (v1.0) of MATLAB (v4.2c1), a dedicated tool which facilitates the design and evaluation of fuzzy logic systems. Extensive results are presented which demonstrate the neurofuzzy model's predictive performance. The model can be directly employed within a process planning system, facilitating the optimisation of turning operations. Recommendations are made for further enhancements in this direction
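As a rough, self-contained illustration of a fuzzy IF-THEN rule base of this kind (not the thesis's actual model: the memberships, rule consequents and cutting-speed figures below are invented), a zero-order Sugeno system predicting tool life from cutting speed can be written as:

```python
import math

def gauss(x, centre, sigma):
    # Gaussian membership function
    return math.exp(-((x - centre) ** 2) / (2 * sigma ** 2))

def predicted_tool_life(speed):
    """Zero-order Sugeno model with two fuzzy IF-THEN rules:
         IF speed is LOW  THEN tool life = 60 min
         IF speed is HIGH THEN tool life = 15 min
    Output is the firing-strength-weighted average of the consequents.
    In a neuro-fuzzy system, the centres, widths and consequents would
    be tuned from cutting data by a neural-style learning algorithm.
    """
    w_low = gauss(speed, 100.0, 40.0)   # firing strength of rule 1
    w_high = gauss(speed, 300.0, 40.0)  # firing strength of rule 2
    return (w_low * 60.0 + w_high * 15.0) / (w_low + w_high)
```

Because the rules are explicit, the model stays interpretable: at a speed near a rule's centre its consequent dominates, and halfway between the two centres the prediction is the average of the two consequents.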