
    Current Studies and Applications of Krill Herd and Gravitational Search Algorithms in Healthcare

    Nature-Inspired Computing (NIC) is a relatively young field that seeks new methods of computing by studying how natural phenomena solve complex problems in many contexts. This has led to ground-breaking research across a variety of domains, including artificial immune systems, neural networks, swarm intelligence, and evolutionary computing. NIC techniques are used in biology, physics, engineering, economics, and management, and meta-heuristic algorithms have proven successful, efficient, and robust on real-world classification, optimization, forecasting, and clustering tasks, as well as on engineering and science problems. Two active NIC paradigms are the Gravitational Search Algorithm (GSA) and the Krill Herd algorithm (KH). This publication gives a global and historical review of research on the use of KH and GSA in medicine and healthcare. Comprehensive surveys have been conducted on several nature-inspired algorithms, including KH and GSA, but no survey of KH and GSA in the healthcare field had been undertaken. The present article therefore thoroughly reviews the various versions of the KH and GSA algorithms and their applications in healthcare, to assist researchers in applying them in diverse domains or hybridizing them with other popular algorithms, and provides an in-depth examination of KH and GSA in terms of application, modification, and hybridization. The goal of the study is to offer a perspective on GSA and KH, particularly for academics interested in investigating the capabilities and performance of these algorithms in the healthcare and medical domains.
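    As a concrete reference for the physics-inspired half of the pair, the following is a minimal sketch of the core GSA loop as described by Rashedi et al. (2009): agent fitness is mapped to masses, a decaying gravitational constant scales the pairwise attraction, and velocities integrate the resulting accelerations. The function name and the parameter defaults (G0, alpha) are illustrative assumptions, not settings taken from the surveyed literature.

        import numpy as np

        # Minimal Gravitational Search Algorithm sketch (Rashedi et al., 2009).
        # G0 and alpha are illustrative defaults, not values from the survey.
        def gsa_minimize(f, bounds, n_agents=30, n_iter=200, G0=100.0, alpha=20.0):
            lo, hi = np.array(bounds, dtype=float).T
            dim = len(bounds)
            X = lo + np.random.rand(n_agents, dim) * (hi - lo)  # positions
            V = np.zeros_like(X)                                # velocities
            best_x, best_f = None, np.inf
            for t in range(n_iter):
                fit = np.apply_along_axis(f, 1, X)
                if fit.min() < best_f:
                    best_f, best_x = fit.min(), X[fit.argmin()].copy()
                # Fitness -> mass: the best (lowest) fitness gets the largest mass.
                m = (fit - fit.max()) / (fit.min() - fit.max() + 1e-12)
                M = m / (m.sum() + 1e-12)
                G = G0 * np.exp(-alpha * t / n_iter)            # decaying gravity
                for i in range(n_agents):
                    diff = X - X[i]
                    R = np.linalg.norm(diff, axis=1)[:, None] + 1e-12
                    # a_i = sum_j rand * G * M_j * (x_j - x_i) / R_ij  (M_i cancels)
                    a = (np.random.rand(n_agents, 1) * G * M[:, None] * diff / R).sum(axis=0)
                    V[i] = np.random.rand(dim) * V[i] + a
                X = np.clip(X + V, lo, hi)
            return best_x, best_f

        # Example: minimize the sphere function in three dimensions.
        x_best, f_best = gsa_minimize(lambda x: float((x ** 2).sum()), [(-5, 5)] * 3)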

    An Evolutionary Approach to Adaptive Image Analysis for Retrieving and Long-term Monitoring Historical Land Use from Spatiotemporally Heterogeneous Map Sources

    Land use changes have become a major contributor to anthropogenic global change. The ongoing dispersion and concentration of the human species, unprecedented in their magnitude, have indisputably altered Earth's surface and atmosphere. The effects are so salient and irreversible that a new geological epoch, following the interglacial Holocene, has been announced: the Anthropocene. While some scholars date its onset back to the Neolithic revolution, it is commonly placed in the late 18th century. The rapid development since the industrial revolution and its implications gave rise to an increasing awareness of extensive anthropogenic land change and led to an urgent need for sustainable strategies for land use and land management. By preserving landscape and settlement patterns at discrete points in time, archival geospatial data sources such as remote sensing imagery and, in particular, historical geotopographic maps could give evidence of the dynamic land use change during this crucial period. In this context, this thesis set out to explore the potential of retrospective geoinformation for monitoring, communicating, modeling, and eventually understanding the complex and gradually evolving processes of land cover and land use change. Currently, large amounts of geospatial data sources such as archival maps are being made accessible online worldwide by libraries and national mapping agencies. Despite their abundance and relevance, the use of historical land use and land cover information in research is still often hindered by laborious visual interpretation, limiting the temporal and spatial coverage of studies. Thus, the core of the thesis is dedicated to the computational acquisition of geoinformation from archival map sources by means of digital image analysis. Based on a comprehensive review of the literature as well as of the data and proposed algorithms, two major challenges for long-term retrospective information acquisition and change detection were identified: first, the diversity of geographical entity representations over space and time, and second, the uncertainty inherent in both the data source itself and its use for land change detection. To address the former challenge, image segmentation is considered a global non-linear optimization problem, and the segmentation methods and parameters are adjusted using a metaheuristic, evolutionary approach; a toy sketch of this idea follows below. To preserve adaptability in high-level image analysis, a hybrid model- and data-driven strategy, combining a knowledge-based and a neural net classifier, is recommended. To address the second challenge, a probabilistic object- and field-based change detection approach is developed for modeling the positional, thematic, and temporal uncertainty inherent in both data and processing. Experimental results indicate the suitability of the methodology in support of land change monitoring. In conclusion, potential applications and directions for further research are given.
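    The move of treating segmentation parameterization as a global non-linear optimization problem can be illustrated with a toy (mu + lambda) evolution strategy that tunes a single threshold against labeled reference samples. The segmenter and the IoU fitness below are deliberately simplified stand-ins, not the dissertation's actual pipeline.

        import random

        # Toy illustration: tune a segmentation parameter (here one threshold)
        # with a (mu + lambda) evolution strategy, scoring candidates by mean
        # intersection-over-union (IoU) against labeled reference samples.
        def iou(pred, truth):
            inter = sum(p and t for p, t in zip(pred, truth))
            union = sum(p or t for p, t in zip(pred, truth))
            return inter / union if union else 1.0

        def fitness(threshold, samples):
            return sum(
                iou([px > threshold for px in image], reference)
                for image, reference in samples
            ) / len(samples)

        def evolve_threshold(samples, mu=5, lam=20, gens=50, sigma=10.0):
            pop = [random.uniform(0, 255) for _ in range(mu)]
            for _ in range(gens):
                kids = [t + random.gauss(0, sigma) for t in random.choices(pop, k=lam)]
                pop = sorted(pop + kids, key=lambda t: -fitness(t, samples))[:mu]
                sigma *= 0.95  # anneal mutation strength
            return pop[0]

        # samples: list of (grayscale_pixel_list, boolean_reference_mask) pairs.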

    Assessing hyper parameter optimization and speedup for convolutional neural networks

    The increased processing power of graphics processing units (GPUs) and the availability of large image datasets have fostered a renewed interest in extracting semantic information from images. Promising results for complex image categorization problems have been achieved using deep learning, with neural networks composed of many layers. Convolutional neural networks (CNNs) are one such architecture, offering further opportunities for image classification. Advances in CNNs enable the development of training models on large labelled image datasets, but the hyperparameters need to be specified, which is challenging and complex due to their large number. A substantial amount of computational power and processing time is required to determine the optimal hyperparameters for a model that yields good results. This article provides a survey of hyperparameter search and optimization methods for CNN architectures.
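    As a baseline among the methods such a survey covers, random search over a log-scaled space is easy to state. The search space below is an illustrative assumption, and the stand-in score() marks where the expensive train-and-validate step would go.

        import math
        import random

        # Random search over a CNN hyperparameter space. The space is an
        # illustrative assumption; score() is a placeholder for training the
        # network and measuring validation accuracy (the expensive step).
        SPACE = {
            "lr":         lambda: 10 ** random.uniform(-5, -1),      # log-uniform
            "batch_size": lambda: random.choice([32, 64, 128, 256]),
            "n_filters":  lambda: random.choice([16, 32, 64]),
            "dropout":    lambda: random.uniform(0.0, 0.5),
        }

        def score(cfg):
            # Hypothetical stand-in objective; replace with train-and-validate.
            return -abs(math.log10(cfg["lr"]) + 3) - 0.1 * cfg["dropout"]

        def random_search(trials=50):
            best_cfg, best = None, -math.inf
            for _ in range(trials):
                cfg = {name: draw() for name, draw in SPACE.items()}
                if (s := score(cfg)) > best:
                    best_cfg, best = cfg, s
            return best_cfg, best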

    Artificial Intelligence for autonomous persona generation to shape tailored communications and products and incentivise disaster preparation behaviours

    Elizabeth Ditton investigated whether machine learning, specifically clustering algorithms, could be used to mimic the expert decision making behind targeted disaster preparation messaging. She found that clustering algorithms could be used to develop personas that achieve the same level of depth and nuance as manually developed personas, without the resources manual development requires.
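    The abstract does not name the specific algorithm, but a centroid-based method such as k-means conveys the idea: respondents become numeric feature vectors and each resulting centroid is read as a candidate persona. The feature encoding and the choice of k below are assumptions for illustration.

        import random

        # k-means sketch of persona derivation: points are respondents encoded
        # as numeric feature vectors; each centroid is a candidate persona.
        def kmeans(points, k, iters=100):
            centroids = random.sample(points, k)
            for _ in range(iters):
                clusters = [[] for _ in range(k)]
                for p in points:
                    nearest = min(
                        range(k),
                        key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
                    )
                    clusters[nearest].append(p)
                # Move each centroid to the mean of its cluster (keep it if empty).
                centroids = [
                    tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
                    for i, c in enumerate(clusters)
                ]
            return centroids, clusters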

    Soft computing applied to optimization, computer vision and medicine

    Artificial intelligence has permeated almost every area of life in modern society, and its significance continues to grow. As a result, in recent years, Soft Computing has emerged as a powerful set of methodologies that propose innovative and robust solutions to a variety of complex problems. Because of their broad range of application, Soft Computing methods have the potential to significantly improve human living conditions. The motivation for the present research emerged from this background. This research pursues two main objectives: on the one hand, it endeavors to bridge the gap between Soft Computing techniques and their application to intricate problems; on the other hand, it explores the potential benefits of Soft Computing methodologies as novel, effective tools for such problems. This thesis synthesizes the results of extensive research on Soft Computing methods and their applications to optimization, Computer Vision, and medicine. The work is composed of several individual projects, which employ classical and new optimization algorithms. The manuscript is intended to provide an overview of the different aspects of Soft Computing methods so that the reader can reach a global understanding of the field. It is therefore assembled as a monograph that summarizes the outcomes of these projects across 12 chapters, structured so that they can be read independently. The key focus of this work is the application and design of Soft Computing approaches for solving problems in the following areas: Block Matching, Pattern Detection, Thresholding, Corner Detection, Template Matching, Circle Detection, Color Segmentation, Leukocyte Detection, and Breast Thermogram Analysis. One of the outcomes presented in this thesis is the development of two evolutionary approaches for global optimization. These were tested on complex benchmark datasets and showed promising results, opening the debate for future applications. Moreover, the applications to Computer Vision and medicine presented in this work highlight the utility of different Soft Computing methodologies for problems in these fields. A milestone in this area is the translation of Computer Vision and medical problems into optimization problems. Additionally, this work strives to provide tools for combating public health issues by extending the concepts to automated detection and diagnosis aids for pathologies such as Leukemia and breast cancer. The application of Soft Computing techniques in this field has attracted great interest worldwide due to the growing incidence of these diseases. Lastly, the use of Fuzzy Logic, Artificial Neural Networks, and Expert Systems in many everyday domestic appliances, such as washing machines, cookers, and refrigerators, is now a reality, and many other industrial and commercial applications of Soft Computing have been integrated into everyday use, a trend expected to increase within the next decade. The research conducted here therefore contributes an important piece to these developments. The applications presented in this work are intended to serve as technological tools that can be used in the development of new devices.
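    The thesis's own two evolutionary optimizers are not specified in the abstract; as a representative of the family it works in, a compact differential evolution (DE/rand/1/bin) loop on a standard benchmark function looks like this.

        import random

        # Differential evolution (DE/rand/1/bin) on the sphere benchmark, shown
        # as a representative evolutionary global optimizer; not the thesis's
        # own two algorithms, which the abstract does not name.
        def sphere(x):
            return sum(v * v for v in x)

        def differential_evolution(f, dim=10, pop_size=30, gens=300,
                                   F=0.8, CR=0.9, lo=-5.0, hi=5.0):
            pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
            fit = [f(x) for x in pop]
            for _ in range(gens):
                for i in range(pop_size):
                    a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
                    jrand = random.randrange(dim)
                    trial = [
                        min(max(pop[a][j] + F * (pop[b][j] - pop[c][j]), lo), hi)
                        if (random.random() < CR or j == jrand) else pop[i][j]
                        for j in range(dim)
                    ]
                    # Greedy selection: keep the better of parent and trial.
                    if (tf := f(trial)) <= fit[i]:
                        pop[i], fit[i] = trial, tf
            best = min(range(pop_size), key=fit.__getitem__)
            return pop[best], fit[best]

        best_x, best_f = differential_evolution(sphere)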

    Optimization of the structural design of asphalt pavements for streets and highways

    The construction of asphalt pavements for streets and highways requires optimizing the consumption of significant economic and natural resources. Pavement design optimization balances contradictory objectives arising from the availability of resources and the needs of users. This dissertation explores the application of metaheuristics to optimize the design of asphalt pavements using an incremental design based on the prediction of damage and vehicle operating costs (VOC); these costs are proportional to energy and resource consumption and to polluting emissions. The evolution of asphalt pavement design and of metaheuristic optimization techniques on this topic is reviewed. Four computer programs were developed: (1) UNLEA, a program for the structural analysis of multilayer systems; (2) PSO-UNLEA, a program that uses the particle swarm optimization (PSO) metaheuristic for the backcalculation of pavement moduli; (3) UNPAVE, an incremental pavement design program based on the equations of the North American MEPDG that includes the computation of construction and vehicle operating costs based on the IRI; and (4) PSO-PAVE, a PSO program that searches for layer thicknesses that optimize the design considering construction and vehicle operating costs. The case studies show that the backcalculation and structural design of pavements can be optimized by PSO under restrictions on layer thicknesses and the selection of materials. Future developments should reduce the computational cost and calibrate the pavement performance and VOC models.
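    In the spirit of PSO-UNLEA, the backcalculation step can be sketched as a standard PSO in which each particle is a vector of layer moduli and the fitness is the squared mismatch between measured and predicted deflections. Here forward_model() is a placeholder for a multilayer elastic analysis such as UNLEA, and all parameter values and bounds are illustrative assumptions.

        import random

        # PSO sketch for modulus backcalculation: particles are candidate
        # layer-modulus vectors; fitness is the deflection mismatch. The
        # forward_model callable stands in for a multilayer elastic analysis.
        def pso(measured, forward_model, n_layers=3, swarm=25, iters=200,
                lo=50.0, hi=5000.0, w=0.7, c1=1.5, c2=1.5):
            def cost(E):
                return sum((p - m) ** 2 for p, m in zip(forward_model(E), measured))

            X = [[random.uniform(lo, hi) for _ in range(n_layers)] for _ in range(swarm)]
            V = [[0.0] * n_layers for _ in range(swarm)]
            P = [x[:] for x in X]                       # personal bests
            pc = [cost(x) for x in X]
            g = min(range(swarm), key=pc.__getitem__)   # global best index
            G, gc = P[g][:], pc[g]
            for _ in range(iters):
                for i in range(swarm):
                    for d in range(n_layers):
                        V[i][d] = (w * V[i][d]
                                   + c1 * random.random() * (P[i][d] - X[i][d])
                                   + c2 * random.random() * (G[d] - X[i][d]))
                        X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
                    c = cost(X[i])
                    if c < pc[i]:
                        P[i], pc[i] = X[i][:], c
                        if c < gc:
                            G, gc = X[i][:], c
            return G, gc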