
    Efficient configuration space construction and optimization

    The configuration space is a fundamental concept that is widely used in algorithmic robotics. Many applications in robotics, computer-aided design, and related areas can be reduced to computational problems in terms of configuration spaces. In this dissertation, we address three main computational challenges related to configuration spaces: 1) how to efficiently compute an approximate representation of high-dimensional configuration spaces; 2) how to efficiently perform geometric, proximity, and motion planning queries in high-dimensional configuration spaces; and 3) how to model uncertainty in configuration spaces represented by noisy sensor data. We present new configuration space construction algorithms based on machine learning and geometric approximation techniques. These algorithms perform collision queries on many configuration samples. The collision query results are used to compute an approximate representation for the configuration space, which quickly converges to the exact configuration space. We highlight the efficiency of our algorithms for penetration depth computation and instance-based motion planning. We also present parallel GPU-based algorithms to accelerate the performance of optimization and search computations in configuration spaces. In particular, we design efficient GPU-based parallel k-nearest neighbor and parallel collision detection algorithms and use these algorithms to accelerate motion planning. In order to extend configuration space algorithms to handle noisy sensor data arising from real-world robotics applications, we model the uncertainty in the configuration space by formulating the collision probabilities for noisy data. We use these algorithms to perform reliable motion planning for the PR2 robot.
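    As an illustration of the sampling idea described in this abstract (not the dissertation's actual algorithms), the sketch below labels random configuration samples with a user-supplied collision query and fits a classifier as an approximate representation of the configuration space; the collision_query predicate, the toy 2-DOF example, and all parameter values are hypothetical.

```python
# Minimal sketch: approximate a configuration space from collision-query labels.
# Not the dissertation's algorithm; collision_query and all parameters are illustrative.
import numpy as np
from sklearn.svm import SVC

def approximate_cspace(collision_query, dim, n_samples=5000, seed=0):
    """Sample random configurations, label them with the collision query and
    fit a classifier that stands in for the approximate configuration space."""
    rng = np.random.default_rng(seed)
    q = rng.uniform(-np.pi, np.pi, size=(n_samples, dim))   # joint-angle samples
    labels = np.array([collision_query(qi) for qi in q])    # 1 = in collision, 0 = free
    return SVC(kernel="rbf").fit(q, labels)                 # learned C-space boundary

if __name__ == "__main__":
    # Toy 2-DOF example: configurations inside a disc of radius 1 are "in collision".
    toy_query = lambda qi: int(np.linalg.norm(qi) < 1.0)
    cspace = approximate_cspace(toy_query, dim=2, n_samples=2000)
    print(cspace.predict([[0.1, 0.2], [2.5, -2.0]]))        # should print [1 0]
```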

    Economics and the Complexity Vision: Chimerical Partners or Elysian Adventurers?

    This work began as a review article of: "Complexity and the History of Economic Thought", edited by David Colander, Routledge, London,UK, 2000; & "The Complexity Vision and the Teaching of Economics", edited by David Colander, Edward Elgar, Cheltenham, UK, 2000. It has, in the writing, developed into my own vision of complexity economics

    TĂ©cnicas big data para el procesamiento de flujos de datos masivos en tiempo real

    Doctoral Program in Biotechnology, Engineering and Chemical Technology. Research line: Engineering, Data Science and Bioinformatics. Program code: DBI. Line code: 111.

    Machine learning techniques have become one of the resources most in demand by companies, due to the large volume of data that surrounds us these days. The main objective of these technologies is to solve complex problems in an automated way using data. One of the current perspectives of machine learning is the analysis of continuous flows of data, or data streaming. This approach is increasingly requested by enterprises as a result of the large number of information sources producing time-indexed data at high frequency, such as sensors, Internet of Things devices and social networks. However, research is currently more focused on the study of historical data than on data received in streaming, largely because of the enormous challenge that this type of data presents for the modeling of machine learning algorithms. This Doctoral Thesis is presented as a compendium of publications with a total of 10 scientific contributions in international conferences and journals with a high impact index in the Journal Citation Reports (JCR). The research developed during the PhD Program focuses on the study and analysis of real-time or streaming data through the development of new machine learning algorithms. Machine learning algorithms for real-time data require a different type of modeling from the traditional one, in which the model is updated online to provide accurate responses in the shortest possible time. The main objective of this Doctoral Thesis is to contribute research value to the scientific community through three new machine learning algorithms. These algorithms are big data techniques, and two of them work with online or streaming data; in this way, contributions are made to one of the current trends in Artificial Intelligence. To this end, algorithms are developed for descriptive and predictive tasks, i.e., unsupervised and supervised learning, respectively. Their common idea is the discovery of patterns in the data.

    The first technique developed during the dissertation is a triclustering algorithm that produces three-dimensional data clusters in offline or batch mode. This big data algorithm is called bigTriGen. In general terms, an evolutionary metaheuristic is used to search for groups of data with similar patterns; at each iteration the model applies genetic operators such as selection, crossover, mutation and evaluation. The goal of bigTriGen is to optimize the evaluation function in order to obtain triclusters of the highest possible quality. It serves as the basis for the second technique implemented during the Doctoral Thesis. The second algorithm, called STriGen, focuses on creating groups over three-dimensional data received in real time, or streaming. Streaming modeling starts from an offline or batch model built on historical data. As soon as this model is created, it starts receiving data in real time and is updated in an online or streaming manner to adapt to new streaming patterns. In this way, STriGen is able to detect concept drifts and incorporate them into the model as quickly as possible, producing triclusters in real time and of good quality. The last algorithm developed in this dissertation follows a supervised learning approach for time series forecasting in real time. It is called StreamWNN. A model is created with historical data based on the k-nearest neighbor (KNN) algorithm. Once the model is created, data starts to be received in real time. The algorithm provides real-time predictions of future data, keeping the model always updated in an incremental way and incorporating streaming patterns identified as novelties. StreamWNN also identifies anomalous data in real time, a feature that can be used as a security measure during its application. The developed algorithms have been evaluated with real data from devices and sensors, and have proven to be very useful, providing meaningful triclusters and accurate predictions in real time.

    Universidad Pablo de Olavide de Sevilla. Departamento de Deporte e Informática.
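    As a rough illustration of the nearest-neighbor forecasting idea behind StreamWNN (not the published implementation), the sketch below keeps a buffer of past windows of the series, predicts the next value from the successors of the k most similar windows, and updates the buffer incrementally as new observations arrive; all class and parameter names are made up for this example.

```python
# Toy incremental nearest-neighbor forecaster for a univariate stream.
# Illustrative only; not the StreamWNN algorithm from the thesis.
from collections import deque
import numpy as np

class StreamingKNNForecaster:
    def __init__(self, window=24, k=5, max_patterns=5000):
        self.window, self.k = window, k
        self.patterns = deque(maxlen=max_patterns)  # stored (window, next value) pairs
        self.recent = deque(maxlen=window)          # most recent observations

    def update(self, value):
        """Incorporate a new observation into the incremental model."""
        if len(self.recent) == self.window:
            self.patterns.append((np.array(self.recent), value))
        self.recent.append(value)

    def predict(self):
        """Forecast the next value from the k most similar stored windows."""
        if len(self.recent) < self.window or len(self.patterns) < self.k:
            return None                              # not enough history yet
        query = np.array(self.recent)
        dists = [np.linalg.norm(query - w) for w, _ in self.patterns]
        nearest = np.argsort(dists)[: self.k]
        return float(np.mean([self.patterns[i][1] for i in nearest]))

if __name__ == "__main__":
    model = StreamingKNNForecaster(window=10, k=3)
    for t in range(300):                             # feed a noisy sine wave point by point
        model.update(np.sin(0.1 * t) + 0.05 * np.random.randn())
    print("forecast:", model.predict(), "true next value:", np.sin(0.1 * 300))
```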

    Deep Learning in Medical Image Analysis

    The accelerating power of deep learning in diagnosing diseases will empower physicians and speed up decision making in clinical environments. Applications of modern medical instruments and digitalization of medical care have generated enormous amounts of medical images in recent years. In this big data arena, new deep learning methods and computational models for efficient data processing, analysis, and modeling of the generated data are crucially important for clinical applications and understanding the underlying biological process. This book presents and highlights novel algorithms, architectures, techniques, and applications of deep learning for medical image analysis

    Ordonnancement de projets internationaux avec contraintes de matériel et de ressources

    With the globalization of markets, new business models have emerged and international companies have turned to project management techniques. However, the management of international projects remains complex, as it involves several subcontractors and stakeholders and requires the transportation, supply and delivery of large amounts of construction equipment and materials. In that context, long material delivery times and storage capacity constraints may lead to project delays and budget overruns. Indeed, in spite of all the research effort devoted to developing strong project management tools, project plans that take into account space and equipment availability for the execution of tasks are still mostly developed manually, on the basis of the planner's intuition and experience. Unfortunately, this task is nearly infeasible for large projects, due to the combinatorial nature of the resource allocation problem. In practice, existing methods and algorithms are only used for project management in a "traditional" way, with a single small project and without any logistic constraints. This thesis addresses this issue by formulating the problem as a scheduling problem with limited resources - resources can be either employees or storage areas - and by defining material delivery constraints. To that end, a random generator of logistic constraints was developed, in order to create the problem data and to control the relative weight of these constraints. The formulated problem is solved with a genetic algorithm that determines a feasible project plan. Through selection, crossover and mutation operators applied to a population of admissible solutions, the algorithm improves the overall quality of the population over the iterations and gradually converges towards a local optimum.
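    As an illustration of the genetic-algorithm approach summarized above (not the thesis's algorithm, and without the material-delivery and storage constraints), the sketch below evolves activity priority lists for a toy resource-constrained scheduling problem using selection, crossover and mutation; the task data and resource capacity are invented.

```python
# Minimal genetic-algorithm sketch for resource-constrained project scheduling.
# Illustrative only; task data, capacity and GA settings are made up.
import random

TASKS = {  # task: (duration, resource demand, predecessors)
    "A": (3, 2, []), "B": (4, 1, ["A"]), "C": (2, 2, ["A"]),
    "D": (5, 1, ["B", "C"]), "E": (2, 2, ["C"]),
}
CAPACITY = 3  # single renewable resource

def decode(priority):
    """Serial schedule generation: repeatedly schedule the eligible task
    (all predecessors finished) with the best priority, at the earliest time
    that respects precedence and the resource capacity; return the makespan."""
    finish, usage = {}, {}
    while len(finish) < len(TASKS):
        eligible = [t for t in TASKS if t not in finish
                    and all(p in finish for p in TASKS[t][2])]
        t = min(eligible, key=priority.index)
        dur, dem, preds = TASKS[t]
        s = max([finish[p] for p in preds], default=0)
        while any(usage.get(u, 0) + dem > CAPACITY for u in range(s, s + dur)):
            s += 1
        finish[t] = s + dur
        for u in range(s, s + dur):
            usage[u] = usage.get(u, 0) + dem
    return max(finish.values())

def crossover(a, b):
    """One-point order crossover preserving each task exactly once."""
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + [t for t in b if t not in a[:cut]]

def mutate(chrom, rate=0.3):
    """Swap two positions with the given probability."""
    chrom = chrom[:]
    if random.random() < rate:
        i, j = random.sample(range(len(chrom)), 2)
        chrom[i], chrom[j] = chrom[j], chrom[i]
    return chrom

def evolve(pop_size=30, generations=100):
    pop = [random.sample(list(TASKS), len(TASKS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=decode)                          # evaluation: shorter makespan first
        parents = pop[: pop_size // 2]                # selection by truncation
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    best = min(pop, key=decode)
    return best, decode(best)

if __name__ == "__main__":
    order, makespan = evolve()
    print("best priority list:", order, "makespan:", makespan)
```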

    Book of abstracts


    Actas de las VI Jornadas Nacionales (JNIC2021 LIVE)

    These conference days have become a meeting forum for the most relevant actors in the field of cybersecurity in Spain. They not only present some of the leading scientific work in the various areas of cybersecurity, but also pay special attention to training and educational innovation in cybersecurity, as well as to the connection with industry through technology transfer proposals. So much so that this year the Transfer Program includes some changes to its operation and development, designed to improve it and make it more valuable to the entire cybersecurity research community.

    A Novel Adaptive Discrete Cuckoo Search Algorithm for Parameter Optimization in Computer Vision

    Computer vision applications require choosing operators and their parameters in order to produce the best outcomes. Users often draw on expert knowledge and must try many combinations to find the best one manually. Since performance, time and accuracy are important, it is necessary to automate parameter optimization, at least for the most critical operators. In this paper, a novel approach based on an adaptive discrete cuckoo search algorithm (ADCS) is proposed. It automates the process of setting algorithm parameters and provides optimal parameters for vision applications. This work revisits the discretization problem needed to adapt the cuckoo search algorithm and presents the parameter optimization procedure. Experiments on real examples, together with comparisons against other metaheuristic-based approaches, namely particle swarm optimization (PSO), reinforcement learning (RL) and ant colony optimization (ACO), show the efficiency of this novel method.
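    As an illustration of the general idea (not the paper's ADCS algorithm), the sketch below applies a discrete cuckoo-search-style procedure to integer parameter tuning; the parameter grid and the objective function are toy stand-ins for a real vision-pipeline quality measure such as edge-map accuracy.

```python
# Minimal discrete cuckoo-search-style parameter optimizer.
# Illustrative only; GRID and objective() are hypothetical placeholders.
import random

GRID = {"low_thresh": range(0, 256), "high_thresh": range(0, 256)}  # example parameters

def objective(p):
    """Toy score; replace with a real evaluation of the vision pipeline."""
    return -abs(p["low_thresh"] - 50) - abs(p["high_thresh"] - 150)

def random_nest():
    return {k: random.choice(list(v)) for k, v in GRID.items()}

def discrete_step(p, scale=10):
    """Move one parameter by a small random integer step, clipped to its grid."""
    q = dict(p)
    k = random.choice(list(GRID))
    lo, hi = min(GRID[k]), max(GRID[k])
    q[k] = min(hi, max(lo, q[k] + random.randint(-scale, scale)))
    return q

def cuckoo_search(n_nests=15, iters=200, abandon=0.25):
    nests = [random_nest() for _ in range(n_nests)]
    for _ in range(iters):
        i, j = random.randrange(n_nests), random.randrange(n_nests)
        cuckoo = discrete_step(nests[i])             # new solution via a discrete step
        if objective(cuckoo) > objective(nests[j]):  # replace a randomly chosen worse nest
            nests[j] = cuckoo
        nests.sort(key=objective, reverse=True)      # best nests first
        n_drop = int(abandon * n_nests)              # abandon the worst nests
        nests[n_nests - n_drop:] = [random_nest() for _ in range(n_drop)]
    return max(nests, key=objective)

if __name__ == "__main__":
    best = cuckoo_search()
    print("best parameters:", best, "score:", objective(best))
```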