31 research outputs found

    Calculus for decision systems

    The conceptualization of the term system has become highly dependent on the application domain: what a physicist means by system may differ from what a sociologist means by the same term. In 1956, Bertalanffy [1] defined a system as a set of units with relationships among them. This and many other definitions of system share the idea of a system as a black box whose parts or elements interact with each other. At some level of abstraction, then, all systems are similar; what ultimately differentiates one system from another is the set of underlying equations describing how its parts interact.

    In this dissertation we develop a framework that characterizes systems at the interaction level, i.e., a framework that captures how and when the elements of a system interact. This framework is a process algebra called the Calculus for Decision Systems (CDS). The calculus provides means to create mathematical expressions that capture how systems interact and react to different stimuli, as well as procedures to analyze these interactions and derive further insights about the system.

    After defining the syntax and reduction rules of the CDS, we develop a notion of behavioral equivalence for decision systems. This equivalence, called bisimulation, allows us to compare decision systems from a behavioral standpoint. We apply our results to games in extensive form, some physical systems, and cyber-physical systems.

    Using the CDS to study games in extensive form, we define the concept of subgame perfect equilibrium for a two-person game with perfect information. We then investigate the behavior of two games played in parallel by one of the players, explore different couplings between games, and compare, using bisimulation, the behavior of two games resulting from two different couplings.
The results show that, with some probability, whether a game is played as the first or the second player can be irrelevant to its behavior.

    Decision systems can comprise multiple decision makers. We show that when two decision makers interact, extensive games can represent the conflict resolution. For the case of more than two decision makers, we show how to characterize the interactions between elements within an organizational structure, which can be perceived as multiple players interacting in a game. In this context, we use the CDS as an information-sharing mechanism to transfer the inputs and outputs from one extensive game to another. We demonstrate the suitability of our calculus for the analysis of organizational structures and point out some potential research extensions in that direction.

    The other general area we investigate using the CDS is cyber-physical systems (CPS), a class of systems characterized by a tight relationship between systems (or processes) in the areas of computing, communication, and physics. We use the CDS to describe the interaction between elements in some simple mechanical systems, as well as a particular case of the generalized railroad crossing (GRC) problem, a typical CPS case; we show two approaches to the solution of the GRC problem.

    This dissertation does not intend to develop new methods to solve game-theoretic problems or the equations of motion of a physical system; it aims to be a seminal work towards a general framework for studying systems and the equivalence of systems from a formal standpoint, and to broaden the application of formal methods to real-world problems.
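The idea of bisimulation as a behavioral equivalence can be illustrated with a minimal sketch. The following is an illustrative greatest-fixpoint computation over small labelled transition systems, not the CDS formalism itself; all names and the example machines are hypothetical.

```python
# Minimal sketch: checking bisimilarity of two finite labelled transition
# systems. Two states are bisimilar when every labelled transition of one
# can be matched by the other, leading again to bisimilar states.
from itertools import product

def bisimilar(trans_a, trans_b, s0, t0):
    """trans_*: dict mapping state -> set of (label, successor) pairs.
    Returns True iff s0 and t0 are bisimilar (naive fixpoint refinement)."""
    # Start from the full relation and discard pairs that fail to match.
    rel = set(product(trans_a, trans_b))
    changed = True
    while changed:
        changed = False
        for (s, t) in list(rel):
            ok = (all(any(l2 == l1 and (s2, t2) in rel
                          for (l2, t2) in trans_b[t])
                      for (l1, s2) in trans_a[s])
                  and
                  all(any(l1 == l2 and (s2, t2) in rel
                          for (l1, s2) in trans_a[s])
                      for (l2, t2) in trans_b[t]))
            if not ok:
                rel.discard((s, t))
                changed = True
    return (s0, t0) in rel

# Two vending machines with the same behavior are bisimilar.
m1 = {"p": {("coin", "q")}, "q": {("coffee", "p")}}
m2 = {"u": {("coin", "v")}, "v": {("coffee", "u")}}
print(bisimilar(m1, m2, "p", "u"))  # True
```

A machine that serves tea instead of coffee after the coin would fail the check, since its `tea` transition cannot be matched.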

    Collaborative Networks, Decision Systems, Web Applications and Services for Supporting Engineering and Production Management

    This book focuses on fundamental and applied research on collaborative and intelligent networks, decision systems, and services for supporting engineering and production management, along with other kinds of problems and services. The development and application of innovative collaborative approaches and systems are currently of prime importance in Industry 4.0. Special attention is given to flexible and cyber-physical systems and to advanced design, manufacturing, and management based on artificial intelligence approaches and practices, among others, including social systems and services.

    Exploiting transitivity in probabilistic models for ontology learning

    Capturing word meaning is one of the challenges of natural language processing (NLP). Formal models of meaning, such as semantic networks of words or concepts, are knowledge repositories used in a variety of applications. To be effectively used, these networks have to be large or, at least, adapted to specific domains. Our main goal is to contribute practically to research on models for learning semantic networks by covering different aspects of the task. We propose a novel probabilistic model for learning semantic networks that expands existing networks, taking into account both corpus-extracted evidence and the structure of the generated semantic network. The model exploits structural properties of the target relations, such as transitivity, during learning. The probability for a given relation instance to belong to the semantic network depends both on its direct probability, estimated from corpus evidence, and on the induced probability derived from the structural properties of the target relation. Our model introduces some innovations in estimating these probabilities. We also propose a model that can be used in different specific knowledge domains with little additional effort for its adaptation: a model is learned from a generic domain and then exploited to extract new information in a specific domain. Finally, we propose an incremental ontology learning system, the Semantic Turkey Ontology Learner (ST-OL). ST-OL addresses two principal issues.
The first issue is an efficient way to interact with final users and thus to put their decisions into the learning loop; we obtain this interaction through an ontology editor. The second issue is a probabilistic model for learning semantic networks of words that exploits transitive relations to induce better extraction models. ST-OL provides a graphical user interface and a human-computer interaction workflow supporting the incremental learning loop of our semantic network learning models. Our experiments show that all the proposed models give a real contribution to the different tasks considered, improving performance.
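The combination of direct and induced probabilities described above can be sketched as follows. This is an illustrative formulation only, with hypothetical function names and a simple mixing weight; the thesis's actual estimators are more refined.

```python
# Sketch: mixing corpus-based (direct) evidence for a relation instance
# with the probability induced by transitivity through intermediate nodes.

def induced_prob(direct, a, c, nodes):
    """Probability that relation (a, c) holds, induced by transitivity
    through any intermediate node b: 1 - prod_b(1 - P(a,b) * P(b,c))."""
    p_miss = 1.0
    for b in nodes:
        if b in (a, c):
            continue
        p_miss *= 1.0 - direct.get((a, b), 0.0) * direct.get((b, c), 0.0)
    return 1.0 - p_miss

def combined_prob(direct, a, c, nodes, alpha=0.5):
    """Mix direct corpus evidence with the structure-induced signal."""
    p_dir = direct.get((a, c), 0.0)
    return alpha * p_dir + (1 - alpha) * induced_prob(direct, a, c, nodes)

# "dog isa mammal" and "mammal isa animal" boost weak direct evidence
# for "dog isa animal".
direct = {("dog", "mammal"): 0.9, ("mammal", "animal"): 0.8,
          ("dog", "animal"): 0.3}
nodes = ["dog", "mammal", "animal"]
print(round(combined_prob(direct, "dog", "animal", nodes), 3))  # 0.51
```

Here the induced term (0.9 × 0.8 = 0.72) lifts the weak direct estimate (0.3), which is the intuition behind exploiting transitivity during learning.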

    Neural combinatorial optimization as an enabler technology to design real-time virtual network function placement decision systems

    158 p. The Fifth Generation of the mobile network (5G) represents a breakthrough technology for the telecommunications industry. 5G provides a unified infrastructure capable of integrating, over the same physical network, heterogeneous services with different requirements. This is achieved thanks to recent advances in network virtualization, specifically in Network Function Virtualization (NFV) and Software Defined Networking (SDN) technologies. This cloud-based architecture not only brings new possibilities to vertical sectors but also entails new challenges that have to be solved accordingly. In particular, it enables automating operations within the infrastructure, allowing network optimization to be performed at operational time (e.g., spectrum optimization, service optimization, traffic optimization). Nevertheless, designing optimization algorithms for this purpose entails some difficulties: solving the underlying Combinatorial Optimization (CO) problems is usually intractable due to their NP-hard nature, and solutions are required in close to real time due to the tight time requirements of this dynamic environment. For this reason, handwritten heuristic algorithms have been widely used in the literature to achieve fast approximate solutions in this context. However, particularizing heuristics to address CO problems can be a daunting task that requires expertise. The ability to automate this resolution process would be of utmost importance for achieving intelligent network orchestration. In this sense, Artificial Intelligence (AI) is envisioned as the key technology for autonomously inferring intelligent solutions to these problems, and combining AI with network virtualization can truly transform this industry. In particular, this Thesis aims at using Neural Combinatorial Optimization (NCO) to infer end solutions to CO problems.
NCO has proven able to learn near-optimal solutions to classical combinatorial problems (e.g., the Traveling Salesman Problem (TSP), the Bin Packing Problem (BPP), and the Vehicle Routing Problem (VRP)). Specifically, NCO relies on Reinforcement Learning (RL) to estimate a Neural Network (NN) model that describes the relation between the space of problem instances and the solutions for each of them. In other words, for a new instance the model is able to infer a solution by generalizing from the problem space on which it has been trained. To this end, during the learning process the model takes instances from the learning space and uses the reward obtained from evaluating the solution to improve its accuracy. The work presented here contributes to NCO theory in two main directions. First, it argues that the performance obtained by the sequence-to-sequence models used for NCO in the literature can be improved by presenting combinatorial problems as Constrained Markov Decision Processes (CMDPs). This property can be exploited to build a Markovian model that constructs solutions incrementally based on interactions with the problem. Second, this formulation enables general constrained combinatorial problems to be addressed under this framework. In this context, the model relies not only on the reward signal but also on penalty signals, generated from constraint dissatisfaction, that direct the model toward a competitive policy even in highly constrained environments. This strategy extends the range of problems that can be addressed using this technology.
The presented approach is validated in the scope of intelligent network management, specifically on the Virtual Network Function (VNF) placement problem, which consists of efficiently mapping a set of network service requests onto the physical network infrastructure. In particular, we seek to obtain the optimal placement for a network service chain considering the state of the virtual environment, so that a specific resource objective is accomplished, in this case the minimization of the overall power consumption. Conducted experiments prove the capability of the proposal to learn competitive solutions when compared to classical heuristic, metaheuristic, and Constraint Programming (CP) solvers.
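The penalty-signal idea for constrained problems can be sketched with a toy learning signal. This is an illustrative reward-shaping function under assumed names and weights, not the thesis's actual training objective.

```python
# Sketch: the policy's learning signal for a constrained combinatorial
# problem is the objective reward minus weighted penalties, one per
# violated constraint (weights and magnitudes are illustrative).

def shaped_reward(objective, violations, weights):
    """Objective value minus a weighted penalty for each constraint
    violation; zero violation contributes no penalty."""
    return objective - sum(w * v for w, v in zip(weights, violations))

# A hypothetical VNF placement drawing 42 power units, with a capacity
# constraint violated by 3 units and a latency constraint satisfied.
reward = shaped_reward(objective=-42.0, violations=[3.0, 0.0],
                       weights=[10.0, 5.0])
print(reward)  # -72.0
```

A feasible placement with the same power draw would score -42.0, so the gradient signal steers the policy toward feasibility before optimizing power, which is the role the penalty signals play in highly constrained environments.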

    Soft computing techniques: Theory and application for pattern classification

    Master's thesis, Master of Engineering.

    Application of Business Analytics Approaches to Address Climate-Change-Related Challenges

    Climate change is an existential threat facing humanity, civilization, and the natural world. It poses many multi-layered challenges that call for enhanced data-driven decision support methods to help inform society of ways to address the deep uncertainty and incomplete knowledge on climate change issues. This research primarily aims to apply management, decision, information, and data science theories and techniques to propose, build, and evaluate novel data-driven methodologies that improve understanding of climate-change-related challenges. Given that we pursue this work in the College of Management, each essay applies one or more of the three distinct business analytics approaches (i.e., descriptive, prescriptive, and predictive analysis) to aid in developing decision support capabilities. Given the rapid growth in data availability, we evaluate important data characteristics for each analysis, focusing on the data source, granularity, volume, structure, and quality. A final consideration is the methods used to coalesce the various model outputs into understandable visualizations, tables, and takeaways. We pursue three distinct business analytics challenges. First, we use natural language processing to gain insights into the evolving climate change adaptation discussion in the scientific literature. We then create a stochastic network optimization model with recourse to provide coastal decision-makers with a cost-benefit analysis tool to simultaneously assess the risks and costs of protecting their community against rising seas. Finally, we create a decision support tool for helping organizations reduce greenhouse gas emissions through strategic sustainable energy purchasing. Although the three essays vary in their specific business analytics approaches, they all share a common theme of applying business analytics techniques to analyze, evaluate, visualize, and understand different facets of the climate change threat.

    Women in Artificial intelligence (AI)

    This Special Issue, entitled "Women in Artificial Intelligence", includes 17 papers from leading women scientists. The papers cover a broad scope of research areas within Artificial Intelligence, including machine learning, perception, reasoning, and planning, among others, with applications to relevant fields such as human health, finance, and education. It is worth noting that the Issue includes three papers that deal with different aspects of gender bias in Artificial Intelligence. All the papers have a woman as the first author. We can proudly say that these women are from countries worldwide, such as France, the Czech Republic, the United Kingdom, Australia, Bangladesh, Yemen, Romania, India, Cuba, and Spain. In conclusion, apart from its intrinsic scientific value as a Special Issue combining interesting research works, this Special Issue intends to increase the visibility of women in AI, showing where they are, what they do, and how they contribute to developments in Artificial Intelligence from their different places, positions, research branches, and application fields. We planned to issue this book on Ada Lovelace Day (11/10/2022), a date internationally dedicated to the first computer programmer, a woman who had to fight the gender difficulties of her times, in the 19th century. We also thank the publisher for making this possible, thus allowing this book to become part of the international activities dedicated to celebrating the value of women in ICT all over the world. With this book, we want to pay homage to all the women who have contributed over the years to the field of AI.

    Explainable Predictive and Prescriptive Process Analytics of customizable business KPIs

    Recent years have witnessed a growing adoption of machine learning techniques for business improvement across various fields. Among other emerging applications, organizations are exploiting opportunities to improve the performance of their business processes by using predictive models for runtime monitoring. Predictive analytics leverages machine learning and data analytics techniques to predict the future outcome of a process based on historical data. Therefore, the goal of predictive analytics is to identify future trends, and discover potential issues and anomalies in the process before they occur, allowing organizations to take proactive measures to prevent them from happening, optimizing the overall performance of the process. Prescriptive analytics systems go beyond purely predictive ones, by not only generating predictions but also advising the user if and how to intervene in a running process in order to improve the outcome of a process, which can be defined in various ways depending on the business goals; this can involve measuring process-specific Key Performance Indicators (KPIs), such as costs, execution times, or customer satisfaction, and using this data to make informed decisions about how to optimize the process. This Ph.D. thesis research work has focused on predictive and prescriptive analytics, with particular emphasis on providing predictions and recommendations that are explainable and comprehensible to process actors. 
In fact, while the priority remains on giving accurate predictions and recommendations, process actors need to be provided with an explanation of the reasons why a given process execution is predicted to behave in a certain way, and they need to be convinced that the recommended actions are the most suitable ones to maximize the KPI of interest; otherwise, users would not trust and follow the provided predictions and recommendations, and the predictive technology would not be adopted.

    Deep learning for clinical decision support in oncology

    Over the last decades, medical imaging methods such as computed tomography (CT) have become an indispensable tool of modern medicine, allowing for a fast, non-invasive inspection of organs and tissue. The amount of acquired healthcare data has grown rapidly, increasing 15-fold within the last years and accounting for more than 30% of the world's generated data volume, while the number of trained radiologists remains largely stable. Medical image analysis, settled between medicine and engineering, has therefore become a rapidly growing research field; its successful application may result in remarkable time savings and lead to significantly improved diagnostic performance. Much of the work within medical image analysis focuses on radiomics, i.e., the extraction and analysis of hand-crafted imaging features. Radiomics, however, has been shown to be highly sensitive to external factors, such as the acquisition protocol, with major implications for reproducibility and clinical applicability. Lately, deep learning has become one of the most employed methods for solving computational problems. With successful applications in diverse fields such as robotics, physics, mathematics, and economics, deep learning has revolutionized the process of machine learning research. Having large amounts of training data is a key criterion for its successful application. These data, however, are rare within medicine, as medical imaging is subject to a variety of data security and data privacy regulations. Moreover, medical imaging data often suffer from heterogeneous quality, label imbalance, and label noise, rendering a considerable fraction of deep learning-based algorithms inapplicable. Settled in the field of CT oncology, this work addresses these issues, showing ways to successfully handle medical imaging data using deep learning. It proposes novel methods for clinically relevant tasks, such as lesion growth and patient survival prediction, confidence estimation, meta-learning and classifier ensembling, and deep decision explanation, yielding superior performance in comparison to state-of-the-art approaches and being applicable to a wide variety of applications. With this, the work contributes towards a clinical translation of deep learning-based algorithms, aiming for an improved diagnosis and, ultimately, improved patient healthcare overall.