
    RiskNet: neural risk assessment in networks of unreliable resources

    We propose a graph neural network (GNN)-based method to predict the distribution of penalties induced by outages in communication networks, where connections are protected by resources shared between working and backup paths. The GNN-based algorithm is trained only on random graphs generated with the Barabási–Albert model, yet the results show that it accurately models the penalties across a wide range of existing topologies. We show that GNNs eliminate the need to simulate complex outage scenarios for the network topologies under study: in practice, the entire prediction-based evaluation of a path placement takes no longer than 4 ms on modern hardware, a speedup of up to 12,000 times over simulation-based calculations. This work was supported by the Polish Ministry of Science and Higher Education with the subvention funds of the Faculty of Computer Science, Electronics and Telecommunications of AGH University of Science and Technology (P.B., P.C.) and by the PL-Grid Infrastructure (K.R.).
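
    As an illustration of the training-data setup described above, the sketch below generates Barabási–Albert random graphs with networkx and labels them with a toy outage simulation. The penalty function and all sizes are illustrative stand-ins; the paper's actual GNN architecture and penalty definition are not reproduced here.

```python
# Sketch: Barabasi-Albert training topologies as in the abstract; the
# penalty label is a toy stand-in for the outage simulation that the
# GNN is trained to replace. All parameters are illustrative.
import random
import networkx as nx

def make_training_graphs(num_graphs=100, n_nodes=50, attach_edges=2):
    """Random BA topologies used as GNN training inputs (hypothetical sizes)."""
    return [nx.barabasi_albert_graph(n_nodes, attach_edges) for _ in range(num_graphs)]

def toy_outage_penalty(graph, edge_failure_prob=0.01):
    """Drop each link independently and count node pairs that lose connectivity."""
    g = graph.copy()
    for u, v in list(g.edges()):
        if random.random() < edge_failure_prob:
            g.remove_edge(u, v)
    n = graph.number_of_nodes()
    connected = sum(len(c) * (len(c) - 1) // 2 for c in nx.connected_components(g))
    return n * (n - 1) // 2 - connected  # disconnected pairs as the penalty

graphs = make_training_graphs()
labels = [toy_outage_penalty(g) for g in graphs]  # regression targets for a GNN
```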

    Taking Computation to Data: Integrating Privacy-preserving AI techniques and Blockchain Allowing Secure Analysis of Sensitive Data on Premise

    PhD thesis in Information Technology. With the advancement of artificial intelligence (AI), digital pathology has seen significant progress in recent years. However, the use of medical AI raises concerns about patient data privacy. The CLARIFY project is a research project funded under the European Union's Marie Sklodowska-Curie Actions (MSCA) program. The primary objective of CLARIFY is to create a reliable, automated digital diagnostic platform that uses cloud-based data algorithms and artificial intelligence to enable interpretation and diagnosis of whole-slide images (WSI) from any location, maximizing the advantages of AI-based digital pathology. My research as an early-stage researcher in the CLARIFY project centers on securing information systems using machine learning and access control techniques. To achieve this goal, I extensively researched privacy-protection technologies such as federated learning, differential privacy, dataset distillation, and blockchain. These technologies have different priorities in terms of privacy, computational efficiency, and usability. We therefore designed a computing system that supports different levels of privacy protection, based on the concept of taking computation to data. Our approach rests on two design principles. First, when external users need to access internal data, a robust access control mechanism must limit unauthorized access. Second, raw data should be processed locally to preserve privacy and security. Specifically, we use smart contract-based access control and decentralized identity technology at the system security boundary to ensure the flexibility and immutability of verification. Where the user's raw data still cannot be accessed directly, we propose using dataset distillation to filter out private information, or using a locally trained model as a data agent. Our research focuses on improving the usability of these methods, and this thesis serves as a demonstration of current privacy-preserving and secure computing technologies.
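
    The core "taking computation to data" pattern lends itself to a short sketch: an access check at the boundary, local processing of the raw data, and release of only the derived artifact. This is a hypothetical illustration of the two design principles, not the thesis's implementation; the smart-contract and decentralized-identity machinery is reduced to a simple authorization set.

```python
# Hypothetical sketch of "taking computation to data": raw data stays on
# premise; an authorized external user receives only a locally computed
# artifact (e.g. model weights acting as a "data agent").
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class DataCustodian:
    raw_data: Any       # sensitive records, never released
    authorized: set     # stand-in for smart contract-based access control

    def run(self, user_id: str, train_fn: Callable[[Any], Any]) -> Any:
        # Principle 1: robust access control at the security boundary.
        if user_id not in self.authorized:
            raise PermissionError(f"{user_id} is not authorized")
        # Principle 2: process raw data locally; release only the result.
        return train_fn(self.raw_data)

custodian = DataCustodian(raw_data=[1.0, 2.0, 3.0], authorized={"alice"})
# A mean stands in for a locally trained model returned as the data agent.
model = custodian.run("alice", lambda d: sum(d) / len(d))
```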

    Health risk from PM2.5 and ozone in Melbourne: present and projected futures

    In Australia, adverse health impacts of air pollution are still reported despite relatively low pollution levels, and there is growing concern that climate change may worsen future air quality and impose additional air pollution-related health burdens. This thesis aimed to assess the impacts of future climate change on the health effects of air pollution in the Melbourne region. Firstly, the best approach was identified to estimate baseline relative risks for cardiovascular and respiratory mortality and emergency department (ED) visits associated with exposure to particulate matter with diameters of less than 2.5 micrometres (fine PM) and to ozone. A Poisson regression model was fitted to estimate the baseline relative risks using data from 1999–2008. A blending approach, in which air pollution and weather data were generated by merging simulations and measurements, was selected because it provided larger risk estimates, thought to indicate lower measurement error. The blending approach was then used to explore whether there was evidence of a modifying effect of temperature on the air pollution-related health risks: the Poisson model was extended with an interaction term for temperature strata. Non-uniform relative risks for fine PM- and ozone-related respiratory ED visits across the temperature range were observed, suggesting effect modification by temperature. Next, the estimated baseline relative risks were used to predict changes in air pollution-related respiratory ED visits induced by climate change in the Port Phillip Region between the decades 2065–2074 and 1996–2005. Two methods, with and without inclusion of the temperature modifying effect, were used. Without the temperature modifying effect, exposure to future ozone modified by the changing climate was estimated to cause an additional 60 to 110 respiratory ED visits over the three summer months, and exposure to fine PM an additional 15 to 26 visits over the same period. In winter, a reduction in fine PM-related respiratory ED visits was estimated under three of the four global circulation models. When the temperature modifying effect was included, an increase in fine PM-related respiratory ED visits for both summer and winter between the two periods was predicted, with a greater magnitude than under the other method, while little difference in ozone-related respiratory morbidity between the two decades was predicted for summer. Finally, the sensitivity of the estimated changes in respiratory morbidity to uncertainties in non-climate factors was explored; among the factors examined, future population growth was the largest contributor to the variation in estimated changes in the future health effects of air pollution. This thesis provides a systematic investigation of the health risks of air pollution under future climate change, including a careful analysis of the sensitivity of the results to uncertainties. Assessing the health impacts of extreme air pollution episodes under climate change is an important area, not examined here, that should be the focus of future studies.
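
    The kind of Poisson model with a temperature-strata interaction described above can be sketched with statsmodels; the data frame, column names, and strata here are entirely illustrative, not the thesis's data.

```python
# Illustrative Poisson regression of daily ED visit counts on fine PM,
# with an interaction allowing the PM2.5 slope to differ across
# temperature strata (the "effect modification" the thesis examines).
# All data and variable names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "ed_visits": [12, 15, 9, 20, 14, 11, 18, 16],           # daily counts
    "pm25": [8.1, 12.4, 6.3, 18.0, 10.2, 7.7, 15.5, 13.9],  # ug/m3
    "temp_stratum": ["low", "low", "low", "high",
                     "high", "low", "high", "high"],
})

# pm25 * temp_stratum expands to both main effects plus the interaction;
# differing pm25 slopes across strata indicate effect modification.
model = smf.glm("ed_visits ~ pm25 * temp_stratum",
                data=df, family=sm.families.Poisson()).fit()
print(model.summary())
```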

    The bridge of dreams: Towards a method for operational performance alignment in IT-enabled service supply chains

    Concerns about performance alignment, especially business-IT alignment, have been around for three decades. It is still considered one of the most important driving forces for business success, as well as one of the top concerns of many practitioners and organizational researchers, and it is found to be a major issue in two thirds of digital transformation projects. Researchers in diverse disciplines have attempted to tackle this issue, but they have worked separately, and the research appears under various forms and names. This dissertation presents interdisciplinary research focused on identifying operational performance alignment issues and on discovering and assessing their root causes, with attention to the dynamics of operating an IT-enabled service supply chain (SSC). It contributes a communication-centred instrument that modularizes a complex SSC into a hierarchically structured set of services and analyzes the performance causality between them. With a special focus on the impact of IT, this makes it possible to monitor and tune various performance issues in an SSC. The research is intended to provide a solution-oriented common ground where multiple service research streams can meet. Following the framework proposed here, services at different tiers of an SSC are modelled with a balanced perspective on business components, technical service components, and KPIs. This allows a holistic picture of service performance and interactions throughout the entire supply chain to be viewed through a different research lens, and permits the causal impact of technology, business strategy, and service operations on supply chain performance to be unveiled.
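
    The hierarchical service structure the dissertation proposes can be pictured with a small sketch: each service carries its own KPIs and its sub-services, so a performance shortfall can be traced down the chain. The schema and KPI names below are illustrative assumptions, not the instrument itself.

```python
# Illustrative hierarchy of services with KPIs; tracing walks the tree to
# find which sub-service drags down a supply-chain-level KPI.
from dataclasses import dataclass, field

@dataclass
class Service:
    name: str
    kpis: dict = field(default_factory=dict)        # e.g. {"availability": 0.999}
    components: list = field(default_factory=list)  # sub-services it depends on

    def trace(self, kpi: str, threshold: float) -> list:
        """Report every service in the hierarchy whose KPI misses the threshold."""
        hits = [self.name] if self.kpis.get(kpi, 1.0) < threshold else []
        for child in self.components:
            hits += child.trace(kpi, threshold)
        return hits

billing = Service("billing-api", {"availability": 0.97})
portal = Service("customer-portal", {"availability": 0.999}, [billing])
print(portal.trace("availability", 0.99))  # -> ['billing-api']
```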

    A manifesto for future generation cloud computing: research directions for the next decade

    The Cloud computing paradigm has revolutionised the computing landscape over the past decade and has enabled the emergence of computing as the fifth utility. It has captured significant attention from academia, industry, and government bodies, and has become the backbone of the modern economy by offering subscription-based services anytime, anywhere, following a pay-as-you-go model. This has instigated (1) shorter establishment times for start-ups, (2) the creation of scalable global enterprise applications, (3) better cost-to-value associativity for scientific and high-performance computing applications, and (4) different invocation/execution models for pervasive and ubiquitous applications. Recent technological developments and paradigms such as serverless computing, software-defined networking, the Internet of Things, and processing at the network edge are creating new opportunities for Cloud computing. However, they also pose several new challenges and create the need for new approaches and research strategies, as well as the re-evaluation of the models that were developed to address issues such as scalability, elasticity, reliability, security, sustainability, and application models. The proposed manifesto addresses these by identifying the major open challenges in Cloud computing, emerging trends, and impact areas. It then offers research directions for the next decade, thus helping in the realisation of Future Generation Cloud Computing.

    Epitope-specific antibody responses differentiate COVID-19 outcomes and variants of concern

    BACKGROUND. The role of humoral immunity in COVID-19 is not fully understood, owing in large part to the complexity of antibodies produced in response to SARS-CoV-2 infection. There is a pressing need for serology tests to assess patient-specific antibody response and predict clinical outcome. METHODS. Using SARS-CoV-2 proteome and peptide microarrays, we screened plasma samples from 146 COVID-19 patients to identify antigens and epitopes. This enabled us to develop a master epitope array and an epitope-specific agglutination assay to gauge antibody responses systematically and with high resolution. RESULTS. We identified linear epitopes from the spike (S) and nucleocapsid (N) proteins and showed that the epitopes enabled higher-resolution antibody profiling than the S or N protein antigens. Specifically, we found that antibody responses to the S-811–825, S-881–895, and N-156–170 epitopes correlated negatively or positively with clinical severity or patient survival. Moreover, we found that the P681H and S235F mutations associated with the coronavirus variant of concern B.1.1.7 altered the specificity of the corresponding epitopes. CONCLUSION. Epitope-resolved antibody testing not only affords a high-resolution alternative to conventional immunoassays for delineating the complex humoral immunity to SARS-CoV-2 and differentiating between neutralizing and non-neutralizing antibodies, but may also be used to predict clinical outcome. The epitope peptides can be readily modified to detect antibodies against variants of concern in both the peptide array and latex agglutination formats. FUNDING. Ontario Research Fund (ORF) COVID-19 Rapid Research Fund, Toronto COVID-19 Action Fund, Western University, Lawson Health Research Institute, London Health Sciences Foundation, and Academic Medical Organization of Southwestern Ontario (AMOSO) Innovation Fund.

    Specification and unattended deployment of home networks at the edge of the network

    Consumer devices continue to expand their capabilities by connecting to digital services and other devices to form information-sharing ecosystems. This is complex: devices must meet connection requirements and minimum processing capabilities to ensure communication. The emergence of new services and the evolution of current technologies constantly redefine the rules of the game, opening up new possibilities and increasing competition among service providers. Paradigms such as edge computing, softwarization of physical devices, self-configuration mechanisms, definition of software as code, and interoperability between devices establish design principles to be taken into account in future service infrastructures. This work analyzes these principles and presents a programmable architecture in which services and virtual devices are instantiated on request in any computing infrastructure, such as cloud or edge computing, according to the needs specified by service providers or users. Since the target computing infrastructures are heterogeneous, the solution defines network elements and provides network templates to ensure it can be deployed on different infrastructures irrespective of vendor. A prototype has been developed and tested on a virtualized cloud-based home network relying on open source solutions. This work was supported in part by the Project MAGOS under Grant TEC2017-84197-C4-1-R; in part by the Comunidad de Madrid (Spain) through the Project CYNAMON under Grant P2018/TCS-4566; and in part by the European Structural Funds (ESF and FEDER).
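
    The template-driven deployment idea can be sketched as follows: one abstract description of the virtual home network is mapped onto whatever the target infrastructure offers. Element names, flavors, and the two infrastructure profiles below are hypothetical, standing in for the paper's network templates.

```python
# Hypothetical vendor-neutral network template rendered for a concrete
# target infrastructure (cloud or edge), echoing the paper's idea that
# one specification should deploy anywhere.
HOME_NETWORK_TEMPLATE = {
    "elements": [
        {"name": "virtual-router", "cpu": 1, "ram_mb": 512},
        {"name": "firewall",       "cpu": 1, "ram_mb": 256},
        {"name": "media-server",   "cpu": 2, "ram_mb": 1024},
    ],
}

# Per-infrastructure instance flavors stand in for vendor heterogeneity.
INFRA_FLAVORS = {
    "cloud": {"virtual-router": "m1.small", "firewall": "m1.tiny",  "media-server": "m1.medium"},
    "edge":  {"virtual-router": "edge.low", "firewall": "edge.low", "media-server": "edge.high"},
}

def render(template: dict, infra: str) -> list:
    """Instantiate the abstract template for one concrete infrastructure."""
    flavors = INFRA_FLAVORS[infra]
    return [{**e, "flavor": flavors[e["name"]]} for e in template["elements"]]

print(render(HOME_NETWORK_TEMPLATE, "edge"))
```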

    Service-oriented models for audiovisual content storage

    What are the important topics to understand for anyone involved with storage services holding digital audiovisual content? This report looks at how content is created and moves into and out of storage; the storage service value networks and architectures found now and expected in the future; what sort of data transfer is expected to and from an audiovisual archive; which transfer protocols to use; and a summary of security and interface issues.