
    Formal Verification of Input-Output Mappings of Tree Ensembles

    Recent advances in machine learning and artificial intelligence are now being considered in safety-critical autonomous systems, where software defects may cause severe harm to humans and the environment. Design organizations in these domains are currently unable to provide convincing arguments that their systems are safe to operate when machine learning algorithms are used to implement their software. In this paper, we present an efficient method to extract equivalence classes from decision trees and tree ensembles, and to formally verify that their input-output mappings comply with requirements. The idea is that, given that safety requirements can be traced to desirable properties of system input-output patterns, positive verification outcomes can be used in safety arguments. This paper presents the implementation of the method in the tool VoTE (Verifier of Tree Ensembles) and evaluates its scalability on two case studies from the current literature. We demonstrate that our method is practical for tree ensembles trained on low-dimensional data with up to 25 decision trees and tree depths of up to 20. Our work also studies the limitations of the method with high-dimensional data and preliminarily investigates the trade-off between a large number of trees and the time taken for verification.
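    The abstract does not spell out how the equivalence classes are obtained; a minimal sketch of one common reading, deriving axis-aligned input boxes from a single decision tree and checking a requirement on each box, is shown below. The node layout, names, and the requirement are illustrative assumptions, not the VoTE implementation.

```python
# Hypothetical sketch: deriving input-space equivalence classes from one
# decision tree as axis-aligned boxes, then checking a requirement on each.
# Node layout and names are illustrative, not the VoTE data structures.
from dataclasses import dataclass
from typing import Optional, List, Tuple

@dataclass
class Node:
    feature: Optional[int] = None    # None for a leaf
    threshold: float = 0.0
    left: Optional["Node"] = None    # branch taken when x[feature] <= threshold
    right: Optional["Node"] = None
    value: Optional[float] = None    # leaf output

Box = List[Tuple[float, float]]      # per-feature [low, high] intervals

def equivalence_classes(node: Node, box: Box) -> List[Tuple[Box, float]]:
    """Each leaf induces one equivalence class: the box of inputs that reach it."""
    if node.feature is None:
        return [(box, node.value)]
    lo, hi = box[node.feature]
    classes = []
    if lo <= node.threshold:                       # left branch is reachable
        left_box = list(box)
        left_box[node.feature] = (lo, min(hi, node.threshold))
        classes += equivalence_classes(node.left, left_box)
    if hi > node.threshold:                        # right branch is reachable
        right_box = list(box)
        right_box[node.feature] = (max(lo, node.threshold), hi)
        classes += equivalence_classes(node.right, right_box)
    return classes

def verify(tree: Node, input_box: Box, requirement) -> bool:
    """The input-output mapping complies iff every equivalence class does."""
    return all(requirement(box, out) for box, out in equivalence_classes(tree, input_box))

# Usage: require the output to stay non-negative over the whole input domain.
tree = Node(feature=0, threshold=0.5,
            left=Node(value=0.0), right=Node(value=1.0))
print(verify(tree, [(0.0, 1.0)], lambda box, out: out >= 0.0))   # True
```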

    A Taxonomy for Management and Optimization of Multiple Resources in Edge Computing

    Edge computing is promoted to meet the increasing performance needs of data-driven services by using computational and storage resources close to the end devices, at the edge of the current network. To achieve higher performance in this new paradigm, one has to consider how to combine the efficiency of resource usage at all three layers of the architecture: end devices, edge devices, and the cloud. While cloud capacity is elastically extendable, end devices and edge devices are to various degrees resource-constrained. Hence, efficient resource management is essential to make edge computing a reality. In this work, we first present terminology and architectures to characterize current works within the field of edge computing. Then, we review a wide range of recent articles and categorize relevant aspects in terms of four perspectives: resource type, resource management objective, resource location, and resource use. This taxonomy and the ensuing analysis are used to identify gaps in the existing research. Among several research gaps, we found that research is less prevalent on data, storage, and energy as resources, and less extensive towards the estimation, discovery, and sharing objectives. As for resource types, the most well-studied resources are computation and communication resources. Our analysis shows that resource management at the edge requires a deeper understanding of how methods applied at different levels and geared towards different resource types interact. Specifically, the impact of mobility and of collaboration schemes requiring incentives is expected to differ in edge architectures compared to classic cloud solutions. Finally, we find that fewer works are dedicated to the study of non-functional properties or to quantifying the footprint of resource management techniques, including edge-specific means of migrating data and services. (Accepted in the Special Issue on Mobile Edge Computing of the Wireless Communications and Mobile Computing journal.)

    Towards an evidence base in the treatment of severe febrile illness in East African children

    Febrile illness is the primary cause of childhood outpatient attendance, admission to hospital and death in Africa. This series of studies was aimed at ascertaining the treatable causes of infection in children admitted to a district hospital typical of those found throughout East Africa, in an area of high transmission of malaria. The studies were also designed to determine the clinical correlates of infection and predictors of mortality, looking in particular at malaria, invasive bacterial disease and HIV infection. These studies also explored to what extent clinical examination by one group of staff was replicable by another. After informed consent, a detailed history and structured examination were performed on all children admitted to the hospital. Blood was drawn for culture, microscopy for malaria, HIV testing, full blood count, bedside haemoglobin, blood glucose and lactate measurement, and an HRP-2 based rapid diagnostic test for falciparum malaria. Outcomes were recorded at death or discharge.

    Sufficient data were available on 3,639 children, including 184 deaths (5.1%). Invasive bacterial disease was detected in 341 children (9.4%) and HIV in 142 (3.9%). Children with HIV and those with evidence of recent malaria were significantly more likely to have invasive bacterial disease. The most common organisms isolated were non-typhi Salmonella (46.9%), Streptococcus pneumoniae (16.4%) and Haemophilus influenzae b (11.4%). The most frequently encountered pathogen was P. falciparum, with 2,195 children found to have asexual parasitaemia (60.3%). Falciparum parasitaemia was detected in 100 children with invasive bacterial disease (29.3%). Falciparum malaria was detected in over half (51.6%) of childhood deaths; invasive bacterial disease was documented in 31.5%.

    In children with a positive blood slide for malaria, WHO severe malaria criteria identified 91.6% of the children that died. A multivariate analysis showed that signs of malnutrition, respiratory distress, altered consciousness, hypoxia according to pulse oximetry, hypoglycaemia, raised blood lactate, invasive bacterial disease and female sex were all associated with an increased risk of death. In children with negative blood slides, signs of malnutrition, respiratory distress, altered consciousness, hypoglycaemia, raised blood lactate and invasive bacterial disease were all independently associated with mortality by multivariate analysis.

    WHO-defined criteria for syndromes that would warrant antibiotics predicted 56% of cases of coinfection with invasive bacterial disease and malaria, and 69.7% of cases of invasive bacterial disease in slide-negative children. Treating all children with severe malaria for bacterial disease would result in 71% of children with coinfection being treated. In children with negative slides, including severe anaemia or prostration as syndromes requiring antibiotic therapy would have resulted in 74.7% of children with invasive bacterial disease receiving antibiotic therapy.

    There was moderate agreement between staff over the presence of clinical signs in children, with hospital nurses performing as well as hospital clinical officers. Agreement was better in children over 18 months of age and in children who were not crying during examination.

    Current WHO guidelines on antibiotic use performed poorly in this setting. Gram-negative infections were the most common cause of invasive infection, and many of these are likely to be resistant to penicillin and other commonly used antibiotics. Consideration should be given to expanding the indications for antibiotic use and to using more broad-spectrum antibiotics in severely ill children.

    Challenges to clinical research in a rural African hospital: a personal perspective from Tanzania.

    This article is based on a talk given at the Japanese Society for Tropical Medicine Annual Meeting in 2014. The severe febrile illness study was established in 2005. The aim of the project was to define the aetiology of febrile disease in children admitted to a hospital in Tanzania. Challenges arose in many areas. STUDY DESIGN: An initial plan to recruit only the severely ill was revised to enroll all febrile admissions, leading to a more comprehensive dataset but much increased costs. Operationally, a decision was made to set up a paediatric acute admissions unit (PAAU) in the hospital to facilitate recruitment and to provide appropriate initial care in line with perceived ethical obligations. This had knock-on effects relating to the responsibilities that were taken on, but also some unexpected positive outcomes. STUDY PERSONNEL: Local research staff were sometimes called upon to make up temporary shortfalls in the hospital staffing. Lack of staff made it impossible to recruit patients around the clock, seven days a week, creating the challenge of ensuring representative sampling. QUALITY CONTROL: Studies based on clinical examination create unique quality control challenges: how to ensure that clinical staff are examining in a systematic and reproducible way. We designed a sub-study both to explore this and to improve quality. SUMMARY: Setting up clinical research projects in severely resource-poor settings creates many challenges, including those of an operational, technical and ethical nature. Whilst there are no 'right answers', an awareness of these problems can help overcome them.

    Understanding Shared Memory Bank Access Interference in Multi-Core Avionics

    Deployment of multi-core platforms in safety-critical applications requires reliable estimation of worst-case response time (WCRT) for critical processes. Determining WCRT requires accurately estimating and measuring the interference arising from multiple processes and multiple cores. Earlier works have proposed frameworks in which CPU, shared cache, and shared memory (DRAM) interference can be estimated using application- and platform-dependent parameters. In this work, we examine a recent framework in which a single core equivalent (SCE) worst-case execution time is used as a basis for deriving WCRT. We describe the specific requirements in an avionics context, including the sharing of memory banks by multiple processes on multiple cores, and adapt the SCE framework to account for them. We present the adaptations needed in a real-time operating system to enforce the requirements, and a methodology for validating the theoretical WCRT through measurements on the resulting platform. The work reveals that the framework indeed provides a (pessimistic) bound on the WCRT. It also discloses that the maximum interference for memory accesses does not arise when all cores share the same memory bank.
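    The abstract leaves the shape of the bound implicit; as a hedged illustration, an additive SCE-style bound could be sketched as below. The symbols, the additive structure, and the per-core interference terms are assumptions for illustration, not the exact framework examined in the paper.

```python
# Hypothetical sketch of an additive SCE-style bound: the single-core-equivalent
# execution time plus per-resource interference terms summed over competing cores.
# Parameter names and the additive structure are assumptions for illustration.
from typing import List

def wcrt_bound(wcet_sce: float,
               cache_interference: List[float],
               dram_interference: List[float],
               cpu_interference: float = 0.0) -> float:
    """Pessimistic worst-case response time for one critical process."""
    return (wcet_sce
            + cpu_interference
            + sum(cache_interference)   # one shared-cache term per competing core
            + sum(dram_interference))   # shared memory-bank delay per competing core

# Usage: 3 competing cores, each adding cache and DRAM (memory-bank) delay.
print(wcrt_bound(10.0, [1.2, 1.2, 1.2], [0.8, 0.8, 0.8]))  # 16.0 (time units)
```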

    Boumediene v. Bush: Another Chapter in the Court’s Jurisprudence on Civil Liberties at Guantanamo Bay

    A recent surge in the usage of instant messaging (IM) applications on mobile devices has brought the energy efficiency of these applications into focus. Although IM applications are changing the message communication landscape, this work illustrates that current versions of IM applications differ vastly in energy consumption when using third-generation (3G) cellular communication. This paper shows the interdependency between energy consumption and IM data patterns in this context. We analyse the user interaction pattern using an IM dataset consisting of 1,043,370 messages collected from 51 mobile users. Based on the usage characteristics, we propose a message bundling technique that aggregates consecutive messages over time, reducing energy consumption with a trade-off against latency. The results show that message bundling can save up to 43% in energy consumption while still maintaining the conversation function. Finally, we evaluate the energy cost of a common IM feature that indicates when a user is typing a response, the so-called typing notification, showing an energy increase ranging from 40% to 104%.
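    As a rough illustration of the bundling idea, the sketch below holds consecutive outgoing messages for a fixed window and sends them as one transmission, so the radio pays its high-energy tail once per bundle rather than once per message. The window length, class names, and API are assumptions, not the implementation evaluated in the paper.

```python
# Hypothetical sketch of time-based message bundling: consecutive outgoing
# messages are held for up to `window` seconds and sent as one cellular
# transmission, trading latency for fewer high-energy radio activations.
import time
from typing import Callable, List, Optional

class MessageBundler:
    def __init__(self, send_bundle: Callable[[List[str]], None], window: float = 5.0):
        self.send_bundle = send_bundle       # actual radio transmission (assumed given)
        self.window = window                 # bundling window: energy vs. latency trade-off
        self.buffer: List[str] = []
        self.deadline: Optional[float] = None

    def enqueue(self, message: str) -> None:
        if not self.buffer:                  # first message opens the window
            self.deadline = time.monotonic() + self.window
        self.buffer.append(message)

    def poll(self) -> None:
        """Call periodically; flushes the bundle once the window expires."""
        if self.buffer and time.monotonic() >= self.deadline:
            self.send_bundle(self.buffer)
            self.buffer, self.deadline = [], None

# Usage: messages composed within 5 s leave the handset as one transmission.
bundler = MessageBundler(send_bundle=lambda msgs: print("sending", msgs))
bundler.enqueue("hi")
bundler.enqueue("are you there?")
time.sleep(5.1)
bundler.poll()                               # prints: sending ['hi', 'are you there?']
```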

    Point-of-care measurement of blood lactate in children admitted with febrile illness to an African District Hospital.

    BACKGROUND: Lactic acidosis is a consistent predictor of mortality owing to severe infectious disease, but its detection in low-income settings is limited to the clinical sign of "deep breathing" because of the lack of accessible technology for its measurement. We evaluated the use of a point-of-care (POC) diagnostic device for blood lactate measurement to assess the severity of illness in children admitted to a district hospital in Tanzania. METHODS: Children between the ages of 2 months and 13 years with a history of fever were enrolled in the study during a period of 1 year. A full clinical history and examination were undertaken, and blood was collected for culture, microscopy, complete blood cell count, and POC measurement of blood lactate and glucose. RESULTS: The study included 3248 children, of whom 164 (5.0%) died; 45 (27.4%) of these had raised levels of blood lactate (>5 mmol/L) but no deep breathing. Compared with mortality in children with lactate levels of ≤3 mmol/L, the unadjusted odds of dying were 1.6 (95% confidence interval [CI], 0.8-3.0), 3.4 (95% CI, 1.5-7.5), and 8.9 (95% CI, 4.7-16.8) in children with blood lactate levels of 3.1-5.0, 5.1-8.0, or >8.0 mmol/L, respectively. The prevalence of raised lactate levels (>5 mmol/L) was greater in children with malaria than in children with nonmalarial febrile illness (P < .001), although the associated mortality was greater in slide-negative children. CONCLUSIONS: POC lactate measurement can contribute to the assessment of children admitted to hospital with febrile illness and can also create an opportunity for more hospitals in resource-poor settings to participate in clinical trials of interventions to reduce mortality associated with hyperlactatemia.
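    The unadjusted odds ratios quoted above come from comparing each lactate band against the ≤3 mmol/L reference group; a worked sketch of that calculation is given below. The counts are made up for illustration; only the standard 2x2-table formulas are the point.

```python
# Worked sketch of an unadjusted odds ratio with a 95% CI (Woolf/log method).
# The counts below are made up for illustration; only the formulas mirror how
# such ratios are typically computed from a 2x2 table.
import math

def odds_ratio(deaths_exposed, survivors_exposed, deaths_ref, survivors_ref):
    or_ = (deaths_exposed * survivors_ref) / (survivors_exposed * deaths_ref)
    se = math.sqrt(1/deaths_exposed + 1/survivors_exposed + 1/deaths_ref + 1/survivors_ref)
    lo, hi = or_ * math.exp(-1.96 * se), or_ * math.exp(1.96 * se)
    return or_, (lo, hi)

# Usage: children with lactate > 8 mmol/L vs. the <= 3 mmol/L reference group
# (hypothetical counts).
print(odds_ratio(deaths_exposed=40, survivors_exposed=160,
                 deaths_ref=50, survivors_ref=1800))
```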

    Spatial inequalities of public employment (A közfoglalkoztatás térbeli egyenlőtlenségei)

    In the event of a disaster, telecommunication infrastructures can be severely damaged or overloaded. Hastily formed networks can provide communication services in an ad hoc manner. These networks are challenging due to the chaotic context, where intermittent connection is the norm and the identity and number of participants cannot be assumed. In such environments, malicious actors may try to disrupt the communications to create more chaos for their own benefit. This paper proposes a general security framework for monitoring and reacting to disruptive attacks. It includes a collection of functions to detect anomalies, diagnose them, and perform mitigation. The measures are deployed in each node in a fully distributed fashion, but their collective impact is a significant resilience to attacks, so that actors can disseminate information under adverse conditions. The approach is evaluated in the context of a simulated disaster area network with a many-cast dissemination protocol, Random Walk Gossip, with a store-and-forward mechanism. A challenging threat model is adopted, in which adversaries may (1) try to drain resources at both the node level (battery life) and the network level (bandwidth), or (2) reduce message dissemination in their vicinity without spending much of their own energy. The results demonstrate that the approach diminishes the impact of the attacks considerably. Funding: Swedish Civil Contingencies Agency (MSB); the national Graduate School in Computer Science (CUGS); project Hastily Formed Networks.
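    As a hedged per-node illustration of the detect-diagnose-mitigate cycle described above, the sketch below classifies neighbours from simple traffic counters and blocks those diagnosed as resource-draining or message-absorbing. The metrics, thresholds, and reactions are assumptions, not the detectors used in the paper.

```python
# Hypothetical per-node sketch of the detect -> diagnose -> mitigate loop run
# fully distributed at every node; metrics, thresholds, and reactions are
# illustrative assumptions, not the paper's detectors.
from dataclasses import dataclass

@dataclass
class NeighbourStats:
    msgs_received: int      # messages this neighbour sent us
    msgs_forwarded: int     # messages we overheard it forwarding onwards
    requests_made: int      # transmissions it asked us to perform

def diagnose(s: NeighbourStats) -> str:
    if s.requests_made > 10 * max(s.msgs_forwarded, 1):
        return "draining"   # drains our battery/bandwidth without contributing
    if s.msgs_received > 0 and s.msgs_forwarded == 0:
        return "grey-hole"  # absorbs messages, reducing dissemination nearby
    return "normal"

def mitigate(neighbour_id: str, verdict: str, blocklist: set) -> None:
    if verdict != "normal":
        blocklist.add(neighbour_id)   # stop serving / routing via this neighbour

# Usage inside a node's periodic monitoring cycle.
blocklist: set = set()
mitigate("node-17", diagnose(NeighbourStats(50, 0, 3)), blocklist)
print(blocklist)  # {'node-17'}
```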

    NetGAP: A Graph-Grammar approach for concept design of networked platforms with extra-functional requirements

    During the concept design of complex networked systems, concept developers have to assure that the choice of hardware modules and the topology of the target platform will provide adequate resources to support the needs of the application. For example, future-generation aerospace systems need to consider multiple requirements, with many trade-offs, foreseeing rapid technological change and a long time span for realization and service. For that purpose, we introduce NetGAP, an automated three-phase approach to synthesize network topologies and support the exploration and concept design of networked systems with multiple requirements, including dependability, security, and performance. NetGAP represents the possible interconnections between hardware modules using a graph grammar and uses a Monte Carlo Tree Search optimization to generate candidate topologies from the grammar while aiming to satisfy the requirements. We apply the proposed approach to a synthetic version of a realistic avionics application use case and show the merits of the solution in supporting the early-stage exploration of alternative candidate topologies. The method is shown to vividly characterize the topology-related trade-offs between requirements stemming from security, fault tolerance, timeliness, and the "cost" of adding new modules or links. Finally, we discuss the flexibility of the approach when changes in the application and its requirements occur.
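    To make the grammar-guided synthesis more concrete, the sketch below grows candidate topologies by repeatedly applying toy grammar rules and scores them against a multi-requirement objective. It is a random-rollout simplification, not the Monte Carlo Tree Search used by NetGAP, and the rules, scoring terms, and weights are illustrative assumptions.

```python
# Hypothetical sketch: growing a topology by applying graph-grammar-style rules
# and scoring candidates against competing requirements. A random-rollout
# simplification of the search, not NetGAP's MCTS; rules and weights are made up.
import random

def rule_add_switch(topology):           # grammar rule: insert a new switch node
    t = dict(topology); t["switches"] += 1; return t

def rule_add_link(topology):             # grammar rule: add a redundant link
    t = dict(topology); t["links"] += 1; return t

RULES = [rule_add_switch, rule_add_link]

def score(t):
    redundancy = min(t["links"] / max(t["switches"], 1), 2.0)   # fault tolerance proxy
    cost = 0.1 * t["switches"] + 0.05 * t["links"]               # "cost" of added hardware
    return redundancy - cost                                     # requirement trade-off

def search(n_rollouts=1000, depth=8):
    best, best_score = None, float("-inf")
    for _ in range(n_rollouts):
        t = {"switches": 1, "links": 1}                          # grammar start symbol
        for _ in range(depth):
            t = random.choice(RULES)(t)                          # apply a random rule
        if score(t) > best_score:
            best, best_score = t, score(t)
    return best, best_score

print(search())
```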

    Watts2Share: Energy-Aware Traffic Consolidation

    Energy consumption is becoming the Achilles' heel of the mobile user quality of experience, partly due to undisciplined use of cellular (3G) transmissions by applications. The operator infrastructure is typically configured for peak performance, whereas during periods of underutilisation the handsets pay the price by staying in high energy states even if each application only uses a fraction of the maximum available bandwidth. In this paper we promote a bi-radio scenario where, instead of independently using their own cellular connections, several users share a single cellular link offered by one member of a coalition (a rotating aggregator). We present Watts2Share, an architecture for energy-aware traffic consolidation whereby group members' data flows, transmitted through a second radio (e.g., WiFi), are aggregated by the aggregator and retransmitted through the cellular link. Through careful and repeatable studies we demonstrate that this scheme saves up to 68% of the total transmission energy in handsets compared to a pure 3G scenario. The studies are based on a wide range of real traffic traces and real cellular operator settings, and further illustrate that this scheme reduces the overall energy by reducing the signalling overhead, as well as extending the lifetime of all handsets.
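    A back-of-the-envelope sketch of why consolidation saves energy: several handsets each holding their own 3G radio in a high-power state are compared with one rotating aggregator doing so while the others relay over WiFi. All power figures below are illustrative assumptions, not measurements from the paper.

```python
# Back-of-the-envelope sketch of the consolidation saving: every handset keeping
# its own 3G radio in a high-energy state vs. one rotating aggregator doing so
# while the others relay over WiFi. Power figures are illustrative assumptions.
def total_energy_independent(n_handsets, duration_s, p_3g_active=1.2):
    return n_handsets * p_3g_active * duration_s             # joules

def total_energy_consolidated(n_handsets, duration_s, p_3g_active=1.2, p_wifi=0.6):
    aggregator = p_3g_active * duration_s                     # one shared 3G link
    members = (n_handsets - 1) * p_wifi * duration_s          # WiFi hops to the aggregator
    return aggregator + members

n, t = 5, 60.0
e_ind = total_energy_independent(n, t)
e_con = total_energy_consolidated(n, t)
print(f"saving: {100 * (1 - e_con / e_ind):.0f}%")            # roughly 40% with these numbers
```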