Event Stream Processing with Multiple Threads
Current runtime verification tools seldom make use of multi-threading to
speed up the evaluation of a property on a large event trace. In this paper, we
present an extension to the BeepBeep 3 event stream engine that allows the use
of multiple threads during the evaluation of a query. Various parallelization
strategies are presented and described on simple examples. The implementation
of these strategies is then evaluated empirically on a sample of problems.
Compared to the previous, single-threaded version of the BeepBeep engine, the
allocation of just a few threads to specific portions of a query provides
dramatic improvements in running time.
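The parallelization idea described in this abstract, splitting a large event trace into windows evaluated concurrently and merging the partial results, can be sketched as follows. This is a minimal illustration, not BeepBeep's actual API (BeepBeep is a Java engine); all names here are hypothetical, and Python threads are used only to show the strategy's structure.

```python
from concurrent.futures import ThreadPoolExecutor

def count_matches(window, predicate):
    """Evaluate a simple stateless property on one slice of the trace."""
    return sum(1 for event in window if predicate(event))

def parallel_count(trace, predicate, num_threads=4):
    """Split the trace into contiguous windows, evaluate each window in
    its own thread, then merge the partial results."""
    size = max(1, len(trace) // num_threads)
    windows = [trace[i:i + size] for i in range(0, len(trace), size)]
    with ThreadPoolExecutor(max_workers=num_threads) as pool:
        partials = pool.map(count_matches, windows,
                            [predicate] * len(windows))
    return sum(partials)

trace = list(range(1000))
print(parallel_count(trace, lambda e: e % 2 == 0))  # -> 500
```

Note that this merge step only works for properties whose evaluation decomposes over windows (here, a count); stateful properties that span window boundaries need the more elaborate strategies the paper describes.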
CERN openlab Whitepaper on Future IT Challenges in Scientific Research
This whitepaper describes the major IT challenges in scientific research at CERN and several other European and international research laboratories and projects. Each challenge is exemplified through a set of concrete use cases drawn from the requirements of large-scale scientific programs. The paper is based on contributions from many researchers and IT experts of the participating laboratories, as well as input from the existing CERN openlab industrial sponsors. The views expressed in this document are those of the individual contributors and do not necessarily reflect the views of their organisations and/or affiliates.
A Specification Language for Performance and Economical Analysis of Short Term Data Intensive Energy Management Services
Requirements of Energy Management Services include short- and long-term processing of data in a massively interconnected scenario. The complexity and variety of short-term applications need methodologies that allow designers to reason about the models, taking into account functional and non-functional requirements. In this paper we present a component-based specification language for building trustworthy continuous dataflow applications. Component behaviour is defined by Petri Nets, in order to carry over to the methodology all the advantages of a mathematically based executable model: support for analysis, verification, simulation and performance evaluation. The paper illustrates how to model and reason with specifications of advanced dataflow abstractions such as smart grids.
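The Petri-Net semantics mentioned in this abstract rests on two basic operations: checking whether a transition is enabled and firing it. A minimal sketch (the net and place names below are hypothetical, not taken from the paper):

```python
def enabled(marking, transition):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

def fire(marking, transition):
    """Fire an enabled transition: consume tokens from input places,
    produce tokens in output places. Returns the new marking."""
    assert enabled(marking, transition)
    m = dict(marking)
    for p, n in transition["in"].items():
        m[p] -= n
    for p, n in transition["out"].items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical two-place net: a 'receive' transition moves one token
# from an input buffer to a processing stage.
receive = {"in": {"buffer": 1}, "out": {"processing": 1}}
m0 = {"buffer": 2, "processing": 0}
m1 = fire(m0, receive)  # -> {"buffer": 1, "processing": 1}
```

Because firing is a pure function of the marking, the reachable state space can be enumerated mechanically, which is what makes verification, simulation and performance evaluation tractable on such models.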
LightBox: Full-stack Protected Stateful Middlebox at Lightning Speed
Running off-site software middleboxes at third-party service providers has
been a popular practice. However, routing large volumes of raw traffic, which
may carry sensitive information, to a remote site for processing raises severe
security concerns. Prior solutions often abstract away important factors
pertinent to real-world deployment. In particular, they overlook the
significance of metadata protection and stateful processing. Unprotected
traffic metadata, such as low-level headers, sizes and counts, can be exploited
to learn supposedly encrypted application contents. Meanwhile, tracking the states
of 100,000s of flows concurrently is often indispensable in production-level
middleboxes deployed at real networks.
We present LightBox, the first system that can drive off-site middleboxes at
near-native speed with stateful processing and the most comprehensive
protection to date. Built upon commodity trusted hardware, Intel SGX, LightBox
is the product of our systematic investigation of how to overcome the inherent
limitations of secure enclaves using domain knowledge and customization. First,
we introduce an elegant virtual network interface that allows convenient access
to fully protected packets at line rate without leaving the enclave, as if from
the trusted source network. Second, we provide complete flow state management
for efficient stateful processing, by tailoring a set of data structures and
algorithms optimized for the highly constrained enclave space. Extensive
evaluations demonstrate that LightBox, with all security benefits, can achieve
10Gbps packet I/O, and that with case studies on three stateful middleboxes, it
can operate at near-native speed.
Comment: Accepted at ACM CCS 201
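The flow state management problem this abstract raises, keeping state for hundreds of thousands of concurrent flows inside the highly constrained enclave space, is commonly addressed with a fixed-capacity in-enclave table that evicts cold flows (e.g. to encrypted untrusted memory). The sketch below illustrates that general pattern only; it is not LightBox's actual data structure, and all names are hypothetical.

```python
from collections import OrderedDict

class FlowStateTable:
    """Fixed-capacity flow table. When full, the least-recently-used
    flow's state is evicted, standing in for a spill to encrypted
    storage outside the enclave."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.flows = OrderedDict()

    def lookup(self, flow_id):
        """Return the flow's state, marking it as recently used."""
        if flow_id in self.flows:
            self.flows.move_to_end(flow_id)
            return self.flows[flow_id]
        return None

    def update(self, flow_id, state):
        """Insert or refresh a flow, evicting the LRU entry if full."""
        if flow_id in self.flows:
            self.flows.move_to_end(flow_id)
        elif len(self.flows) >= self.capacity:
            self.flows.popitem(last=False)  # evict least-recently-used
        self.flows[flow_id] = state
```

In a real enclave setting the eviction path would seal the state before writing it out, and lookups on evicted flows would trigger a decrypt-and-reload; both are elided here.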
Quantitative Proteomic Analyses of Human Plasma: Application of Mass Spectrometry for the Discovery of Clinical Delirium Biomarkers
The biomarker discovery pipeline is a multi-step endeavor to identify potential diagnostic or prognostic markers of a disease. Although the advent of modern mass spectrometers has revolutionized the initial discovery phase, a significant bottleneck still exists when validating discovered biomarkers. In this doctoral research, I demonstrate that the discovery, verification and validation of biomarkers can all be performed using mass spectrometry and apply the biomarker pipeline to the context of clinical delirium.
First, a systematic review of recent literature provided a birds-eye view of untargeted, discovery proteomic attempts to find biomarkers of delirium in the geriatric population. Here, a comprehensive search of five databases yielded 1172 publications, from which eight peer-reviewed studies met our defined inclusion criteria. Despite the paucity of published studies that applied systems-biology approaches to biomarker discovery on the subject, the lessons learned and insights from this review were instrumental in the study design and proteomic analyses of plasma samples in our cohort.
We then performed a targeted study on four biomarkers for their potential mediation role in the occurrence of delirium after high-dose intra-operative oxygen treatment. Although S100B calcium binding protein (S100B), gamma enolase (ENO2), chitinase-3-like protein 1 (CHI3L1) and ubiquitin carboxyl-terminal hydrolase isozyme L1 (UCHL1) have well-documented associations with delirium, we did not find any such associations in our cohort. Of note, this study demonstrates that the use of targeted approaches for the purposes of biomarker discovery, rather than an untargeted, systems-biology approach, is unavoidably biased and may lead to misleading conclusions.
Lastly, we applied the lessons learned and comprehensively profiled the plasma samples of delirium cases and non-delirium cases, at both pre- and post-surgical timepoints. We found 16 biomarkers that are signatures of cardiopulmonary bypass, and 11 that are potential diagnostic candidates for delirium (AuROC = 93%). We validated the discovered biomarkers on the same mass spectrometry platform without the use of traditional affinity-based validation methods. Our discovery of novel biomarkers with no known association with delirium, such as serum amyloid A1 (SAA1) and A2 (SAA2), pepsinogen A3 (PEPA3) and cathepsin B (CATB), sheds new light on possible neuronal pathomechanisms.
A Model-Driven Approach for the Formal Verification of Storm-Based Streaming Applications
Data-intensive applications (DIAs) based on so-called Big Data technologies are nowadays a common solution adopted by IT companies to face their growing computational needs. The need for highly reliable applications able to handle huge amounts of data, together with the availability of infrastructures for distributed computing, rapidly led industries to develop frameworks for streaming and big-data processing, like Apache Storm and Spark. The definition of methodologies and principles for good software design is, therefore, fundamental to support the development of DIAs. This paper presents an approach for non-functional analysis of DIAs through D-VerT, a tool for the architectural assessment of Storm applications. The verification is based on a translation of Storm topologies into the CLTLoc metric temporal logic. It allows the designer of a Storm application to check for the existence of components that cannot process their workload in a timely manner, typically due to an incorrect design of the topology.
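The class of defect this abstract targets, a component that cannot keep up with its workload, can be approximated by a simple rate check on a topology: a component whose arrival rate meets or exceeds its service rate will accumulate an unbounded queue. This numeric sanity check is only a sketch of the kind of property involved; D-VerT itself works by translating the topology into CLTLoc formulas, not by the calculation below, and the example topology is hypothetical.

```python
def overloaded_components(topology, source_rate):
    """Flag components whose event arrival rate meets or exceeds their
    service rate (utilization >= 1), i.e. that cannot drain their queue.

    `topology` is a list of (name, service_rate, downstream_names)
    triples in topological order; the first component receives
    `source_rate`. Each component is assumed to forward every event it
    receives to all of its downstream components (a simplification)."""
    arrivals = {name: 0.0 for name, _, _ in topology}
    arrivals[topology[0][0]] = source_rate
    overloaded = []
    for name, service_rate, downstream in topology:
        if arrivals[name] >= service_rate:
            overloaded.append(name)
        for d in downstream:
            arrivals[d] += arrivals[name]
    return overloaded

# Hypothetical Storm-like topology: a spout feeding a filter bolt,
# which feeds a join bolt (rates in events per second).
topology = [("spout", 1000, ["filter"]),
            ("filter", 150, ["join"]),
            ("join", 80, [])]
print(overloaded_components(topology, 100))  # -> ['join']
```

A temporal-logic formulation such as D-VerT's is strictly more expressive: it can also capture time-bounded behaviours (bursts, delays, queue-length thresholds) that a steady-state rate check like this one cannot.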