Applying Formal Methods to Networking: Theory, Techniques and Applications
Despite its great importance, modern network infrastructure is remarkable for
the lack of rigor in its engineering. The Internet which began as a research
experiment was never designed to handle the users and applications it hosts
today. The lack of formalization of the Internet architecture meant limited
abstractions and modularity, especially for the control and management planes,
thus requiring a new protocol, built from scratch, for every new need. This led
to an unwieldy, ossified Internet architecture resistant to any attempt at
formal verification, and an Internet culture where expediency and pragmatism
are favored over formal correctness. Fortunately, recent work in the space of
clean slate Internet design---especially, the software defined networking (SDN)
paradigm---offers the Internet community another chance to develop the right
kind of architecture and abstractions. This has also led to a great resurgence
of interest in applying formal methods to the specification, verification, and
synthesis of networking protocols and applications. In this paper, we present a
self-contained tutorial of the formidable amount of work that has been done in
formal methods, and present a survey of their applications to networking. Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials.
The evaluation of dynamic human-computer interaction
This thesis describes the development and evaluation of a theoretical framework to account for the
dynamic aspects of behaviour at the Human-Computer Interface (HCI). The purpose behind this
work is to allow for the consideration of dynamic Human-Computer Interaction (HCI) in the design
of interactive computer systems, and to facilitate the generation of design tools for this purpose.
The work describes an example of a design tool which demonstrates how designers of interactive
computer systems may account for some aspects of the dynamics of behaviour, involved with the
use of computers, in the design of new interactive systems. The thesis offers empirical evidence
and evidence from the literature to support the validity of the dynamic factors governing the
interaction of humans with computers.
Proceedings of the 4th bwHPC Symposium
The bwHPC Symposium 2017 took place on October 4th, 2017, in the Alte Aula, Tübingen. It focused on presentations of scientific computing projects as well as on the progress and success stories of the bwHPC realization concept. The event offered a unique opportunity for an active dialogue between scientific users, operators of bwHPC sites, and the bwHPC support team.
True single-cell proteomics using advanced ion mobility mass spectrometry
In this thesis, I present the development of a novel mass spectrometry (MS) platform and scan modes in conjunction with a versatile and robust liquid chromatography (LC) platform, which addresses current sensitivity and robustness limitations in MS-based proteomics. I demonstrate how this technology benefits high-speed, ultra-high-sensitivity proteomics studies at large scale. This culminated in the first-of-its-kind label-free MS-based single-cell proteomics platform and its application to spatial tissue proteomics. I also investigate the vastly underexplored ‘dark matter’ of the proteome, validating novel microproteins that contribute to human cellular function.
First, we developed a novel trapped ion mobility spectrometry (TIMS) platform for proteomics applications, which multiplies sequencing speed and sensitivity by ‘parallel accumulation – serial fragmentation’ (PASEF), and applied it to the first high-sensitivity, large-scale projects in the biomedical arena. Next, to explore the collisional cross section (CCS) dimension in TIMS, we measured over 1 million peptide CCS values, which enabled us to train a deep learning model for CCS prediction based solely on the linear amino acid sequence. We also translated the principles of TIMS and PASEF to the field of lipidomics, highlighting parallel benefits in terms of throughput and sensitivity.
The core of my PhD is the development of a robust, ultra-high-sensitivity LC-MS platform for the high-throughput analysis of single-cell proteomes. Improvements in ion transfer efficiency, a robust very-low-flow LC, and a PASEF data-independent acquisition scan mode together increased measurement sensitivity by up to 100-fold. We quantified single-cell proteomes to a depth of up to 1,400 proteins per cell. A fundamental result from the comparisons to single-cell RNA sequencing data revealed that single cells have a stable core proteome, whereas the transcriptome is dominated by Poisson noise, emphasizing the need for both complementary technologies.
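The intuition behind the Poisson-noise claim is that sampling noise scales as one over the square root of the copy number, so low-copy mRNAs fluctuate far more than their high-copy protein counterparts. A toy simulation makes this concrete (the copy numbers below are illustrative assumptions, not values from the thesis):

```python
import math
import random
import statistics

random.seed(0)
n_cells = 1000

def poisson(mean):
    """Draw one Poisson sample: Knuth's method for small means,
    a normal approximation for large ones."""
    if mean > 50:
        return max(0, round(random.gauss(mean, math.sqrt(mean))))
    L, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Assumed toy copy numbers: a few mRNA copies per cell versus
# thousands of protein copies for the same gene.
mrna = [poisson(4) for _ in range(n_cells)]
protein = [poisson(4000) for _ in range(n_cells)]

def cv(xs):
    """Coefficient of variation = stdev / mean."""
    return statistics.stdev(xs) / statistics.mean(xs)

print(f"mRNA CV:    {cv(mrna):.2f}")
print(f"protein CV: {cv(protein):.3f}")
```

With these assumed means the mRNA coefficient of variation comes out around 0.5 while the protein CV is below 0.02, mirroring the observation that the transcriptome looks noisy where the core proteome looks stable.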
Building on our achievements with the single-cell proteomics technology, we elucidated the image-guided, spatially and cell-type resolved proteome in whole organs and tissues from minute sample amounts. We combined clearing of rodent and human organs, unbiased 3D imaging, target tissue identification, isolation, and unbiased MS-based proteomics to describe early-stage β-amyloid plaque proteome profiles in a disease model of familial Alzheimer’s. Automated, artificial-intelligence-driven isolation and pooling of single cells of the same phenotype allowed us to analyze the cell-type resolved proteome of cancer tissues, revealing remarkable spatial differences in the proteome.
Last, we systematically elucidated pervasive translation of noncanonical human open reading frames, combining state-of-the-art ribosome profiling, CRISPR screens, imaging, and MS-based proteomics. We performed unbiased analysis of small novel proteins and proved their physical existence by LC-MS as HLA peptides, as essential interaction partners of protein complexes, and as contributors to cellular function.
A Distributed Security Architecture for Large Scale Systems
This thesis describes the research leading from the conception, through development, to the practical
implementation of a comprehensive security architecture for use within, and as a value-added enhancement
to, the ISO Open Systems Interconnection (OSI) model.
The Comprehensive Security System (CSS) is arranged essentially as an Application Layer service but can
allow any of the ISO recommended security facilities to be provided at any layer of the model. It is
suitable as an 'add-on' service to existing arrangements or can be fully integrated into new applications.
For large-scale, distributed processing operations, a network of security management centres (SMCs) is
suggested, which helps to ensure that system misuse is minimised and that flexible operation is provided
efficiently.
The background to the OSI standards is covered in detail, followed by an introduction to security in open
systems. A survey of existing techniques in formal analysis and verification is then presented. The
architecture of the CSS is described in terms of a conceptual model using agents and protocols, followed
by an extension of the CSS concept to a large scale network controlled by SMCs.
A new approach to formal security analysis is described which is based on two main methodologies.
Firstly, every function within the system is built from layers of provably secure sequences of finite state
machines, using a recursive function to monitor and constrain the system to the desired state at all times.
Secondly, the correctness of the protocols generated by the sequences to exchange security information
and control data between agents in a distributed environment is analysed in terms of a modified temporal
Hoare logic. This is based on ideas concerning the validity of beliefs about the global state of a system
as a result of actions performed by entities within the system, including the notion of timeliness.
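The first methodology, layering provably secure finite state machines with a monitor that constrains the system to a desired state, can be sketched as follows. This is a hypothetical, minimal illustration: the state names, events, and transition table are invented for this example and are not taken from the CSS itself.

```python
# Hypothetical sketch of a monitored finite state machine: every
# transition is checked against an explicit allowed set, and any
# illegal event forces the machine back into a known-safe state.

ALLOWED = {
    ("idle", "authenticate"): "authenticating",
    ("authenticating", "grant"): "secure",
    ("authenticating", "deny"): "idle",
    ("secure", "logout"): "idle",
}

class MonitoredFSM:
    SAFE_STATE = "idle"

    def __init__(self):
        self.state = self.SAFE_STATE

    def step(self, event):
        """Apply an event; reject and reset to the safe state if the
        transition is not in the allowed set."""
        nxt = ALLOWED.get((self.state, event))
        if nxt is None:
            self.state = self.SAFE_STATE
            return False
        self.state = nxt
        return True

fsm = MonitoredFSM()
assert fsm.step("authenticate") and fsm.state == "authenticating"
assert fsm.step("grant") and fsm.state == "secure"
assert not fsm.step("authenticate")  # illegal while "secure"
assert fsm.state == "idle"           # monitor reset to the safe state
```

Because every reachable state is either produced by an allowed transition or is the safe state, the monitor guarantees by construction that the machine never occupies an unintended state, which is the property the layered composition described above relies on.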
The two fundamental problems in number theory upon which the assumptions about the security of the
finite state machine model rest are described, together with a comprehensive survey of the very latest
progress in this area. Having assumed that the two problems will remain computationally intractable in
the foreseeable future, the method is then applied to the formal analysis of some of the components of the
Comprehensive Security System.
A practical implementation of the CSS has been achieved as a demonstration system for a network of IBM
Personal Computers connected via an Ethernet LAN, which fully meets the aims and objectives set out
in Chapter 1. This implementation is described, and finally some comments are made on the possible
future of research into security aspects of distributed systems.
IBM (United Kingdom) Laboratories, Hursley Park, Winchester, U.K.
A study of the design expertise for plants handling hazardous materials
Human Factors Considerations in System Design
Human factors considerations in system design were examined, including human factors in automated command and control, the efficiency of the human-computer interface, and system effectiveness. The following topics are discussed: human factors aspects of control room design; design of interactive systems; human-computer dialogue, interaction tasks and techniques; guidelines on ergonomic aspects of control rooms and highly automated environments; systems engineering for control by humans; conceptual models of information processing; and information display and interaction in real-time environments.
Damage and repair identification in reinforced concrete beams modelled with various damage scenarios using vibration data
This research aims to develop a novel vibration-based damage identification technique that can be applied efficiently to large volumes of real-time data for the detection, classification, localisation, and quantification of potential structural damage.