Security Technologies and Methods for Advanced Cyber Threat Intelligence, Detection and Mitigation
The rapid growth of Internet interconnectivity and the complexity of communication systems have led to a significant growth in cyberattacks globally, often with severe and disastrous consequences. The swift development of more innovative and effective (cyber)security solutions and approaches that can detect, mitigate and prevent these serious consequences is therefore vital. Cybersecurity is gaining momentum and scaling up in many areas. This book builds on the experience of the Cyber-Trust EU project's methods, use cases, technology development, testing and validation, and extends into broader science, leading IT industry markets and applied research with practical cases. It offers new perspectives on advanced (cyber)security innovation (eco)systems, covering several key perspectives. The book provides insights on new security technologies and methods for advanced cyber threat intelligence, detection and mitigation. We cover topics such as cybersecurity and AI, cyber-threat intelligence, digital forensics, moving target defense, intrusion detection systems, post-quantum security, privacy and data protection, security visualization, smart contracts security, software security, blockchain, security architectures, system and data integrity, trust management systems, distributed systems security, dynamic risk management, and privacy and ethics.
Mouldable Solids: Exploring Organisational Grid Strategies to Enhance Mud Architecture
Mud is a material with deep origins in human ecology and vernacular architecture. Despite housing one-third of the world's population, and almost half of the population of developing countries, the application of mud as a building material has diminished over the years, perhaps due to the worldwide adoption of industrialised building materials and practices, as well as the perception of mud as a primitive material. On the contrary, mud is cheap, reusable and sustainable, yet critical challenges remain in its material behaviour and performance. The researcher takes the standpoint that mud architecture is a material practice and explores organisational grids consisting of skin and skeleton to enhance structural performance.
Three areas of interest combine to demonstrate how mud operates as a material in a contemporary context: (1) the natural philosophy of Aristotle and Ibn Sina, to understand the transitional state of matter and force-form relations; (2) Isaac Newton's laws of motion and Hooke's law, to understand force-displacement relationships; (3) information theory, to represent parameters and conditions as information in organisational strategies. While mud is the material of interest, other materials explored include plastic, concrete, clay and adobe, as they are categorised as mouldable solids due to their transitional states. With a careful focus on mud in terms of material, form, motion and force, the research deploys the technical alongside the philosophical to negotiate the capacities of this particular mouldable solid. The hypothesis is that the greater the variance in the skin-and-skeleton grid, the better the resilience and adaptability of a body, due to the complex interconnections between the parts that make up a whole, organising and re-organising to withstand forces.
The dissertation celebrates mud as a reconfigurable architectural material rather than a static and outdated one, offering a multi-approach alternative to contemporary, standardised materials in the current industrialised context.
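The force-displacement relationship invoked above, Hooke's law, can be stated concretely. The following sketch is illustrative only: the stiffness values are assumptions, not figures from the dissertation, and serve merely to show how skin and skeleton elements of different stiffness respond differently to the same load.

```python
# Hooke's law: within the elastic range, displacement is proportional
# to the applied force, x = F / k. The stiffness values below are
# assumed, purely to illustrate the force-displacement relation.
def displacement(force_n, stiffness_n_per_m):
    """Elastic displacement (m) of a member under an axial force (N)."""
    return force_n / stiffness_n_per_m

stiff_member = displacement(1_000, 5_000_000)  # stiffer skeleton element
soft_member = displacement(1_000, 50_000)      # softer skin element

print(f"Stiff member deflects {stiff_member * 1000:.3f} mm")
print(f"Soft member deflects {soft_member * 1000:.1f} mm")
```

Under the same 1 kN load, the softer element deflects a hundred times further, which is the kind of differential behaviour a varied skin-and-skeleton grid would exploit.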
Research and innovation in network and traffic management systems in Europe
Adequate research and innovation (R&I) is paramount for the seamless testing, adoption and integration of network and traffic management systems. This report provides a comprehensive analysis of R&I initiatives in this field in Europe. The assessment follows the methodology developed by the European Commission's Transport Research and Innovation Monitoring and Information System (TRIMIS). The report critically addresses research by thematic area and technology, highlighting recent developments and future needs.
JRC.C.4 - Sustainable Transport
Ultrasensitive detection of Toxocara canis excretory-secretory antigens by a nanobody electrochemical magnetosensor assay.
Human Toxocariasis (HT) is a zoonotic disease caused by the migration of the larval stage of the roundworm Toxocara canis in the human host. Despite being the most cosmopolitan helminthiasis worldwide, its diagnosis is elusive. Currently, the detection of specific immunoglobulins (IgG) against the Toxocara Excretory-Secretory Antigens (TES), combined with clinical and epidemiological criteria, is the only strategy to diagnose HT. Cross-reactivity with other parasites and the inability to distinguish between past and active infections are the main limitations of this approach. Here, we present a sensitive and specific novel strategy to detect and quantify TES, aiming to identify active cases of HT. High specificity is achieved by making use of nanobodies (Nbs), recombinant single variable domain antibodies obtained from camelids that, due to their small molecular size (15 kDa), can recognize hidden epitopes not accessible to conventional antibodies. High sensitivity is attained by the design of an electrochemical magnetosensor with an amperometric readout, with all components of the assay mixed in a single step. Through this strategy, 10-fold higher sensitivity than a conventional sandwich ELISA was achieved. The assay reached limits of detection of 2 and 15 pg/ml in PBST20 0.05% or serum spiked with TES, respectively. These limits of detection are sufficient to detect clinically relevant toxocaral infections. Furthermore, our nanobodies showed no cross-reactivity with antigens from Ascaris lumbricoides or Ascaris suum. This is, to our knowledge, the most sensitive method to detect and quantify TES so far, and it has great potential to significantly improve the diagnosis of HT. Moreover, the characteristics of our electrochemical assay are promising for the development of point-of-care diagnostic systems using nanobodies as a versatile and innovative alternative to antibodies. The next step will be the validation of the assay in clinical and epidemiological contexts.
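For readers unfamiliar with how a limit of detection is derived, the following sketch applies the common 3-sigma criterion (LOD = 3 × SD of blank replicates / calibration slope). The numbers are hypothetical, chosen only to illustrate the calculation; they are not data from this assay.

```python
# Illustrative sketch (not from the paper): estimating a limit of
# detection (LOD) from blank replicates and a linear calibration,
# using the common criterion LOD = 3 * SD_blank / slope.
# All numbers below are hypothetical.
from statistics import stdev

# Hypothetical amperometric currents (nA) of blank replicates
blank_currents = [1.02, 0.98, 1.05, 1.01, 0.99, 1.03]

# Hypothetical calibration slope: signal (nA) per pg/ml of antigen
calibration_slope = 0.06  # nA per (pg/ml), assumed

sd_blank = stdev(blank_currents)        # sample standard deviation
lod = 3 * sd_blank / calibration_slope  # pg/ml

print(f"Blank SD: {sd_blank:.4f} nA")
print(f"Estimated LOD: {lod:.2f} pg/ml")
```

A steeper calibration slope (stronger signal per unit of antigen) or less noisy blanks directly lower the LOD, which is why the one-step magnetosensor design matters for sensitivity.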
Characterization and optimization of network traffic in cortical simulation
Considering the great variety of obstacles that Exascale systems will face in the near future, this thesis pays particular attention to the interconnect and to power consumption.
The data movement challenge involves the whole hierarchical organization of components in HPC systems, i.e. registers, caches, memory and disks. Running scientific applications requires the most effective methods of data transport among the levels of this hierarchy. On current petaflop systems, memory access at all levels is the limiting factor in almost all applications. This drives the requirement for an interconnect that achieves adequate rates of data transfer, or throughput, while reducing time delays, or latency, between the levels.
Power consumption is identified as the largest hardware research challenge: the annual power cost to operate an Exascale system built with current technology would exceed 2.5 B$ per year. Research into alternative power-efficient computing devices is therefore mandatory for the procurement of future HPC systems.
In this thesis, a preliminary approach is offered to the critical process of co-design. Co-design is defined as the simultaneous design of both hardware and software to implement a desired function. This process both integrates all components of the Exascale initiative and illuminates the trade-offs that must be made within this complex undertaking.
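The power-cost figure above can be sanity-checked with a back-of-the-envelope calculation. Assuming an electricity price of $0.10/kWh (an assumption for illustration, not a number from the thesis), a 2.5 B$ annual bill implies a sustained draw of roughly 2.9 GW:

```python
# Back-of-the-envelope check of the Exascale power-cost claim.
# The electricity price is an assumption for illustration only.
HOURS_PER_YEAR = 24 * 365   # 8760
price_per_kwh = 0.10        # USD per kWh, assumed
annual_cost = 2.5e9         # USD per year, figure from the text

energy_kwh = annual_cost / price_per_kwh              # kWh consumed per year
implied_power_gw = energy_kwh / HOURS_PER_YEAR / 1e6  # sustained draw in GW

print(f"Implied sustained power draw: {implied_power_gw:.2f} GW")
```

A gigawatt-scale draw is orders of magnitude beyond what any single facility can reasonably supply, which is why power efficiency, rather than raw performance, dominates Exascale hardware research.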
Optimising the potential of mindfulness programs in schools: Learning from implementation science
There is a growing need for the provision of mental health services for young people in schools. A number of evidence-based practices (EBPs) now exist for schools to choose from to address their pupils' mental health needs. However, when such EBPs are introduced into schools, their effectiveness is often weakened. Implementation science suggests that without effective implementation strategies, the success of EBPs in schools may be limited. The transfer of knowledge into practice is a difficult and challenging process, often referred to as the "science to service gap". To support the mental health of young people, there is a need not just for EBPs but also for evidence-based implementation.
Mindfulness training (MT) is a promising intervention for young people that is currently being introduced to a number of schools across the UK and internationally. The primary aim of this doctoral work was to understand and examine MT implementation experiences in order to identify the determinants of, and potential ways to promote, the early implementation stages of MT in schools. The first study examined how far a knowledge broker, sharing implementation-related knowledge, could influence the implementation decisions made by a steering group (SG) responsible for implementing a mindfulness program across schools in Cumbria, UK. SG meetings were attended for 14 months, and meeting minutes, notes and audio recordings were collected and analysed for "key moments" and "key outcomes". A second, related analysis of this SG activity explored, via interviews and thematic analysis, the perceived opportunities and barriers for the SG to act as an implementation team. Study 3 aimed to identify the determinants of early MT implementation success in five secondary schools using the Consolidated Framework for Implementation Research (CFIR). Interviews were conducted with school staff responsible for implementing MT at two time points across 6 months. The schools' implementation progress was recorded, and the CFIR was used to code the data for 38 implementation constructs; the usefulness of the CFIR was also assessed. Finally, in Study 4 the findings of the previous studies were synthesised with the implementation science literature to inform the development of a preliminary implementation framework that promotes the successful implementation of MT in (secondary) schools and improves the usefulness of such frameworks in these complex settings.
Findings from Studies 1 and 2 suggested that SGs responsible for implementing school public health programs can learn about implementation and then apply this new knowledge to their program. Sharing knowledge with stakeholders responsible for implementing public health programs may therefore be a viable and effective implementation promotion strategy, and a strong engagement strategy and good relationships with schools can facilitate this process. SGs' influence over general school capacity and external funding may be limited, hindering their ability to affect overall implementation. More work is needed to understand how SGs may be empowered to influence general capacity and funding, and to build better linkages to other stakeholders involved in their program's overall provision.
Findings from Study 3 indicated that a number of implementation-related constructs seem to distinguish between schools which implement MT well and schools which do not. The CFIR was a useful tool for identifying the barriers and facilitators to EBPs in schools, and for determining which barriers and facilitators most distinguish implementation success between schools. School leadership plays a pivotal role in ensuring implementation success. Who should be solely responsible for the successful implementation of EBPs in schools is less clear, but a concerted effort on the part of program designers, program funders and school leadership may be required to ensure programs are implemented well. Study 4 indicated that implementation frameworks designed specifically for school leaders are likely to be useful, but what motivates school leaders to use them is less clear. Further research into ways of promoting the use of implementation guidance by school leaders is needed.
Projecting in Space-Time: The Laboratory Method, Modern Architecture and Settlement-Building, 1918-1932.
Between 1918 and 1932, a number of European modern architects described their work as "scientifically managed" or "taylorized", and as "laboratory work" or "practical experiments", all approaches attributable to the principles of organization used in American industry. Scholars would later dismiss these claims as "ideological" or "propagandistic", since many of the architectural works of this period were in fact neither fabricated like industrial products nor did they perform as efficiently. However, relying on recent scholarship on the history of American industrial organization between 1880 and 1918, this dissertation reassesses the claims of these architects, revealing a more nuanced and thorough comprehension of the principles of American industrial organization, particularly scientific management, than has previously been acknowledged. While many modern architects admired the tools, products and spaces of industry, a select group also showed interest in scientific management's central ontological theory, the "laboratory method", which called for the fusion of inquiry and material production within a single space. While the laboratory method is most closely associated with Frederick Taylor, who developed this approach specifically for use in the industrial plant, it was Frank Gilbreth who, by 1918, had translated this theory for use in a different space of production, the construction site. Frank and his partner, Lillian Gilbreth, developed a "multi-sensory" approach to projecting processes in "space-time", one that combined orthographic projection with data mapping and new media, such as photography and film.
Their "visualization theory" offered modern architects assistance with an already defined design problem, namely the projection of architectural artifacts at the scale of the pre-modern urban unit, the village or settlement, with the intricacy of a pre-modern manufactured product, such as a door or window, all while considering the perception of a moving subject. Utilizing the principles of modern management, architects sought to rationalize their own "mental work", the production of drawing sets, as well as to participate in the bureaucratization, or standardization, of material parameters and social conventions occurring at the municipal, national and international scales during this period.
While interest in scientific management among interwar architects was widespread, this dissertation shows that there were few actual examples of the application of these principles to the process of architectural production; the most notable examples were those conducted by Peter Behrens (1918-1920), Le Corbusier (1923-1926), Martin Wagner (1924-1929), Walter Gropius (1926-1929) and Ernst May (1926-1930). In all five cases, the primary goal was the same as it had been for Taylor and Gilbreth: the derivation of novel tentative standard methods, not solely an increase in the efficiency of material production. The application of the laboratory method to settlement-building by these architects was not revolutionary so much as evolutionary, with Hermann Muthesius' notion of typological evolution and adaptation, summarized in Kleinhaus und Kleinsiedlung (1920), as well as a set of projection instruments included in Raymond Unwin's design manual, Town Planning in Practice (1909), providing a crucial foundation for the interwar work. This interwar work was further informed by a series of American experiments in industrialized settlement-building, including the Atterbury, Harms and Small, and Unit Systems.
The laboratory method and visualization theory of scientific management required a particular balance of control and feedback, which proved difficult to achieve in architectural production, helping to explain the relatively few applications of these principles. Expanding conjecture from the atelier onto the construction site, and into use itself, exposed architects to a myriad of problems that they were not entirely equipped to handle. The unique context of Weimar Germany afforded architects like Wagner, Gropius and May a framework that combined the degree of bureaucratization necessary to support experimentation without the "over-bureaucratization" that would define the postwar period. A similar framework of control and feedback afforded a team of architects working in Zagreb, Yugoslavia, between 1957 and 1964 an opportunity to apply the laboratory method to architectural production. This work would in turn attract the attention of an international group of artists and theorists, the New Tendencies movement (1961-1973), who saw in it the architectural equivalent of "programmed art". As the author of one of the most frequently cited books at these conferences, Norbert Wiener explained in 1952 that "the notion of programing" was itself rooted in the "work of Taylor and the Gilbreths on time study" before it was "transferred to the machine". This research serves to show that modern architects had translated the principles of industrial organization well before programming became digitized.
Architecture, Landscape Architecture and Urban Planning
Development of a read mapping analysis software and computational pan genome analysis of 20 Pseudomonas aeruginosa strains
Hilker R. Development of a read mapping analysis software and computational pan genome analysis of 20 Pseudomonas aeruginosa strains. Bielefeld: Bielefeld University; 2015.
In times of multi-resistant pathogenic bacteria, their detailed study is of the utmost importance. Their comparative analysis can even aid the emerging field of personalized medicine by enabling optimized treatment depending on the presence of virulence factors and antibiotic resistances in the infection concerned. The weaknesses and functionality of these pathogenic bacteria can be investigated using modern computer science and novel sequencing technologies; one such method is the bioinformatics evaluation of high-throughput sequencing data.
A pathogenic bacterium posing severe health care issues is the ubiquitous Pseudomonas aeruginosa. It is involved in a wide range of infections, mainly affecting the pulmonary or urinary tract, open wounds and burns. The prevalence of chronic obstructive pulmonary disease cases with P. aeruginosa in Germany alone is ~600,000 per year. Within the framework of this dissertation, computational comparative genomics experiments were conducted with a panel of 20 of the most abundant P. aeruginosa strains. Fifteen of these strains were isolated from clinical cases, while the remaining five were strains without a known infection history, isolated from the environment. This division was chosen to enable a direct comparison of the pathogenic potential of clinical and environmental strains and the identification of their possible characteristic differences.
When designing the bioinformatics experiments and searching for an efficient visualization and automatic analysis platform for read alignment (mapping) data, it became evident that no adequate solution was available that included all required functionalities. On these grounds, the decision was made to define two main subjects for this dissertation.
Besides the P. aeruginosa pan genome analysis, a novel read mapping visualization and analysis software was developed and published in the journal Bioinformatics. This software - ReadXplorer - is partly based on a prototype developed under the name VAMP during a preceding master's thesis at the Center for Biotechnology of Bielefeld University. The software was developed into a comprehensive, user-friendly platform augmented with several newly developed and implemented automatic read mapping analyses, two examples being transcription start site detection and single nucleotide polymorphism (SNP) detection. Moreover, new intuitive visualizations were added to the existing ones, and existing visualizations were greatly enhanced. ReadXplorer is designed to support not only DNA-seq data, as accrued in the P. aeruginosa experiments, but also any kind of standard read mapping data as obtained from RNA-seq or ChIP-seq experiments. The data management was designed to comply with the performance and efficiency needs emerging from large next-generation sequencing data sets. Finally, ReadXplorer was extended to handle eukaryotic read mapping data as well.
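To illustrate the kind of automatic read mapping analysis described above, a minimal SNP-detection rule can be sketched as a coverage-and-frequency threshold over the bases observed at one reference position. This is a generic textbook sketch, not ReadXplorer's actual algorithm; the function name and thresholds are assumptions for illustration.

```python
# Minimal sketch of coverage-based SNP calling at a single position.
# NOT ReadXplorer's algorithm; names and thresholds are illustrative.
from collections import Counter

def call_snp(reference_base, observed_bases,
             min_coverage=10, min_variant_fraction=0.75):
    """Return the variant base if the position qualifies as a SNP,
    else None. observed_bases holds the base from every read
    covering this reference position."""
    coverage = len(observed_bases)
    if coverage < min_coverage:
        return None  # too little evidence at this position
    base, count = Counter(observed_bases).most_common(1)[0]
    if base != reference_base and count / coverage >= min_variant_fraction:
        return base  # dominant base disagrees with the reference
    return None

# A site covered by 12 reads, 11 of which disagree with the reference
print(call_snp("A", "GGGGGGGGGGGA"))  # dominant variant base
print(call_snp("A", "AAAAAAAAAAGA"))  # reference-dominated site
```

Real callers additionally weigh base and mapping qualities, but the core idea, requiring both sufficient coverage and a dominant non-reference base, is the same.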
Amongst other software, ReadXplorer was then used to analyze different comparative genomics aspects of P. aeruginosa and to draw conclusions regarding the development of its pathogenicity. The conducted experiments include phylogeny and gene set determination, analysis of regions of genomic plasticity, and identification of single nucleotide polymorphisms. The achieved results were published in the journal Environmental Biology.