A Holistic Analysis of Internet of Things (IoT) Security: Principles, Practices, and New Perspectives
Modern computing: Vision and challenges
Over the past six decades, the field of computing systems has experienced significant transformations, profoundly impacting society through developments such as the Internet and the commodification of computing. Underpinned by technological advancements, computer systems, far from being static, have continuously evolved and adapted to cover multifaceted societal niches. This has led to new paradigms such as cloud, fog, and edge computing and the Internet of Things (IoT), which offer fresh economic and creative opportunities. Nevertheless, this rapid change poses complex research challenges, especially in maximizing potential and enhancing functionality. To maintain an economical level of performance that meets ever-tighter requirements, one must understand the drivers of new model emergence and expansion, and how contemporary challenges differ from past ones. To that end, this article investigates and assesses the factors influencing the evolution of computing systems, covering established systems and architectures as well as newer developments such as serverless computing, quantum computing, and on-device AI. Tracing the technological trajectory reveals trends including the rapid obsolescence of frameworks due to business and technical constraints, a move towards specialized systems and models, and varying approaches to centralized and decentralized control. This comprehensive review of modern computing systems looks ahead to the future of research in the field, highlighting key challenges and emerging trends and underscoring their importance in cost-effectively driving technological progress.
Adaptive Microarchitectural Optimizations to Improve Performance and Security of Multi-Core Architectures
With the current technological barriers, microarchitectural optimizations are increasingly important for the performance scalability of computing systems. The shift to multi-core architectures increases the demands on the memory system and amplifies the role of microarchitectural optimizations in performance improvement. In a multi-core system, microarchitectural resources such as the cache are usually shared to maximize utilization, but sharing can also lead to contention and lower performance. This can be mitigated through partitioning of shared caches. However, microarchitectural optimizations, long assumed to be fundamentally secure, can be used in side-channel attacks to leak secrets such as cryptographic keys. Timing-based side channels exploit predictable timing variations arising from the interaction with microarchitectural optimizations during program execution. Going forward, there is a strong need to leverage microarchitectural optimizations for performance without compromising security. This thesis contributes three adaptive microarchitectural resource management optimizations that improve the security and/or performance of multi-core architectures, along with a systematization of knowledge of timing-based side-channel attacks. We observe that high-performance cache partitioning in a multi-core system must meet three requirements: i) fine granularity of partitions, ii) locality-aware placement, and iii) frequent changes. These requirements lead to high overheads for current centralized partitioning solutions, especially as the number of cores in the system increases. To address this problem, we present an adaptive and scalable cache partitioning solution (DELTA) using a distributed and asynchronous allocation algorithm. Allocations occur through core-to-core challenges, in which applications with a larger performance benefit gain cache capacity.
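The core-to-core challenge idea can be sketched as a simple allocation step. The function below is a hypothetical illustration (the names and the marginal-utility model are assumptions, not the DELTA implementation): a random pair of cores compares marginal benefits, and a cache way moves to the core that profits more from it.

```python
import random

def challenge_round(ways, marginal_benefit):
    """One asynchronous core-to-core challenge (hypothetical DELTA-like step).

    A random pair of cores compares marginal utilities: if the challenger
    gains more from one extra cache way than the defender loses by giving
    one up, a way moves to the challenger."""
    a, b = random.sample(range(len(ways)), 2)
    gain_a = marginal_benefit[a](ways[a] + 1)   # challenger's gain from one more way
    loss_b = marginal_benefit[b](ways[b])       # defender's loss from giving one up
    if ways[b] > 1 and gain_a > loss_b:
        ways[a] += 1
        ways[b] -= 1
    return ways
```

Repeating such rounds lets the partition converge without any central arbiter, which is what makes the scheme cheap to realize in hardware and scalable to large core counts.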
The solution is implementable in hardware, due to its low computational complexity, and can scale to large core counts. According to our analysis, better performance can be achieved by coordinating multiple optimizations for different resources, e.g., off-chip bandwidth and cache, but this is challenging due to the increased number of possible allocations that must be evaluated. Based on these observations, we present a solution (CBP) for coordinated management of three optimizations: cache partitioning, bandwidth partitioning, and prefetching. Efficient allocations, which account for inter-resource interactions and trade-offs, are achieved using local resource managers to limit the solution space. The continuously growing number of side-channel attacks leveraging microarchitectural optimizations prompts us to review attacks and defenses to understand the vulnerabilities of different microarchitectural optimizations. We identify four root causes of timing-based side-channel attacks: determinism, sharing, access violation, and information flow. Our key insight is that eliminating any of the exploited root causes, in any of the attack steps, is enough to provide protection. Based on our framework, we present a systematization of attacks and defenses on a wide range of microarchitectural optimizations, which highlights their key similarities. Shared caches are an attractive attack surface for side-channel attacks, while defenses need to be efficient since the cache is crucial for performance. To address this issue, we present an adaptive and scalable cache partitioning solution (SCALE) for protection against cache side-channel attacks. The solution leverages randomness and provides quantifiable, information-theoretic security guarantees using differential privacy. It closes the performance gap to a state-of-the-art non-secure allocation policy for a mix of secure and non-secure applications.
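As a rough illustration of how differential privacy can yield quantifiable guarantees for a randomized allocation policy, the sketch below perturbs measured per-core cache demands with Laplace noise before splitting the cache. It is a minimal analogy under assumed names and parameters, not the SCALE mechanism.

```python
import math
import random

def laplace_noise(scale):
    """Draw a Laplace(0, scale) sample via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def noisy_partition(demands, total_ways, epsilon, sensitivity=1.0):
    """Perturb measured per-core cache demands with Laplace noise of scale
    sensitivity/epsilon before splitting the cache, so the observable
    partition sizes reveal little about any single core's true demand.
    The proportional rounding is deliberately kept simple and approximate."""
    noisy = [max(0.0, d + laplace_noise(sensitivity / epsilon)) for d in demands]
    total = sum(noisy) or 1.0
    return [max(1, round(total_ways * d / total)) for d in noisy]
```

Smaller epsilon means more noise and stronger privacy but a larger deviation from the performance-optimal split, mirroring the security/performance trade-off discussed above.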
Behavior quantification as the missing link between fields: Tools for digital psychiatry and their role in the future of neurobiology
The great behavioral heterogeneity observed between individuals with the same
psychiatric disorder and even within one individual over time complicates both
clinical practice and biomedical research. However, modern technologies are an
exciting opportunity to improve behavioral characterization. Existing
psychiatry methods that are qualitative or unscalable, such as patient surveys
or clinical interviews, can now be collected at a greater capacity and analyzed
to produce new quantitative measures. Furthermore, recent capabilities for
continuous collection of passive sensor streams, such as phone GPS or
smartwatch accelerometer, open avenues of novel questioning that were
previously entirely unrealistic. Their temporally dense nature enables a
cohesive study of real-time neural and behavioral signals.
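As a concrete example of turning a passive sensor stream into a quantitative behavioral measure, the sketch below computes total daily travel distance from phone GPS fixes. The function names are hypothetical and the feature choice is purely illustrative of the kind of quantification described here.

```python
import math

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))  # Earth radius ~6371 km

def daily_distance_km(track):
    """Total distance traveled over an ordered list of GPS fixes for one day --
    a simple mobility feature sometimes used in digital phenotyping."""
    return sum(haversine_km(a, b) for a, b in zip(track, track[1:]))
```

Aggregating such features day by day gives the temporally dense behavioral signal that can then be aligned with neural or clinical measures.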
To develop comprehensive neurobiological models of psychiatric disease, it
will be critical to first develop strong methods for behavioral quantification.
There is huge potential in what can theoretically be captured by current
technologies, but this in itself presents a large computational challenge --
one that will necessitate new data processing tools, new machine learning
techniques, and ultimately a shift in how interdisciplinary work is conducted.
In my thesis, I detail research projects that take different perspectives on
digital psychiatry, subsequently tying ideas together with a concluding
discussion on the future of the field. I also provide software infrastructure
where relevant, with extensive documentation.
Major contributions include scientific arguments and proof of concept results
for daily free-form audio journals as an underappreciated psychiatry research
datatype, as well as novel stability theorems and pilot empirical success for a
proposed multi-area recurrent neural network architecture.
Synchronization of data in heterogeneous decentralized systems
Data synchronization is the problem of reconciling the differences between large data stores that differ in a small number of records. It is a common thread among disparate distributed systems ranging from fleets of Internet of Things (IoT) devices to clusters of distributed databases in the cloud. Most recently, data synchronization has arisen in globally distributed public blockchains that form the basis of the envisioned decentralized Internet of the future. Moreover, the parallel development of edge computing has significantly increased the heterogeneity of networks and computing devices. The merger of highly heterogeneous system resources and the decentralized nature of future Internet applications calls for a new approach to data synchronization. In this dissertation, we look at the problem of data synchronization through the prism of set reconciliation and introduce novel tools and protocols that improve the performance of data synchronization in heterogeneous decentralized systems.
First, we compare the analytical properties of the state-of-the-art set reconciliation protocols, and investigate the impact of theoretical assumptions and implementation decisions on synchronization performance. Second, we introduce GenSync, the first unified set reconciliation middleware. Using GenSync's distinctive benchmarking layer, we find that the best protocol choice is highly sensitive to the system conditions, and a bad protocol choice causes a severe hit in performance. We showcase the evaluative power of GenSync in one of the world's largest wireless network emulators, and demonstrate choosing the best GenSync protocol under high and low user mobility in an emulated cellular network. Finally, we introduce SREP (Set Reconciliation-Enhanced Propagation), a novel blockchain transaction pool synchronization protocol with quantifiable guarantees. Through simulations, we show that SREP incurs significantly smaller bandwidth overhead than a similar approach from the literature, especially in networks of realistic size (tens of thousands of participants).
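The set-reconciliation view can be illustrated with a minimal invertible-Bloom-lookup-table sketch, in which each party sends a fixed-size table, so communication is proportional to the table size (sized for the expected difference) rather than to the sets themselves. This is a simplified teaching example, not one of the protocols benchmarked in GenSync; the scheme and all names are illustrative.

```python
import hashlib

M = 256  # number of table cells; each side transmits exactly M cells

def _cells(x, k=3):
    """k (deduplicated) cell indices for item x -- an illustrative hash scheme."""
    digest = hashlib.sha256(str(x).encode()).digest()
    return {int.from_bytes(digest[4 * i:4 * i + 4], "big") % M for i in range(k)}

def encode(items):
    """Invertible Bloom lookup table: each cell keeps a count and an XOR of items."""
    table = [[0, 0] for _ in range(M)]
    for x in items:
        for i in _cells(x):
            table[i][0] += 1
            table[i][1] ^= x
    return table

def subtract(ta, tb):
    """Cell-wise difference of two tables encodes the symmetric set difference."""
    return [[ca - cb, xa ^ xb] for (ca, xa), (cb, xb) in zip(ta, tb)]

def decode(table):
    """Peel 'pure' cells (count of +/-1) to recover A minus B and B minus A."""
    only_a, only_b = set(), set()
    progress = True
    while progress:
        progress = False
        for count, xor in table:
            if count in (1, -1) and xor != 0:
                item, sign = xor, count
                (only_a if sign == 1 else only_b).add(item)
                for i in _cells(item):   # remove the item from all its cells
                    table[i][0] -= sign
                    table[i][1] ^= item
                progress = True
    return only_a, only_b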
Verifiable Distributed Aggregation Functions
The modern Internet is built on systems that incentivize collection of information about users. In order to minimize privacy loss, it is desirable to prevent these systems from collecting more information than is required for the application. The promise of multi-party computation is that data can be aggregated without revealing individual measurements to the data collector. This work offers a provable security treatment for Verifiable Distributed Aggregation Functions (VDAFs) , a class of multi-party computation protocols being considered for standardization by the IETF.
We propose a formal framework for the analysis of VDAFs and apply it to two constructions. The first is Prio3, one of the candidates for standardization. This VDAF is based on the Prio system of Corrigan-Gibbs and Boneh (NSDI 2017). We prove that Prio3 achieves our security goals with only minor changes to the draft. The second construction, called Doplar, is introduced by this paper. Doplar is a round-reduced variant of the Poplar system of Boneh et al. (IEEE S&P 2021), itself a candidate for standardization. The cost of this improvement is a modest increase in overall bandwidth and computation
Application of knowledge management principles to support maintenance strategies in healthcare organisations
Healthcare is a vital service that touches people's lives on a daily basis by providing treatment and
resolving patients' health problems through the staff. Human lives are ultimately dependent on the skilled
hands of the staff and those who manage the infrastructure that supports the daily operations of the
service, making it a compelling reason for a dedicated research study. However, the UK healthcare sector
is undergoing rapid changes, driven by rising costs, technological advancements, changing patient
expectations, and increasing pressure to deliver sustainable healthcare. With the global rise in healthcare
challenges, the need for sustainable healthcare delivery has become imperative. Sustainable healthcare
delivery requires the integration of various practices that enhance the efficiency and effectiveness of
healthcare infrastructural assets. One critical area that requires attention is the management of
healthcare facilities.
Healthcare facilitiesis considered one of the core elements in the delivery of effective healthcare services,
as shortcomings in the provision of facilities management (FM) services in hospitals may have much more
drastic negative effects than in any other general forms of buildings. An essential element in healthcare
FM is linked to the relationship between action and knowledge. With a full sense of understanding of
infrastructural assets, it is possible to improve, manage and make buildings suitable to the needs of users
and to ensure the functionality of the structure and processes.
The premise of FM is that an organisation's effectiveness and efficiency are linked to the physical
environment in which it operates and that improving the environment can result in direct benefits in
operational performance. The goal of healthcare FM is to support the achievement of organisational
mission and goals by designing and managing space and infrastructural assets in the best combination of
suitability, efficiency, and cost. In operational terms, performance refers to how well a building
contributes to fulfilling its intended functions.
Therefore, comprehensive deployment of efficient FM approaches is essential for ensuring quality
healthcare provision while positively impacting overall patient experiences. In this regard, incorporating
knowledge management (KM) principles into hospitals' FM processes contributes significantly to ensuring
sustainable healthcare provision and enhancement of patient experiences. Organisations implementing
KM principles are better positioned to navigate the constantly evolving business ecosystem easily.
Furthermore, KM is vital in processes and service improvement, strategic decision-making, and
organisational adaptation and renewal.
In this regard, KM principles can be applied to improve hospital FM, thereby ensuring sustainable
healthcare delivery. Knowledge management assumes that organisations that manage their
organisational and individual knowledge more effectively will be able to cope more successfully with the challenges of the new business ecosystem. There is also the argument that KM plays a crucial role in
improving processes and services, strategic decision-making, and adapting and renewing an organisation.
The goal of KM is to aid action – providing "a knowledge pull" rather than the information overload most
people experience in healthcare FM. Other motivations for seeking better KM in healthcare FM include
patient safety, evidence-based care, and cost efficiency as the dominant drivers. The most evidence exists
for the success of such approaches at knowledge bottlenecks, such as infection prevention and control,
working safely, compliances, automated systems and reminders, and recall based on best practices. The
ability to cultivate, nurture and maximise knowledge at multiple levels and in multiple contexts is one of
the most significant challenges for those responsible for KM. However, despite the potential benefits,
applying KM principles in hospital facilities is still limited. There is a lack of understanding of how KM can
be effectively applied in this context, and few studies have explored the potential challenges and
opportunities associated with implementing KM principles in hospitals facilities for sustainable healthcare
delivery.
This study explores applying KM principles to support maintenance strategies in healthcare organisations.
The study also explores the challenges and opportunities, for healthcare organisations and FM
practitioners, in operationalising a framework which draws the interconnectedness between healthcare.
The study begins by defining healthcare FM and its importance in the healthcare industry. It then discusses
the concept of KM and the different types of knowledge that are relevant in the healthcare FM sector.
The study also examines the challenges that healthcare FM face in managing knowledge and how the
application of KM principles can help to overcome these challenges. The study then explores the different
KM strategies that can be applied in healthcare FM. The KM benefits include improved patient outcomes,
reduced costs, increased efficiency, and enhanced collaboration among healthcare professionals.
Additionally, issues like creating a culture of innovation, technology, and benchmarking are considered.
In addition, a framework that integrates the essential concepts of KM in healthcare FM will be presented
and discussed.
The field of KM is introduced as a complex adaptive system with numerous possibilities and challenges.
In this context, and in consideration of healthcare FM, five objectives have been formulated to achieve
the research aim. As part of the research, a number of objectives will be evaluated, including appraising
the concept of KM and how knowledge is created, stored, transferred, and utilised in healthcare FM,
evaluating the impact of organisational structure on job satisfaction as well as exploring how cultural
differences impact knowledge sharing and performance in healthcare FM organisations.
This study uses a combination of qualitative methods, such as meetings, observations, document analysis
(internal and external), and semi-structured interviews, to discover the subjective experiences of
healthcare FM employees and to understand the phenomenon within a real-world context and attitudes of healthcare FM as the data collection method, using open questions to allow probing where appropriate
and facilitating KM development in the delivery and practice of healthcare FM.
The study describes the research methodology using the theoretical concept of the "research onion". The
qualitative research was conducted in the NHS acute and non-acute hospitals in Northwest England.
Findings from the research study revealed that while the concept of KM has grown significantly in recent
years, KM in healthcare FM has received little or no attention. The target population was fifty (five FM
directors, five academics, five industry experts, ten managers, ten supervisors, five team leaders and ten
operatives). These seven groups were purposively selected as the target population because they play a
crucial role in KM enhancement in healthcare FM. Face-to-face interviews were conducted with all
participants based on their pre-determined availability. Out of the 50-target population, only 25 were
successfully interviewed to the point of saturation. Data collected from the interview were coded and
analysed using NVivo to identify themes and patterns related to KM in healthcare FM.
The study is divided into eight major sections. First, it discusses literature findings regarding healthcare
FM and KM, including underlying trends in FM, KM in general, and KM in healthcare FM. Second, the
research establishes the study's methodology, introducing the five research objectives, questions and
hypothesis. The chapter introduces the literature on methodology elements, including philosophical views
and inquiry strategies. The interview and data analysis look at the feedback from the interviews. Lastly, a
conclusion and recommendation summarise the research objectives and suggest further research.
Overall, this study highlights the importance of KM in healthcare FM and provides insights for healthcare
FM directors, managers, supervisors, academia, researchers and operatives on effectively leveraging
knowledge to improve patient care and organisational effectiveness
Securely extending and running low-code applications with C#
Low-code development platforms provide an accessible infrastructure for the
creation of software by domain experts, also called "citizen developers",
without the need for formal programming education. Development is facilitated
through graphical user interfaces, although traditional programming can still
be used to extend low-code applications, for example when external services or
complex business logic needs to be implemented that cannot be realized with the
features available on a platform. Since citizen developers are usually not
specifically trained in software development, they require additional support
when writing code, particularly with regard to security and advanced techniques
like debugging or versioning. In this thesis, several options to assist
developers of low-code applications are investigated and implemented. A
framework to quickly build code editor extensions is developed, and an approach
to leverage the Roslyn compiler platform to implement custom static code
analysis rules for low-code development platforms using the .NET platform is
demonstrated. Furthermore, a sample application showing how Roslyn can be used
to build a simple, integrated debugging tool, as well as an abstraction of the
version control system Git for easier usage by citizen developers, is
implemented. Security is a critical aspect when low-code applications are
deployed. To provide an overview over possible options to ensure the secure and
isolated execution of low-code applications, a threat model is developed and
used as the basis for a comparison between OS-level virtualization, sandboxing,
and runtime code security implementations
Specificity of the innate immune responses to different classes of non-tuberculous mycobacteria
Mycobacterium avium is the most common nontuberculous mycobacterium (NTM) species causing infectious disease. Here, we characterized a M. avium infection model in zebrafish larvae, and compared it to M. marinum infection, a model of tuberculosis. M. avium bacteria are efficiently phagocytosed and frequently induce granuloma-like structures in zebrafish larvae. Although macrophages can respond to both mycobacterial infections, their migration speed is faster in infections caused by M. marinum. Tlr2 is conservatively involved in most aspects of the defense against both mycobacterial infections. However, Tlr2 has a function in the migration speed of macrophages and neutrophils to infection sites with M. marinum that is not observed with M. avium. Using RNAseq analysis, we found a distinct transcriptome response in cytokine-cytokine receptor interaction for M. avium and M. marinum infection. In addition, we found differences in gene expression in metabolic pathways, phagosome formation, matrix remodeling, and apoptosis in response to these mycobacterial infections. In conclusion, we characterized a new M. avium infection model in zebrafish that can be further used in studying pathological mechanisms for NTM-caused diseases
- …