Towards A Practical High-Assurance Systems Programming Language
Writing correct and performant low-level systems code is a notoriously demanding job, even for experienced developers. To make matters worse, formally reasoning about its correctness properties introduces yet another level of complexity, requiring considerable expertise in both systems programming and formal verification. Without appropriate tools that provide abstraction and automation, development can be extremely costly due to the sheer complexity of these systems and the nuances within them.
Cogent is designed to alleviate the burden on developers when writing and verifying systems code. It is a high-level functional language with a certifying compiler, which automatically proves the correctness of the compiled code and also provides a purely functional abstraction of the low-level program to the developer. Equational reasoning techniques can then be used to prove functional correctness properties of the program on top of this abstract semantics, which is notably less laborious than directly verifying the C code.
To make Cogent a more approachable and effective tool for developing real-world systems, we further strengthen the framework by extending the core language and its ecosystem. Specifically, we enrich the language to allow users to control the memory representation of algebraic data types, while retaining the automatic proof with a data layout refinement calculus. We repurpose existing tools in a novel way and develop an intuitive foreign function interface, which provides users with a seamless experience when using Cogent in conjunction with native C. We augment the Cogent ecosystem with a property-based testing framework, which helps developers better understand the impact formal verification has on their programs and enables a progressive approach to producing high-assurance systems. Finally, we explore refinement type systems, which we plan to incorporate into Cogent for greater expressiveness and to better integrate systems programmers into the verification process.
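The property-based testing approach described above can be illustrated with a minimal sketch (in Python for brevity; the function names and the toy summation example are hypothetical, not Cogent code): a purely functional specification is checked against a low-level, imperative implementation on randomly generated inputs.

```python
# Hypothetical sketch of property-based testing against a functional
# specification. The names spec_sum/impl_sum are illustrative only.
import random

def spec_sum(xs):
    # Purely functional specification of the behaviour.
    return sum(xs)

def impl_sum(xs):
    # Imperative implementation with an explicit accumulator,
    # standing in for compiled low-level code.
    acc = 0
    for x in xs:
        acc += x
    return acc

def check_refinement(trials=1000):
    # Property: on random inputs, the implementation agrees with
    # (refines) the specification.
    for _ in range(trials):
        n = random.randint(0, 50)
        xs = [random.randint(-100, 100) for _ in range(n)]
        assert impl_sum(xs) == spec_sum(xs)
    return True
```

Such tests give developers early, inexpensive evidence that the implementation tracks the abstract semantics, before (or instead of) a full refinement proof.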
Evaluation Methodologies in Software Protection Research
Man-at-the-end (MATE) attackers have full control over the system on which
the attacked software runs, and try to break the confidentiality or integrity
of assets embedded in the software. Both companies and malware authors want to
prevent such attacks. This has driven an arms race between attackers and
defenders, resulting in a plethora of different protection and analysis
methods. However, it remains difficult to measure the strength of protections
because MATE attackers can reach their goals in many different ways and a
universally accepted evaluation methodology does not exist. This survey
systematically reviews the evaluation methodologies of papers on obfuscation, a
major class of protections against MATE attacks. For 572 papers, we collected
113 aspects of their evaluation methodologies, ranging from sample set types
and sizes, over sample treatment, to performed measurements. We provide
detailed insights into how the academic state of the art evaluates both the
protections and analyses thereon. In summary, there is a clear need for better
evaluation methodologies. We identify nine challenges for software protection
evaluations, which represent threats to the validity, reproducibility, and
interpretation of research results in the context of MATE attacks.
Beam scanning by liquid-crystal biasing in a modified SIW structure
A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for application in 2D scanning with lateral alignment. The 2D array environment imposes full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing: the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW), modified to work as a Groove Gap Waveguide with radiating slots etched on the upper broad wall, that radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed for tuning the LCs. At the same time, the RF field remains laterally confined, enabling the possibility to lay several antennas in parallel and achieve 2D beam scanning. The design is validated by simulation employing the actual properties of a commercial LC medium.
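The steering mechanism can be summarised by the standard leaky-wave relation from textbook LWA theory (not a formula taken from this paper): the main-beam angle from broadside satisfies

```latex
\sin\theta \;\approx\; \frac{\beta\big(\varepsilon_r(V)\big)}{k_0},
```

where $k_0$ is the free-space wavenumber, $\beta$ the phase constant of the leaky mode in the LC-filled guide, and $\varepsilon_r(V)$ the LC permittivity under bias voltage $V$. Varying the bias reorients the LC molecules, shifting $\varepsilon_r$ and hence $\beta$, which steers $\theta$ at a fixed operating frequency.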
An empirical investigation of the relationship between integration, dynamic capabilities and performance in supply chains
This research aimed to develop an empirical understanding of the relationships between integration,
dynamic capabilities and performance in the supply chain domain, based on which, two conceptual
frameworks were constructed to advance the field. The core motivation for the research was that, at
the stage of writing the thesis, the combined relationship between the three concepts had not yet
been examined, although their interrelationships had been studied individually.
To achieve this aim, deductive and inductive reasoning logics were utilised to guide the qualitative
study, which was undertaken via multiple case studies to investigate lines of enquiry that would
address the research questions formulated. This is consistent with the author’s philosophical
adoption of the ontology of relativism and the epistemology of constructionism, which was considered
appropriate to address the research questions. Empirical data and evidence were collected, and
various triangulation techniques were employed to ensure their credibility. Some key features of
grounded theory coding techniques were drawn upon for data coding and analysis, generating two
levels of findings. These revealed that whilst integration and dynamic capabilities were crucial in
improving performance, performance in turn informed them both, reflecting a cyclical and
iterative relationship rather than a purely linear one. Adopting a holistic approach towards
the relationship was key in producing complementary strategies that can deliver sustainable supply
chain performance.
The research makes theoretical, methodological and practical contributions to the field of supply
chain management. The theoretical contribution includes the development of two emerging
conceptual frameworks at the micro and macro levels. The former provides greater specificity, as it
allows meta-analytic evaluation of the three concepts and their dimensions, providing a detailed
insight into their correlations. The latter gives a holistic view of their relationships and how they are
connected, reflecting a middle-range theory that bridges theory and practice. The methodological
contribution lies in presenting models that address gaps associated with the inconsistent use of
terminologies in philosophical assumptions, and lack of rigor in deploying case study research
methods. In terms of its practical contribution, this research offers insights that practitioners could
adopt to enhance their performance, using targeted integrative strategies and drawing on their
dynamic capabilities without necessarily having to forgo other desired outcomes.
A model of actors and grey failures
Existing models for the analysis of concurrent processes tend to focus on
fail-stop failures, where processes are either working or permanently stopped,
and their state (working/stopped) is known. In fact, systems are often affected
by grey failures: failures that are latent, possibly transient, and may affect
the system in subtle ways that later lead to major issues (such as crashes,
limited availability, overload). We introduce a model of actor-based systems
with grey failures, based on two interlinked layers: an actor model, given as
an asynchronous process calculus with discrete time, and a failure model that
represents failure patterns to inject in the system. Our failure model captures
not only fail-stop node and link failures, but also grey failures (e.g.,
partial, transient). We give a behavioural equivalence relation based on weak
barbed bisimulation to compare systems on the basis of their ability to recover
from failures, and on this basis we define some desirable properties of
reliable systems. By doing so, we reduce the problem of checking reliability
properties of systems to the problem of checking bisimulation.
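The two-layer idea, an actor layer evolving in discrete time plus a separate failure model injecting failure patterns, can be sketched in miniature (a toy Python rendering under assumed names, not the paper's process calculus):

```python
# Toy two-layer model: actors with discrete time steps, and a failure
# pattern (a predicate on ticks) injected into the link between them.
# All names here are illustrative, not from the paper's calculus.
from collections import deque

class Actor:
    """An actor with a mailbox; consumes at most one message per step."""
    def __init__(self):
        self.inbox = deque()
        self.received = []

    def step(self):
        if self.inbox:
            self.received.append(self.inbox.popleft())

def run(link_up, ticks=8):
    """link_up(t) -> True if the sender->receiver link works at time t.
    A sender emits one 'ping' per tick; returns what the receiver saw."""
    receiver = Actor()
    for t in range(ticks):
        if link_up(t):                      # failure-injection point
            receiver.inbox.append(("ping", t))
        receiver.step()
    return receiver.received

# Fail-stop pattern: the link dies permanently at t = 3.
fail_stop = lambda t: t < 3
# Grey-failure pattern: transient outage at t in {3, 4}, then recovery.
grey = lambda t: t not in (3, 4)
```

Comparing the traces of `run(fail_stop)` and `run(grey)` mirrors, very loosely, what the behavioural equivalence does: under the grey pattern the system resumes delivering messages after the outage, while under fail-stop it never recovers.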
Exploring the Training Factors that Influence the Role of Teaching Assistants to Teach Students With SEND in a Mainstream Classroom in England
With the implementation of inclusive education having become increasingly valued over the years, the training of Teaching Assistants (TAs) is now more important than ever, given that they work alongside pupils with special educational needs and disabilities (hereinafter SEND) in mainstream education classrooms. The current study explored the training factors that influence the role of TAs when it comes to teaching SEND students in mainstream classrooms in England during their one-year training period. This work aimed to increase understanding of how the training of TAs is seen to influence the development of their personal knowledge and professional skills. The study has significance for our comprehension of the connection between the TAs’ training and the quality of education in the classroom. In addition, this work investigated whether there existed a correlation between the teaching experience of TAs and their background information, such as their gender, age, grade level taught, years of teaching experience, and qualification level.
A critical realist theoretical approach was adopted for this two-phased study, which drew on adaptive theory and grounded theory in its two phases respectively. The multi-method project featured 13 case studies, each of which involved a trainee TA, his/her college tutor, and the classroom teacher who was supervising the trainee TA. The analysis was based on using semi-structured interviews, various questionnaires, and non-participant observation methods for each of these case studies during the TA’s one-year training period. The primary analysis of the research was completed by comparing the various kinds of data collected from the participants in the first and second data collection stages of each case. Further analysis involved cross-case analysis using a grounded theory approach, which made it possible to draw conclusions and put forth several core propositions. Compared with previous research, the findings of the current study reveal many implications for the training and deployment conditions of TAs, while they also challenge the prevailing approaches in many aspects, in addition to offering more diversified, enriched, and comprehensive explanations of the critical pedagogical issues.