Operational Research: Methods and Applications
This is the final version, available on open access from Taylor & Francis via the DOI in this record. Throughout its history, Operational Research has evolved to include methods, models and algorithms that have been applied to a wide range of contexts. This encyclopedic article consists of two main sections: methods and applications. The first summarises up-to-date knowledge and provides an overview of the state-of-the-art methods and key developments in the various subdomains of the field. The second offers a wide-ranging list of areas where Operational Research has been applied. The article is meant to be read in a nonlinear fashion and used as a point of reference by a diverse pool of readers: academics, researchers, students, and practitioners. The entries within the methods and applications sections are presented in alphabetical order. The authors dedicate this paper to the victims of the 2023 Turkey/Syria earthquake. We sincerely hope that advances in OR will play a role in minimising the pain and suffering caused by this and future catastrophes.
Constructive approaches to Program Induction
Search is a key technique in artificial intelligence, machine learning and Program Induction. No
matter how efficient a search procedure is, some spaces are too large to search effectively,
and the space of programs is among them. In this dissertation we show that in the context
of logic-program induction (Inductive Logic Programming, or ILP) it is not necessary to search
for a correct program, because if one exists, there also exists a unique object that is the most
general correct program, which can be constructed directly, without search, in polynomial
time and from a polynomial number of examples. The existence of this unique object, which we
term the Top Program because of its maximal generality, does not so much solve the problem
of searching a large program search space, as it completely sidesteps it, thus improving the
efficiency of the learning task by orders of magnitude commensurate with the complexity of a
program space search.
The existence of a unique Top Program, and the ability to construct it with finite resources,
relies on imposing a strong, task-relevant inductive bias on the language of hypotheses from
which programs are constructed. In common practice in machine learning, Program Induction
and ILP, such inductive bias is selected or created manually by the human user of a learning
system, guided by intuition or knowledge of the problem domain, and expressed as various
kinds of program templates. In this dissertation we show
that by abandoning the reliance on such extra-logical devices as program templates, and instead
defining inductive bias exclusively as First- and Higher-Order Logic formulae, it is possible to
learn inductive bias itself from examples, automatically, and efficiently, by Higher-Order Top
Program construction.
In Chapter 4 we describe the Top Program in the context of the Meta-Interpretive Learning
approach to ILP (MIL) and describe an algorithm for its construction, the Top Program
Construction algorithm (TPC). We prove the efficiency and accuracy of TPC and describe
its implementation in a new MIL system called Louise. We support theoretical results with
experiments comparing Louise to the state-of-the-art, search-based MIL system, Metagol, and
find that Louise improves on Metagol’s efficiency and accuracy. In Chapter 5 we re-frame MIL as
specialisation of metarules, Second-Order clauses used as inductive bias in MIL, and prove that
problem-specific metarules can be derived by specialisation of maximally general metarules, by
MIL. We describe a sub-system of Louise, called TOIL, that learns new metarules by MIL and
demonstrate empirically that the metarules learned by TOIL match those selected manually,
while maintaining the accuracy and efficiency of learning.
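The core claim above, that all correct clauses can be collected directly rather than searched for, can be illustrated with a toy sketch. The following Python fragment is not from the dissertation; the single "chain" metarule P(X,Y) :- Q(X,Z), R(Z,Y), the background predicates and the examples are all invented for illustration. It keeps every metarule instance that covers at least one positive example (the generalise step) and discards any instance that covers a negative example (the specialise step); the surviving instances form the Top Program for this toy problem.

```python
# Toy sketch of Top Program Construction over one "chain" metarule:
#   P(X,Y) :- Q(X,Z), R(Z,Y)
# All facts, examples and predicate names below are invented for illustration.
from itertools import product

background = {
    "parent": {("alice", "bob"), ("bob", "carol"), ("carol", "dan")},
    "friend": {("alice", "dan")},
}

pos = {("alice", "carol"), ("bob", "dan")}   # positive examples of the target
neg = {("alice", "dan")}                     # negative examples of the target

def chain_cover(q, r):
    """Pairs (X, Y) derivable from the clause target(X,Y) :- q(X,Z), r(Z,Y)."""
    return {(x, y)
            for (x, z1) in background[q]
            for (z2, y) in background[r]
            if z1 == z2}

# Generalise: keep every metarule instance covering at least one positive.
# Specialise: drop any instance that also covers a negative example.
top_program = []
for q, r in product(background, repeat=2):
    covered = chain_cover(q, r)
    if covered & pos and not (covered & neg):
        top_program.append((q, r))

print(top_program)  # -> [('parent', 'parent')]
```

Note that no search takes place: every candidate clause is tested exactly once against the examples, so the cost is polynomial in the number of metarule instances and examples, which is the efficiency argument the abstract makes.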
Protecting Systems From Exploits Using Language-Theoretic Security
Any computer program processing input from the user or network must validate that input. Input-handling vulnerabilities occur when the software component responsible for filtering malicious input, the parser, does not perform validation adequately. Consequently, parsers are among the most targeted components, since they defend the rest of the program from malicious input. This thesis adopts the Language-Theoretic Security (LangSec) principle to understand what tools and research are needed to prevent exploits that target parsers. LangSec proposes specifying the syntactic structure of the input format as a formal grammar; a recognizer built for this grammar then validates any input before the rest of the program acts on it. To ensure that these recognizers faithfully represent the data format, programmers often rely on parser generators or parser-combinator libraries to build the parsers. This thesis propels several sub-fields in LangSec by proposing new techniques to find bugs in implementations, novel categorizations of vulnerabilities, and new parsing algorithms and tools to handle practical data formats. To this end, the thesis comprises five parts that tackle various tenets of LangSec. First, I categorize input-handling vulnerabilities and exploits using two frameworks: the mismorphisms framework, which helps us reason about the root causes leading to various vulnerabilities, and a categorization framework built from LangSec anti-patterns such as parser differentials and insufficient input validation. To demonstrate both frameworks, we built a catalog of more than 30 popular vulnerabilities. Second, I built parsers for various Internet of Things and power-grid network protocols and for the iccMAX file format using parser-combinator libraries.
The parsers I built for power-grid protocols were deployed and tested on power-grid substation networks as an intrusion-detection tool. The parser I built for the iccMAX file format led to several corrections and modifications to the iccMAX specification and reference implementation. Third, I present SPARTA, a novel tool I built that generates Rust code to type-check Portable Document Format (PDF) files. The type checker I helped build strictly enforces the constraints in the PDF specification to find deviations. Our checker has contributed to at least four significant clarifications and corrections to the PDF 2.0 specification and to various open-source PDF tools. In addition to the checker, we built a practical tool, PDFFixer, to dynamically patch type errors in PDF files. Fourth, I present ParseSmith, a tool to build verified parsers for real-world data formats. Most parsing tools available for data formats are insufficient to handle practical formats or have not been verified for correctness. I built a verified parsing tool in Dafny that draws on ideas from attribute grammars, data-dependent grammars, and parsing expression grammars to tackle constructs commonly seen in network formats. I prove that these parsers run in linear time and always terminate for well-formed grammars. Finally, I provide the earliest systematic comparison of various data description languages (DDLs) and their parser-generation tools. DDLs are used to describe and parse commonly used data formats, such as image formats. I conducted an expert-elicitation qualitative study to derive metrics for comparing the DDLs, and I systematically compare them on the sample data descriptions shipped with each DDL, checking for correctness and resilience.
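The LangSec recipe described above, specify the input format as a grammar and recognize it in full before the program acts on the input, can be sketched with a few hand-rolled parser combinators. The Python fragment below is illustrative only: the combinator names and the toy key=value grammar are invented, and it is not one of the parsers built in the thesis.

```python
# Minimal parser-combinator sketch of the LangSec "recognize before you
# process" pattern. A parser is a function (string, index) -> (index, value)
# on success, or None on failure. Names and grammar are invented.

def char(pred):
    """Match a single character satisfying pred."""
    def parse(s, i):
        return (i + 1, s[i]) if i < len(s) and pred(s[i]) else None
    return parse

def many1(p):
    """Match p one or more times, collecting the characters."""
    def parse(s, i):
        out, r = [], p(s, i)
        while r is not None:
            i, v = r
            out.append(v)
            r = p(s, i)
        return (i, "".join(out)) if out else None
    return parse

def seq(*ps):
    """Match each parser in order; fail if any fails."""
    def parse(s, i):
        vals = []
        for p in ps:
            r = p(s, i)
            if r is None:
                return None
            i, v = r
            vals.append(v)
        return (i, vals)
    return parse

# Toy grammar:  pair := letter+ '=' digit+
ident  = many1(char(str.isalpha))
digits = many1(char(str.isdigit))
pair   = seq(ident, char(lambda c: c == "="), digits)

def recognize(s):
    """Accept only if the *entire* input matches the grammar."""
    r = pair(s, 0)
    return r is not None and r[0] == len(s)

print(recognize("port=8080"))    # True: well-formed input
print(recognize("port=80; rm"))  # False: rejected before any processing
```

The key LangSec discipline is the final check that the recognizer consumed the whole input: trailing unparsed bytes are rejected rather than silently ignored, which is one of the anti-patterns (insufficient input validation) the thesis categorizes.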
Contelog: A Formal Declarative Framework for Contextual Knowledge Representation and Reasoning
Context-awareness is at the core of providing timely adaptations in safety-critical, secure applications of pervasive computing and Artificial Intelligence (AI). In current AI and context-aware application frameworks, the distinction between knowledge and context is blurred, and the two are not formally integrated. As a result, adaptation behaviours based on contextual reasoning cannot be formally derived and reasoned about. Moreover, in many smart systems, such as automated manufacturing, decision making, and healthcare, it is essential for context-awareness units to synchronize with contextual-reasoning modules to derive new knowledge in order to adapt, alert, and predict. A rigorous formalism is therefore essential to (1) represent contextual domain knowledge as well as application rules,
and (2) efficiently and effectively reason to draw contextual conclusions. This thesis is a contribution in this direction. It first introduces a formal context representation and a context calculus used to build context models for applications. It then introduces query-processing and optimization techniques to perform context-based reasoning. The formal framework that achieves these two tasks, called the Contelog framework, is obtained by a conservative extension of the syntax and semantics of Datalog; it models contextual knowledge and infers new knowledge from it. In its design, contextual knowledge and contextual reasoning are loosely coupled, so contextual knowledge is reusable on its own. The significance is that, with the contextual knowledge fixed, the rules in the program and/or the query may be changed; Contelog thus provides a theory of context that is independent of the application logic rules. The context calculus developed in this thesis allows knowledge inferred in one context to be exported for use in another context. Following the idea of magic sets from Datalog, Magic Contexts, together with query-rewriting algorithms, are introduced to optimize bottom-up evaluation of Contelog programs. A Book of Examples has been compiled for Contelog, and these examples are implemented as a proof of concept for the generality, expressiveness, and rigor of the proposed framework. A variety of experiments comparing the performance of Contelog with earlier Datalog implementations reveal a significant improvement and bring out the practical merits of Contelog at its current stage, as well as its potential for future extensions in context representation and reasoning for emerging context-aware applications.
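A small sketch may help fix intuitions about context-annotated facts and about exporting knowledge between contexts. The Python fragment below is an invented toy, not Contelog's actual syntax or semantics: facts carry an explicit context label, a rule fires only on facts within the same context, and a minimal export operation copies derived facts from one context into another.

```python
# Toy sketch of context-annotated Datalog-style facts in the spirit of
# Contelog. Context labels, predicates and the export operation are
# invented for illustration; Contelog's actual calculus is richer.

# Facts are (context, predicate, args) triples.
facts = {
    ("hospital", "on_duty", ("dr_kim",)),
    ("hospital", "paged",   ("dr_kim",)),
    ("home",     "paged",   ("dr_lee",)),
}

def derive(fs):
    """One bottom-up step for the rule
       respond(X) :- on_duty(X), paged(X)
    evaluated *within* each context: both body facts must hold in the
    same context for the head to be derived there."""
    new = set(fs)
    for (ctx, p, args) in fs:
        if p == "on_duty" and (ctx, "paged", args) in fs:
            new.add((ctx, "respond", args))
    return new

def export(fs, src, dst, pred):
    """Context-calculus sketch: copy pred-facts inferred in src into dst."""
    return fs | {(dst, p, a) for (c, p, a) in fs if c == src and p == pred}

facts = derive(facts)
print(("hospital", "respond", ("dr_kim",)) in facts)  # True
print(("home", "respond", ("dr_lee",)) in facts)      # False: paged, but
                                                      # not on duty at home
facts = export(facts, "hospital", "dispatch", "respond")
print(("dispatch", "respond", ("dr_kim",)) in facts)  # True
```

The point of the toy is the loose coupling the abstract describes: the rule in `derive` never mentions a concrete context, so the same rule base can be reused against any context model, and `export` moves conclusions between contexts without re-deriving them.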
Provenance, Incremental Evaluation, and Debugging in Datalog
The Datalog programming language has recently found increasing traction in research and industry. Driven by its clean declarative semantics, along with its conciseness and ease of use, Datalog has been adopted for a wide range of important applications, such as program analysis, graph problems, and networking. To enable this adoption, modern Datalog engines have implemented advanced language features and high-performance evaluation of Datalog programs. Unfortunately, critical infrastructure and tooling to support Datalog users and developers are still missing. For example, there are only limited tools addressing the crucial debugging problem, where developers can spend up to 30% of their time finding and fixing bugs.
This thesis addresses Datalog’s tooling gaps, with the ultimate goal of improving the productivity of Datalog programmers. The first contribution centres on the critical problem of debugging: we develop a new debugging approach that explains the execution steps taken to produce a faulty output. Crucially, our debugging method can be applied to large-scale applications without substantially sacrificing performance. The second contribution addresses the problem of incremental evaluation, which is necessary when program inputs change slightly and results need to be recomputed. Incremental evaluation allows this recomputation to happen more efficiently, without discarding the previous results and recomputing from scratch. The final contribution is a new incremental debugging approach that identifies the root causes of faulty outputs that occur after an incremental evaluation. Incremental debugging focuses on the relationship between input and output and can provide debugging suggestions to amend the inputs so that faults no longer occur. These techniques, in combination, form a corpus of critical infrastructure and tooling developments for Datalog, allowing developers and users to use Datalog more productively.
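The incremental-evaluation idea, propagating only the consequences of changed facts instead of recomputing from scratch, can be sketched on the textbook transitive-closure program. The Python fragment below is a simplified illustration, not the thesis's implementation: it handles fact insertions only (deletions require heavier machinery), seeding the delta with the new edge and with the existing paths that edge extends.

```python
# Sketch of delta-driven (semi-naive) evaluation for the classic Datalog
# transitive-closure program:
#   path(X,Y) :- edge(X,Y).
#   path(X,Z) :- path(X,Y), edge(Y,Z).
# Insertions only; deleting facts needs more involved machinery.

def close(edges, path=None, delta=None):
    """Bottom-up evaluation that only ever joins the newest facts (delta),
    rather than re-joining the whole path relation each round."""
    path = set(path) if path else set()
    delta = set(delta) if delta is not None else set(edges)
    path |= delta
    while delta:
        delta = {(x, z)
                 for (x, y) in delta
                 for (y2, z) in edges
                 if y == y2 and (x, z) not in path}
        path |= delta
    return path

def insert_edge(edges, path, new):
    """Incremental insertion: seed the delta with the new edge and with
    every existing path the new edge extends, then propagate as usual."""
    edges.add(new)
    a, b = new
    delta = {new} | {(x, b) for (x, y) in path if y == a}
    return close(edges, path=path, delta=delta)

edges = {(1, 2), (2, 3)}
path = close(edges)
print(sorted(path))  # [(1, 2), (1, 3), (2, 3)]

path = insert_edge(edges, path, (3, 4))
print(sorted(path))  # [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```

After the insertion, only three new tuples are derived and joined; the previously computed closure is kept intact, which is exactly the saving over from-scratch recomputation that the contribution above targets.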
Theory of Provocation
The present volume discusses the subject of provocation and its various applications in the field of political science. Provocation itself combines the artificial induction of events, attitudes and human behavior with the unilateral prejudging of issues, resulting in the interlocutor being surprised, trapped, manipulated or extorted. A political provocation manifests itself in various forms: productive or parasitic; pointed, collective or networked in influence; initiative, reactive or reflexive; causal, deceptive or discrediting; constructive or destructive. The author draws on real-world examples to illuminate the intricacies of this concept, its applications, its aims, and much more.
The impact of the legislative regulation of individual educator performance on the delivery of quality basic education
Thesis (LLD), Stellenbosch University, 2022. ENGLISH ABSTRACT: The study was motivated by three factors. First, the critical importance of education
for each individual and our society as a whole. Secondly, the poor state of basic
education in South Africa. Thirdly, the central role educators play in the delivery of
quality basic education. The process of education is a means of self-actualisation: it
provides individuals with the opportunity to realise their full intellectual and
emotional potential, as well as the means to participate in societal processes. It is also
valuable to society, as investment in education enriches a country's human capital,
produces responsible adults and drives economic growth. For South
African society, the most important contribution of education is that it is a vehicle for
transformation and one of the few societal equalisers that exist. Unfortunately, despite
the importance of quality education, not all learners in South Africa have access to
education of an equal standard.
Qualified, competent, and professional educators are central to the delivery of
quality basic education. This study identifies the educator as the most important role
player in the delivery of quality basic education. The focus is on the employment of
educators in public basic education which is defined to include school education in
South Africa from grade 1 to grade 12. For purposes of the study, educator
performance was defined to include the capacity and conduct of educators in
delivering basic education. “Capacity” refers to the qualifications, competence, content
knowledge and skills of educators whereas “conduct” refers to the professionalism and
attitude of educators.
One contributing factor to the poor state of basic education is the fragmented and
otherwise inappropriate legislative regulation of educator performance in South Africa.
For this reason, the experience with misconduct and incapacity of educators within the
current legislative framework is investigated. The approach is descriptive and
analytical - both quantitative and qualitative. It includes a description of existing
research and views on the prevalence and impact of misconduct and incapacity of
educators in and on basic education in South Africa. This is followed by a statistical
overview of the extent of the application of discipline in the basic education sector
based on information from the different Provincial Departments of Education and from
arbitrations conducted by the Education Labour Relations Council. The qualitative
analysis of these arbitration awards is particularly important since each matter provides insight into the application of legal principles and the exercise of discretion
by the different role players responsible for addressing misconduct and incapacity in
basic education. Based on these insights, deficiencies in the current system of
regulation of educator performance are tabulated. This, together with comparative
insights from the English experience, is used to make specific proposals for a range of legislative amendments.