Refactoring the Whitby Intelligent Tutoring System for Clean Architecture
Whitby is the server-side of an Intelligent Tutoring System application for learning System-Theoretic Process Analysis (STPA), a methodology used to ensure the safety of anything that can be represented with a systems model. The underlying logic driving the reasoning behind Whitby is Situation Calculus, which is a many-sorted logic with situation, action, and object sorts. The Situation Calculus is applied to Ontology Authoring and Contingent Scaffolding: the primary activities within Whitby. Thus, many fluents and actions are aggregated in Whitby from these two sub-applications and from Whitby itself, but all are available through a common situation query interface that does not depend upon any of the fluents or actions. Each STPA project in Whitby is a single situation term, which is queried for fluents that include the ontology, and to determine what pedagogical interventions to offer. Initially Whitby was written in Prolog using a module system. In the interest of a cleaner architecture and implementation with improved code reuse and extensibility, the initial application was refactored into Logtalk. This refactoring includes decoupling the Situation Calculus reasoner, Ontology Authoring framework, and Contingent Scaffolding framework into third-party libraries that can be reused in other applications. This extraction was achieved by inverting dependencies via Logtalk protocols and categories, which are reusable interfaces and components that provide functionally cohesive sets of predicate declarations and predicate definitions. In this paper the architectures of two iterations of Whitby are evaluated with respect to the motivations behind the refactoring: a clean architecture enabling code reuse and extensibility.
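The dependency inversion described above relies on Logtalk protocols, which play a role similar to abstract interfaces in object-oriented languages. A minimal Python analogy of the pattern (all names here are hypothetical, not taken from Whitby's code):

```python
from abc import ABC, abstractmethod

# Hypothetical analogue of a Logtalk protocol: the situation-query
# interface declares what can be asked without depending on any
# concrete fluents or actions.
class SituationQuery(ABC):
    @abstractmethod
    def holds(self, fluent: str, situation: list) -> bool: ...

# A concrete reasoner implements the protocol; client code depends
# only on the abstract interface, inverting the dependency direction.
class SimpleReasoner(SituationQuery):
    def __init__(self, effects):
        # effects: action name -> set of fluents that action makes true
        self.effects = effects

    def holds(self, fluent, situation):
        return any(fluent in self.effects.get(a, set()) for a in situation)

reasoner: SituationQuery = SimpleReasoner({"define_hazard": {"hazard_defined"}})
print(reasoner.holds("hazard_defined", ["define_hazard"]))  # True
```

Because callers see only `SituationQuery`, the reasoner can be extracted into a reusable library, mirroring how the paper's frameworks were decoupled via protocols and categories.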
Ontology-Supported Scaffolding for System Safety Analysis
System Safety Analysis is a valuable task used when trying to ensure that anything
that can be represented with a systems model does not behave in some manner
that is undesirable to the stakeholders in that system. It is a creative task
with no known correct solution and limited tool support. This thesis
investigates the possibility of providing support to analysts undertaking this
task through the use of ontology and pedagogy in an artificially intelligent tool.
An ontology to capture the system-model as understood by System-Theoretic
Accident Model and Processes (STAMP) was authored, building on an existing
set-theoretic representation. This required the authoring of underlying
ontology modules, including one for Control Systems and one to capture
sufficient information for use with Situation Calculus. Together these capture
information to be used in reasoning about system behaviour. During System Safety
Analysis a user extends this ontology to model their system, and the intelligent
support tool interprets it to offer its advice.
The intelligent support tool uses Contingent Scaffolding to tailor its support
to the user; this pedagogical strategy was chosen because it has been shown to enable
the learner to produce a better quality product than they would be capable of
alone. Contingent Scaffolding depends upon knowledge of past behaviour of the
learner so that interventions can be pitched at the correct level for the
learner. Typically, ontology authoring tools use a synchronic view of the
ontology and so do not capture the required history. This tool uses
Situation Calculus to capture a diachronic view of the ontology such that the
history of authorship can be reasoned with to apply the Contingent Scaffolding
framework defined herein.
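The contrast between synchronic and diachronic views can be sketched as follows. This is an illustrative simplification, not the thesis's formalism: a situation is represented as a history of authoring actions, and the scaffolding level is chosen contingently on that history (action names and levels are hypothetical):

```python
# Sketch: a situation as the history of authoring actions, queried
# diachronically so interventions are pitched at the learner's level.
def intervention_level(history, concept):
    """More past failures at defining a concept -> more direct support."""
    failures = sum(1 for (act, c) in history
                   if act == "failed_define" and c == concept)
    levels = ["hint", "prompt", "worked_example", "direct_instruction"]
    return levels[min(failures, len(levels) - 1)]

history = [("failed_define", "hazard"),
           ("defined", "loss"),
           ("failed_define", "hazard")]
print(intervention_level(history, "hazard"))  # worked_example
```

A purely synchronic tool would see only the current ontology state and could not distinguish a learner's first attempt from their third.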
To evaluate the practicability of this approach the ontology and scaffolding
were implemented in software. This surfaced an issue: the inability to
invert dependencies in Prolog, which was important for making the tools reusable
and shareable. This was overcome by the protocols provided by Logtalk. The code
was then applied to other domains, such as robotics planning by a third party,
demonstrating generalisability of the intelligent support tool.
A user study was conducted to evaluate the effectiveness of the intelligent
support tool, in which novices undertook a System Safety Analysis. The tool was
able to effectively provide support where definitions were missed and additional
patterns of behaviour were identified that are indicative of the user needing
support.
The thesis makes a number of contributions including: a systems ontology with a
focus on capturing hypothetical and realised behaviour, a formal definition of a
contingent scaffolding framework that can be used with ill-defined tasks, and
the use of dependency inversion in Prolog to enable sharing of libraries. The
primary contribution is in the use of a diachronic view of ontology authoring to
provide support, which has been successfully exploited and has scope for
providing a platform for many more applications.
Extending the Exposure Score of Web Browsers by Incorporating CVSS
When browsing the Internet, HTTP headers enable both clients and servers to send extra data in their requests or responses, such as the User-Agent string. This string contains information about the sender's device, browser, and operating system, yet its content differs from one browser to another. Despite the privacy and security risks of User-Agent strings, very few works have tackled this problem. Our previous work proposed giving Internet browsers relative exposure scores to aid users in choosing less intrusive ones. Thus, the objective of this work is to extend our previous work by: first, conducting a user study to identify its limitations; second, extending the exposure score by incorporating data from the NVD; third, providing a full implementation instead of a limited prototype. The proposed system assigns scores to users' browsers upon visiting our website. It also suggests alternative safe browsers, and finally it allows updating the back-end database with a click of a button. We applied our method to a data set of more than 52 thousand unique browsers. Our performance and validation analysis show that our solution is accurate and efficient. The source code and data set are publicly available [4].
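One plausible shape for such an extension is to blend a User-Agent-derived exposure score with CVSS base scores of known vulnerabilities for that browser version. This is purely a hypothetical illustration; the paper's actual scoring formula is not reproduced here, and the weighting is invented:

```python
# Hypothetical sketch: folding NVD CVSS data into a 0-10 exposure score.
def extended_exposure(base_exposure, cvss_scores):
    """Blend a User-Agent exposure score with the mean CVSS base score
    of known CVEs for the browser version (equal weights, assumed)."""
    if not cvss_scores:
        return base_exposure
    mean_cvss = sum(cvss_scores) / len(cvss_scores)
    return round(0.5 * base_exposure + 0.5 * mean_cvss, 2)

# A browser with base exposure 6.0 and three CVEs scored 7.5, 9.8, 5.3:
print(extended_exposure(6.0, [7.5, 9.8, 5.3]))  # 6.77
```

With no CVE data the score falls back to the User-Agent-only exposure, so the extension degrades gracefully for browsers absent from the NVD.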
Object-oriented data mining
Shape theory and mathematical design of a general geometric kernel through regular stratified objects
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. This dissertation focuses on the mathematical design of a unified shape kernel for geometric computing, with possible applications to computer-aided design (CAD) and manufacturing (CAM), solid geometric modelling, free-form modelling of curves and surfaces, feature-based modelling, finite element meshing, computer animation, etc.
The generality of such a unified shape kernel is grounded in a shape theory for objects in some Euclidean space. Shape does not mean herein only geometry, as usual in geometric modelling, but has been extended to other contexts, e.g. topology, homotopy, convexity theory, etc. This shape theory has enabled a shape analysis of the current geometric kernels. Significant deficiencies have then been identified in how these geometric kernels represent shapes from different applications.
This thesis concludes that it is possible to construct a general shape kernel capable of representing and manipulating general specifications of shape for objects even in higher-dimensional Euclidean spaces, regardless of whether such objects are implicitly or parametrically defined, whether they have "incomplete boundaries" or not, whether they are structured with more or less detail or subcomplexes, or which design sequence has been followed in a modelling session. To this end, the basic constituents of such a general geometric kernel, namely a combinatorial data structure and respective Euler operators for n-dimensional regular stratified objects, have been introduced and discussed.
Flexible Coinduction
Recursive definitions of predicates by means of inference rules are ubiquitous in computer science. They are usually interpreted inductively or coinductively; however, there are situations where neither of these two options provides the expected meaning. In this thesis we propose a flexible form of coinductive interpretation, based on the notion of corules, able to deal with such situations.
In the first part, we define such flexible coinductive interpretation as a fixed point of the standard inference operator lying between the least and the greatest one, and we provide several equivalent proof-theoretic semantics, combining well-founded and non-well-founded derivations. This flexible interpretation nicely subsumes standard inductive and coinductive ones and is naturally associated with a proof principle, which smoothly extends the usual coinduction principle.
In the second part, we focus on the problem of modelling infinite behaviour by a big-step operational semantics, a paradigmatic example where neither induction nor coinduction provides the desired interpretation. In order to be independent of specific examples, we provide a general, but simple, definition of what a big-step semantics is. Then, we extend it to also include observations, describing the interaction with the environment, thus providing a richer description of the behaviour of programs. In both settings, we show how corules can be successfully adopted to model infinite behaviour, by providing a construction extending a big-step semantics, which as usual only describes finite computations, to a richer one including infinite computations as well. Finally, relying on these constructions, we provide a proof technique to show soundness of a predicate with respect to a big-step semantics.
In the third part, we face the problem of providing algorithmic support for corules. To this end, we consider the restriction of the flexible coinductive interpretation to regular derivations, again analysing both proof-theoretic and fixed-point semantics and developing proof techniques. Furthermore, we show that this flexible regular interpretation can be equivalently characterised inductively by a cycle-detection mechanism, thus obtaining a sound and complete (abstract) (semi-)algorithm to check whether a judgement is derivable. Finally, we apply these results to extend logic programming with coclauses, the analogue of corules, defining declarative and operational semantics and proving that the latter is sound and complete with respect to the regular declarative model, thus obtaining concrete support for flexible coinduction.
XXXIII Ciclo - Informatica e Ingegneria dei Sistemi / Computer Science and Systems Engineering. Dagnino, Francesco
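The core of the cycle-detection idea can be sketched in a few lines. This is a simplification showing plain coinductive derivability via cycle detection; the thesis's corules additionally constrain which cycles are acceptable, which is not modelled here:

```python
# Sketch: a judgement is derivable if some rule instance applies and all
# its premises are derivable, where revisiting a judgement already on the
# current derivation path closes a regular (cyclic) derivation.
def derivable(judgement, rules, path=frozenset()):
    if judgement in path:
        return True  # cycle detected: a regular coinductive derivation
    for premises in rules.get(judgement, []):
        if all(derivable(p, rules, path | {judgement}) for p in premises):
            return True
    return False

# Toy rule set: 'a' is derivable from 'b' and vice versa (an infinite,
# regular derivation); 'c' has no applicable rule.
rules = {"a": [["b"]], "b": [["a"]]}
print(derivable("a", rules))  # True
print(derivable("c", rules))  # False
```

Under a purely inductive reading, 'a' would not be derivable (its derivation never bottoms out); accepting the cycle is exactly what the regular coinductive interpretation adds.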