24 research outputs found

    Life history and physiological ecology of the lizard, Cordylus giganteus

    Cordylus giganteus is a large, terrestrial, viviparous lizard, endemic to the Highveld grasslands of South Africa. Its distribution is limited and its conservation status is vulnerable. Autopsy and mark-recapture methods were used to study the seasonal aspects of its reproductive cycle, diet, energy reserves, growth, population dynamics, daily activity and thermoregulation. Reproduction is distinctly seasonal in both sexes. Females may reproduce biennially. Vitellogenesis commenced in autumn (March) and continued through hibernation, with ovulation in spring (October). Two or three young are born in autumn. A functional placenta is implicated. Seasonal steroid hormone profiles are presented. Males exhibit a postnuptial spermatogenic cycle. Spermatogenesis commences in spring, with peak spermiogenesis in autumn and testicular regression following in late autumn. Spermatozoa are stored in the epididymis and ductus deferens for seven to eight months. A bimodal plasma testosterone profile is reported, consistent with spermiogenesis in autumn and mating behaviour in spring. C. giganteus feeds during eight months of the year and prefers Coleoptera as prey. Fat bodies are utilized for winter maintenance and reproduction. Hatchlings grow 20-30 mm during the first year, and maximum growth rates occur in summer. Males and females attain sexual maturity at about 165 mm SVL in the fourth year. Seasonal effects on growth rate resulted in a poor fit by either the logistic-by-length or the von Bertalanffy model, and a seasonally oscillating model was introduced. Adult males are smaller than females; head sizes are the same, but allometric slopes differed significantly. Population size and structure remained stable in the study area. Densities ranged from 9-11 lizards/ha. The age structure is marked by the low relative abundance of juveniles. Survivorship during the first year varied among years. Mortality was highest during summer months rather than winter months. Average annual survival of adults was high but varied with sex and year (58%-80%). A life table yielded a net reproductive rate (R0 = 1) sufficient to sustain the population if the reproductive life of an adult female is at least 12 years. Lizards remain in their burrows during winter. In summer, activity was bimodal on sunny days but unimodal on cool, overcast days. Body temperature is regulated by behavioural means (postural and orientation changes) and by shuttling to the cool burrow microclimate. The life history strategy corresponds partially to that of K-selection.
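    For reference, the standard von Bertalanffy growth curve has the form below; the seasonally oscillating variant shown after it is a commonly used formulation of that kind (the thesis's exact parameterization is not given in this abstract):

```latex
% Standard von Bertalanffy growth curve: asymptotic length L_inf, growth rate K.
L_t = L_\infty\left(1 - e^{-K(t - t_0)}\right)
% A commonly used seasonally oscillating variant adds a sine term of amplitude C
% with "winter point" t_s, so that growth slows each winter:
L_t = L_\infty\left(1 - e^{-K(t - t_0)
      - \frac{CK}{2\pi}\left[\sin 2\pi (t - t_s) - \sin 2\pi (t_0 - t_s)\right]}\right)
```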

    Execution/Simulation of Context/Constraint-aware Composite Services using GIPSY

    To fulfil a complex requirement comprising several sub-tasks, a composition of simple web services, each dedicated to a specific sub-task, is a more capable solution than an equivalent atomic web service. Owing to advantages such as reusability of components, broader options for composition requesters, and the freedom of component providers to specialize, composite services have been researched extensively for over two decades and refined in many respects. Yet most studies in this field fail to acknowledge that every web service has a limited context in which it can successfully perform its tasks, the boundaries of which are defined by the internal constraints placed on the service by its providers. When used as part of a composition, the restricted context-spaces of all the component services together define the contextual boundaries of the composite service as a unit, which makes internal constraints an influential factor in composite service functionality. However, because these constraints have received limited attention, no systems have yet been proposed to specifically verify the internal constraints imposed on the components of a composite service. To address this gap in service composition research, this thesis proposes a multi-faceted solution capable not only of automatically constructing context-aware composite web services with their internal constraints positioned for optimum resource utilization, but also of validating the generated compositions using the General Intensional Programming SYstem (GIPSY) as a time- and cost-efficient simulation/execution environment.
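    The core idea, that a composite service's usable context is bounded by the conjunction of its components' internal constraints, can be sketched as follows (a minimal illustration with invented service names and constraints, not GIPSY's actual API):

```python
# Hedged sketch: a composition is only valid for requests admitted by the
# internal constraints of *every* component service. All names are invented.
def within_context(request, services):
    """A request lies inside the composite's context-space iff every
    component constraint accepts it."""
    return all(check(request) for svc in services for check in svc["constraints"])

payment = {"constraints": [lambda r: r["amount"] <= 10_000]}
shipping = {"constraints": [lambda r: r["region"] in {"EU", "US"}]}

print(within_context({"amount": 500, "region": "EU"}, [payment, shipping]))  # True
print(within_context({"amount": 500, "region": "CN"}, [payment, shipping]))  # False
```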

    A new dialect of SOFL: syntax, formal semantics and tool support

    Structured Object-Oriented Formal Language (SOFL) is a formal design methodology that combines data flow diagrams and predicates to describe processes that can be refined. This yields a versatile way of describing a system whose properties can be proven rigorously. Data flows are grouped by ports, which define the flows from which data can be consumed and the flows on which data can be generated. Predicates use the Logic of Partial Functions (LPF), whose undefined element is also used to indicate that a data flow contains no data. Over time SOFL “evolved organically” and a number of features were added, with usability the main consideration for each addition. For a formal language to be useful, there must be no uncertainty about a specific design’s meaning. In SOFL, there is a possible contradiction, due to the use of LPF, between the firing rules and the requirement that a process’s precondition must be true when the process fires. The semantics (the meaning) of SOFL was not always updated to keep track of the changes made to the language, which resulted in an outdated and incomplete semantics. This incompleteness is a significant factor motivating the work done in this dissertation. In this dissertation, a dialect of SOFL is created in order to define a semantics. Not all elements of SOFL are included, so that a simpler semantics can be defined; elements that were removed include LPF, classes, and non-deterministic broadcast nodes. The semantics of the dialect is created in a two-step process: first, an intuitive understanding of the dialect is established; second, both static and dynamic semantics are defined by means of translations. A translation is a mapping from the dialect to a formal language that describes a certain aspect of the dialect. The static semantics defines the meaning of the elements that are “fixed” in their state: SMT-LIB is used as the target language to describe the static semantics of the dialect. The dynamic semantics describes how an element in a design changes over time: the process algebra mCRL2 is used as the formal language describing the dynamic behaviour of the dialect. The SMT solver Z3 and the tools included in mCRL2 are used to analyse the translation of the dialect. These tools allow one to prove properties that are necessary for a design to have a well-defined meaning, including that a process can fire, that a process can fire an infinite number of times, and that a given predicate describing a property holds. An Eclipse plug-in was created so that the translation need not be done manually: after a design is translated, the tools Z3 and mCRL2 are run using script files and the results of the analysis are displayed on the screen. The desired properties could be proven for moderately sized designs, but as the size of a design increased, the analysis of the translation could not be completed due to its computational cost. Usability of the tool could be improved by using not only a textual representation of a design but also visual representations, as in SOFL. Dissertation (MSc), University of Pretoria, 2018.
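    To give a flavour of the static checks involved, the following sketch uses Z3's Python bindings to pose the kind of question the translation enables, "can this process fire?", as precondition satisfiability. The process and its predicates are invented for illustration; the dissertation itself translates designs to SMT-LIB directly.

```python
# Illustrative sketch: "can this process fire?" as satisfiability of its
# precondition, conjoined here with a simple postcondition relation.
from z3 import Int, Solver, And, sat

x = Int("x")  # value on the process's input data flow
y = Int("y")  # value produced on its output data flow

s = Solver()
s.add(And(x > 0, y == 2 * x))  # pre: input present and positive; post: y = 2x

if s.check() == sat:
    print("process can fire; witness:", s.model())
else:
    print("process can never fire")
```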

    A Formal Engineering Approach for Interweaving Functional and Security Requirements of RESTful Web APIs

    RESTful Web API adoption has become ubiquitous, with the proliferation of REST APIs in almost all domains as modern web applications embrace the micro-service architecture. This vibrant and expanding adoption of APIs funnels an increasing amount of data through systems that require proper access management to ensure that web assets are secured. A RESTful API provides data using the HTTP protocol over the network, interacting with databases and other services, and must preserve its security properties. Currently, practitioners face two major challenges in developing high-quality, secure RESTful APIs. First, REST is not a protocol but a set of guidelines that define how web resources can be designed and accessed over HTTP endpoints: related resources should be structured using hierarchical URIs, and specific well-defined actions on those resources should be represented using different HTTP verbs. Although security has always been critical in the design of RESTful APIs, there are no clear formal models that take a secure-by-design approach interweaving both functional and security requirements. The second challenge is how to effectively use a model-driven approach to construct precise requirements and design specifications so that the security of a RESTful API is treated as a concern that cuts across functionality rather than as individual, isolated operations. This thesis proposes a novel technique that encourages a model-driven approach to specifying and verifying an API's functional and security requirements with the practical formal method SOFL (Structured Object-Oriented Formal Language). Our proposed approach provides a generic six-step model-driven method for designing security-aware APIs using the concepts of domain models, domain primitives, the Ecore metamodel and SOFL. The first step generates a flat file listing the API's resources: resource definitions are extracted from input RESTful API documentation written in RAML using an existing RAML parser, and the output is a flat file representing the API resources defined in the RAML input. This step is fully automated. The second step automatically constructs an API resource graph that serves as a blueprint for the target API domain model. The input is the flat file generated in step 1 and the output is a directed graph (digraph) of API resources. We rely on an algorithm of our own that takes a list of lists of API resource nodes and the defined API root resource node as input and constructs a digraph of all the API resources as output (see the sketch after this paragraph). In step 3, we use the generated digraph as a guide to manually define the API's initial domain model, with an aggregate root corresponding to the root node of the input digraph and the remaining nodes corresponding to domain model entities. In effect, the digraph generated in step 2 is a bare-bones representation of the target domain model; what is missing at this stage is the distinction between containment and reference relationships between entities. The resulting domain model describes the entire ecosystem of the modeled API in terms of the Domain-Driven Design concepts of aggregates, aggregate root, entities, entity relationships, value objects and aggregate boundaries.
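    A plausible reconstruction of the step-2 digraph construction is sketched below; the abstract does not give the algorithm in full, and the input format and resource names are assumptions mimicking parsed RAML resource listings:

```python
# Hedged sketch of step 2: building an API resource digraph from parsed
# resource paths, e.g. /salons/bookings -> ["salons", "bookings"].
from collections import defaultdict

def build_resource_digraph(resource_paths, root="/"):
    """Return an adjacency map: parent resource -> set of child resources."""
    graph = defaultdict(set)
    for segments in resource_paths:
        parent = root
        for segment in segments:
            graph[parent].add(segment)  # edge parent -> segment
            parent = segment
    return graph

paths = [["salons"], ["salons", "bookings"], ["salons", "stylists"]]
for parent, children in build_resource_digraph(paths).items():
    print(parent, "->", sorted(children))
```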
    The fourth step, which takes the newly defined domain model as input, involves a threat modeling process using Attack-Defense Trees (ADTrees) to identify potential security vulnerabilities in the API domain model and their countermeasures. Countermeasures that can enforce secure constructs on the attributes and behavior of their associated domain entities are modeled as domain primitives: distilled versions of value objects with proper invariants. These invariants enforce security constraints on the behavior of their associated entities in the API domain model. The output of this step is a refined domain model with the additional security invariants from the threat modeling process defined as domain primitives. This fourth step achieves our first, implicit interweaving of functional and security requirements. The fifth step creates an Ecore metamodel that describes the structure of the API domain model: it takes the refined domain model as input and produces the Ecore metamodel to which that domain model corresponds. Specifically, this step encompasses structural modeling of the target RESTful API; the structural model describes the possible resource types, their attributes and relations, as well as their interfaces and representations. The sixth and final step involves behavioral modeling. Its input is the Ecore metamodel from step 5 and its output is a formal, security-aware RESTful API specification in the SOFL language. The goal is to define RESTful API behaviors consisting of actions corresponding to their respective HTTP verbs, i.e., GET, POST, PUT, DELETE and PATCH. For example, a CreateAction creates a new resource, an UpdateAction provides the capability to change the values of attributes, and a ReturnAction allows for response definition, including the representation and all metadata. To achieve behavioral modeling, we transform the API methods into SOFL processes, taking advantage of the expressive nature of SOFL processes to define the modeled API behaviors. We achieve the interweaving of functional and security requirements by injecting boolean formulas into the postconditions of SOFL processes. To verify whether the interwoven functional and security requirements implement all expected functions correctly and satisfy the desired security constraints, we can optionally perform specification testing. Since implicit specifications do not indicate algorithms for implementation but are expressed as predicate expressions involving pre- and postconditions, we can substitute all the variables involved in a process with concrete values of their types and evaluate the results as the truth values true or false (see the sketch after this paragraph). When conducting specification testing, we apply the SOFL process animation technique to obtain the set of concrete values of output variables for each process functional scenario. We analyse test results by comparing the evaluation results against an analysis criterion: a predicate expression representing the properties to be verified. If the evaluation results are consistent with the predicate expression, the analysis shows consistency between the process specification and its associated requirement. We generate the test cases for both input and output variables based on the user requirements.
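    The following sketch illustrates this style of specification testing in miniature; the process, its fields and the access-control predicate are invented for illustration, and actual SOFL specifications express these as pre/post predicates rather than Python functions:

```python
# Hedged sketch: evaluating a process's pre/post conditions on concrete
# values, with an object-level access-control check interwoven into the
# postcondition. All names are invented.
def pre_update_booking(booking, caller_id):
    return booking is not None and caller_id is not None

def post_update_booking(booking, caller_id, result):
    owns = booking["owner"] == caller_id          # object-level access control
    return (owns and result == "updated") or (not owns and result == "denied")

booking = {"owner": "alice", "time": "10:00"}
for caller, result in [("alice", "updated"), ("mallory", "denied")]:
    assert pre_update_booking(booking, caller)
    print(caller, "->", result, ":", post_update_booking(booking, caller, result))
```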
    The test cases generated are usually based on test targets, which are predicate expressions such as the pre- and postconditions of a process. When testing a process specification for conformance to its associated service operation, we only need to observe the execution results of the process by providing concrete input values for all of its functional scenarios and to analyze their defining conditions relative to user requirements. We present an empirical case study validating the practicality and usability of our model-driven formal engineering approach by applying it to the development of a Salon Booking System. A total of 32 services covering the functionality provided by the Salon Booking System API were developed. We defined process specifications for the API services with their respective security requirements, the security requirements having been injected in the threat modeling and behavioral modeling phases of our approach. We tested for the interweaving of functional and security requirements in the specifications generated by our approach by conducting tests relative to the original RAML specifications. Failed tests arose in cases where an injected security measure, such as a required object-level access control, was not respected, i.e., object-level access control was not checked. Our generated SOFL specification correctly rejects such a case by returning an appropriate error message, while the original RAML specification incorrectly accepts the request because it is unaware of the measure. We further demonstrate a technique for generating SOFL specifications from a domain model via model-to-text transformation. This technique semi-automates the generation of the SOFL formal specification in step 6 of our approach and isolates the dynamic and static sections of the generated specifications, so that the static sections of the target specifications are preserved while the dynamic sections are updated in response to changes in the underlying domain model of the RESTful API being designed. Specifically, our contribution is a systematic model-driven formal engineering approach for the design and development of secure RESTful web APIs. The proposed approach offers a six-step methodology covering both structural and behavioral modeling of APIs with a focus on security. The most distinctive merit of the model-to-text transformation is its use of the API's domain model, together with the metamodel to which the domain model corresponds, as the foundation for generating formal SOFL specifications representing the API's functional and security requirements. Doctoral dissertation, Hosei University.

    Formal Foundations for Information-Preserving Model Synchronization Processes Based on Triple Graph Grammars

    Restoring consistency between different information-sharing artifacts after one of them has been changed is an important problem that arises in several areas of computer science.
    In this thesis, we provide a solution to the basic model synchronization problem, in which a pair of such artifacts (models), one of which has been changed, is given and consistency is to be restored. Triple graph grammars (TGGs) are an established and suitable formalism for addressing this and related problems. Being based on the algebraic theory of graph transformation and (double-)pushout rewriting, they are especially suited to developing solutions whose properties can be formally proven. Despite being established, many TGG-based solutions do not deal satisfactorily with the problem of information loss: when one model is changed, such solutions may lose information that is present only in the second model, because they restore consistency by re-translating (large parts of) the updated model. We introduce a TGG-based approach that supports advanced features of TGGs (attributes and negative constraints), is comprehensively formalized, is implemented, and is incremental in the sense that it drastically reduces the amount of information loss compared to former approaches. Up to now, no TGG-based approach with these characteristics has been available. The central contribution of this thesis is to develop that approach formally and to prove its essential properties, namely correctness, completeness, and termination. The crucial new idea in our approach is the use of repair rules: special rules that allow one to propagate changes directly from one model to the other instead of resorting to re-translation. To be able to construct and apply these repair rules, we contribute more fundamentally to the theory of algebraic graph transformation. First, we develop a new kind of sequential rule composition: whereas the conventional composition of rules leads to rules that delete and re-create elements, we can compute rules that preserve such elements instead. Furthermore, the setting in which our synchronization process takes place is technically the category of partial triple graphs rather than that of ordinary triple graphs; hence, we have to ensure that the elaborate theory of double-pushout rewriting still applies. We therefore develop a (category-theoretic) construction of new categories from given ones and show that (i) this construction preserves the axioms that are necessary to develop the theory of double-pushout rewriting, and (ii) partial triple graphs can be constructed as such a category. Together, these two fundamental contributions enable us to develop our solution to the basic model synchronization problem in a fully formal manner and to prove its central properties.
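    The intuition behind repair rules, propagating a change along maintained correspondences instead of re-translating and thereby preserving target-only information, can be shown with a deliberately simple sketch (all names invented; this is not the thesis's formal TGG construction):

```python
# Toy illustration: push a source-model change along a correspondence link
# instead of re-translating the whole target model, so target-only
# information ("notes") survives the synchronization.
source = {"n1": {"name": "Customer"}}
target = {"t1": {"label": "Customer", "notes": "manually added layout hints"}}
corr = {"n1": "t1"}  # correspondence links maintained by the grammar

def repair_rename(node, new_name):
    """Repair rule: propagate a rename to the corresponding target element."""
    source[node]["name"] = new_name
    target[corr[node]]["label"] = new_name  # update in place, keep the rest

repair_rename("n1", "Client")
print(target)  # {'t1': {'label': 'Client', 'notes': 'manually added layout hints'}}
```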

    Mathematics in Software Reliability and Quality Assurance

    This monograph concerns the mathematical aspects of software reliability and quality assurance and consists of 11 technical papers in this emerging area. Included are the latest research results related to formal methods and design, automatic software testing, software verification and validation, coalgebra theory, automata theory, hybrid systems, and software reliability modeling and assessment.

    Modernisation in Algeria and the quest of technology: A sociological analysis of secondary education

    This study is a social analysis of the historical conditions of the emergence of the Algerian intellectual elite. It attempts to trace and document the cultural components of the two distinct educational systems, the colonial and the traditional schooling systems, that existed in colonial Algeria. Class analysis is traced in the light of cultural and sociological factors in order to reveal the cultural revival that has long been neglected when studied from a Marxist perspective. The method of analysis can be described as historical, analytical and interpretive. Sources of data are largely original English and French works, sometimes translations from Arabic, and governmental documents. Further, the survey consists of a case study in an urban area, as well as interviews with education officials at the Ministry. The study concludes with a theoretical analysis of the Algerian model of modernisation and the numerous policies of investment in human capital intended to meet the country's manpower needs. In addition, it attempts to reveal the structure of the intellectual stratum through the new system of education introduced in 1976 with the Fundamental School, as well as the social origins of students and the factors affecting their careers in the different branches of study.

    Verification and Application of Detectability Based on Petri Nets

    In many real-world systems, due to limitations of sensors or constraints of the environment, the system dynamics is usually not perfectly known. State information, however, is crucial for decision making, and the state of the system needs to be determined in many applications. Owing to its importance, the state estimation problem has received considerable attention in the discrete event system (DES) community, and it has recently been studied systematically in the framework of detectability. Detectability properties characterize the possibility of determining the current and subsequent states of a system after observing a finite number of events generated by it. Two issues arise in practice. First, to model and analyze practical systems, powerful DES models are needed to describe the different observation behaviors of the system. Second, due to the state explosion problem, analysis methods that rely on exhaustively enumerating all possible states are not applicable to practical systems, so more efficient and practical verification methods need to be developed. In this thesis, efficient detectability verification methods using Petri nets are investigated; detectability is then extended to a more general notion (C-detectability) that requires only that a given set of crucial states be distinguishable from other states, with formal definitions and efficient verification methods proposed for the C-detectability properties. Finally, C-detectability is applied to a railway signal system to demonstrate the feasibility of the property. The contributions are as follows.
    1. Four types of detectability are extended from finite automata to labeled Petri nets: strong detectability, weak detectability, periodically strong detectability, and periodically weak detectability are formally defined in labeled Petri nets.
    2. Based on the notion of the basis reachability graph (BRG), a practically efficient approach (the BRG-observer method) is proposed to verify the four detectability properties in bounded labeled Petri nets. Using basis markings, there is no need to enumerate all the markings that are consistent with an observation, and it has been shown by other researchers that the size of the BRG is usually much smaller than that of the reachability graph (RG). The method therefore improves analysis efficiency and avoids the state space explosion problem.
    3. Three novel approaches are proposed for the verification of strong detectability and periodically strong detectability, using three different structures whose construction has polynomial complexity. Moreover, rather than computing all cycles of the structure at hand, which is NP-hard, it is shown that strong detectability can be verified by looking at the strongly connected components, whose computation also has polynomial complexity. As a result, these approaches have lower computational complexity than other methods in the literature.
    4. Detectability can be too restrictive in real applications, so it is extended to C-detectability, which requires only that a given set of crucial states be distinguishable from other states. Four types of C-detectability are defined in the framework of labeled Petri nets, and efficient BRG-based approaches are proposed to verify these properties for bounded labeled Petri net systems.
    5. Finally, a general modeling framework for state estimation in railway systems using labeled Petri nets is presented.
    C-detectability is then applied to railway signal systems to verify its feasibility in a real-world system: taking the RBC handover procedure in the Chinese Train Control System Level 3 (CTCS-3) as an example, the procedure is modeled using labeled Petri nets, and the proposed approaches show that the RBC handover procedure satisfies strong C-detectability.
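    The estimation problem underlying these properties can be illustrated in miniature (a toy transition structure standing in for a labeled Petri net's reachability graph; the thesis's BRG-based methods are more involved, and the names here are invented):

```python
# Toy sketch of current-state estimation behind detectability: after each
# observed label, the estimate is the set of states consistent with the
# observation; detectability asks whether it eventually stays a singleton.
trans = {("s0", "a"): {"s1", "s2"}, ("s1", "b"): {"s3"}, ("s2", "b"): {"s3"}}

def estimate(initial, observation):
    """Set of states consistent with the observed label sequence."""
    current = set(initial)
    for label in observation:
        current = {t for s in current for t in trans.get((s, label), set())}
    return current

print(estimate({"s0"}, ["a"]))       # {'s1', 's2'}: state not yet determined
print(estimate({"s0"}, ["a", "b"]))  # {'s3'}: estimate is a singleton
```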