SAGA: A project to automate the management of software production systems
The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management
Tactics From Proofs
Proof guarantees the correctness of a formal specification with respect to formal requirements, and of an implementation with respect to a specification, and so provides valuable verification methods in high integrity system development. However, proof development by hand tends to be an erudite, error-prone and seemingly interminable task.
Tactics are programs that drive theorem provers, thus automating proof development and alleviating some of the problems mentioned above. The development of tactics for a particular application domain also extends the domain of application of the theorem prover. An LCF tactic is safe in that if it fails to be applicable to a particular conjecture, it will not produce an incorrect proof.
The current construction of tactics from proofs does not yield sufficiently robust tactics. Proofs tend to be specific to the details of a specification and so are not reusable in general, e.g. the same proof may not work when the definition of a conjecture is changed. The major challenges in proof development are deciding which proof rule and instantiations to apply in order to prove a conjecture.
Discerning patterns in formal interactive proof development facilitates the construction of robust tactics that can withstand definitional changes in conjectures. Having developed an interactive proof for a conjecture, we develop the necessary abstractions of the proof steps used to construct a tactic that can be applicable to other conjectures in that domain. In so doing we encode the human expertise used in the proof development, and make proofs robust and thus generally reusable.
We apply our theory to the proofs of conjectures involving some set theory operators, and to the proof obligations that arise in the formal development of numerical specifications using the retrenchment method under the IEEE-854 floating-point standard in the PVS theorem-prover/proof-checker
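The LCF notion of safety described above can be illustrated with a small sketch in Python (illustrative only; this is not the PVS or LCF tactic language, and all names here are hypothetical). A tactic is a partial function from a goal to a list of subgoals; an inapplicable tactic fails by raising an exception rather than returning a wrong decomposition, and combinators build more robust tactics from simpler ones:

```python
# Hypothetical sketch of LCF-style tactics: goals are strings, and a
# tactic maps a goal to a list of remaining subgoals or fails loudly.
# Safety comes from failure being an exception: an inapplicable tactic
# never returns an incorrect decomposition.

class TacticFailure(Exception):
    pass

def trivial(goal):
    """Discharge the trivially true goal."""
    if goal == "true":
        return []
    raise TacticFailure("goal is not trivially true")

def split_conj(goal):
    """Reduce a conjunction 'A & B' to the subgoals ['A', 'B']."""
    if "&" in goal:
        left, right = goal.split("&", 1)
        return [left.strip(), right.strip()]
    raise TacticFailure("goal is not a conjunction")

def then(t1, t2):
    """Apply t1, then apply t2 to every resulting subgoal."""
    def tactic(goal):
        return [g for sub in t1(goal) for g in t2(sub)]
    return tactic

def orelse(t1, t2):
    """Try t1; if it fails, fall back to t2."""
    def tactic(goal):
        try:
            return t1(goal)
        except TacticFailure:
            return t2(goal)
    return tactic

# A tactic built from the combinators is more robust than either piece:
# it survives a definitional change from a bare 'true' goal to a
# conjunction of trivial goals.
prove = orelse(trivial, then(split_conj, trivial))
```

The combinators mirror the abstraction step the abstract describes: the proof-specific steps are factored out, and the composite tactic applies across a family of related conjectures.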
Sample Path Analysis of Integrate-and-Fire Neurons
Computational neuroscience is concerned with answering two intertwined questions that are based on the assumption that spatio-temporal patterns of spikes form the universal language of the nervous system. First, what function does a specific neural circuitry perform in the elaboration of a behavior? Second, how do neural circuits process behaviorally-relevant information? Non-linear system analysis has proven instrumental in understanding the coding strategies of early neural processing in various sensory modalities. Yet, at higher levels of integration, it fails to help in deciphering the response of assemblies of neurons to complex naturalistic stimuli. If neural activity can be assumed to be primarily driven by the stimulus at early stages of processing, at the cortical level the intrinsic activity of neural circuits interacts with their high-dimensional input to transform it in a stochastic non-linear fashion. As a consequence, any attempt to fully understand the brain through a system analysis approach becomes illusory. However, it is increasingly advocated that neural noise plays a constructive role in neural processing, facilitating information transmission. This prompts us to seek insight into the neural code by studying the stochasticity of neuronal activity, which is viewed as biologically relevant. Such an endeavor requires the design of guiding theoretical principles to assess the potential benefits of neural noise. In this context, meeting the requirements of biological relevance and computational tractability, while providing a stochastic description of neural activity, prescribes the adoption of the integrate-and-fire model. In this thesis, building on the path-wise description of neuronal activity, we propose to further the stochastic analysis of the integrate-and-fire model through a combination of numerical and theoretical techniques.
To begin, we expand upon the path-wise construction of linear diffusions, which offers a natural setting to describe leaky integrate-and-fire neurons as inhomogeneous Markov chains. Based on the theoretical analysis of the first-passage problem, we then explore the interplay between the internal neuronal noise and the statistics of injected perturbations at the single-unit level, and examine its implications for neural coding. At the population level, we also develop an exact event-driven implementation of a Markov network of perfect integrate-and-fire neurons with both time-delayed and instantaneous interactions and arbitrary topology. We hope our approach will provide new paradigms to understand how sensory inputs perturb intrinsic neural activity and accomplish the goal of developing a new technique for identifying relevant patterns of population activity. From a perturbative perspective, our study shows how injecting frozen noise in different flavors can help characterize internal neuronal noise, which is presumably functionally relevant to information processing. From a simulation perspective, our event-driven framework is amenable to scrutinizing the stochastic behavior of simple recurrent motifs as well as the temporal dynamics of large-scale networks under spike-timing-dependent plasticity
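As a minimal illustration of the model underlying this analysis, the following sketch simulates a single stochastic leaky integrate-and-fire neuron with an Euler-Maruyama scheme; the parameter values and names are illustrative, not taken from the thesis:

```python
import random

# A minimal sketch of a leaky integrate-and-fire neuron driven by a
# constant current plus white noise, integrated with Euler-Maruyama.
# Spike times are the first-passage times of the membrane potential
# through the threshold, after which the potential is reset.

def simulate_lif(t_end=1.0, dt=1e-4, tau=0.02, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0, drive=60.0, sigma=0.5,
                 seed=0):
    """Return the spike times (in seconds) over [0, t_end]."""
    rng = random.Random(seed)
    v, t, spikes = v_rest, 0.0, []
    while t < t_end:
        # Leak towards rest, constant drive, and a Gaussian increment
        # of standard deviation sigma * sqrt(dt) (the diffusion term).
        dv = (-(v - v_rest) / tau + drive) * dt \
             + sigma * rng.gauss(0.0, 1.0) * dt ** 0.5
        v += dv
        t += dt
        if v >= v_thresh:        # first passage of the threshold
            spikes.append(t)
            v = v_reset          # instantaneous reset
    return spikes
```

With these illustrative parameters the deterministic drift alone would already cross the threshold, so the noise mainly jitters the spike times; lowering `drive` below the threshold regime would make firing purely noise-induced.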
Re-visioning the peer conference: critical language awareness and writing with eighth graders.
This dissertation reports findings from a sociolinguistic ethnographic study that examined relationships between critical language awareness, peer conferencing, and student writing. The purpose of the study was to use critical language study to develop student understanding of the social, cultural and political aspects of language, thereby promoting democratic classrooms. The study involved the revision of the traditional peer conferencing format to include consideration of the social, cultural, and political aspects of language and power. This pedagogical change was embedded in a critical language awareness curriculum and in a Native American unit of study, and involved eighth graders at a suburban middle school who represented a variety of cultures, ethnicities, socio-economic classes, and abilities. They wrote response papers and stories focused on Native American topics and conferred with their partners regarding the social, cultural, and political aspects of language and power in the representation of Native Americans in their stories and response papers. Students recorded their conference responses on the peer conference sheets, and wrote final drafts of their stories and response papers. Analysis of 20 peer conferences involved thematic and critical discourse microanalysis of student talk and critical discourse microanalysis of student final drafts and revisions of their writing. The critical discourse microanalysis was based on Fairclough's (1992) approach to discourse analysis. The evidence demonstrates that when students became critical language analysts, providing an alternative frame in which to understand seemingly naturalistic ideologies within a text, they were aware of the relationships between language and power. This position was facilitated through discourses and ideologies presented in the revised curriculum, which assisted them in identifying and analyzing the social, cultural, and political aspects of language.
This curriculum included the revised peer conference sheet, history curriculum, and personal experiences. In taking up the critical language analyst subject position, students worked toward a critical and complex understanding of language and power not provided by traditional peer conferencing theory and practice. In doing so, students created a more democratic classroom in which they realized their power and authority to promote social change through language
Context-Aware and Secure Workflow Systems
Businesses do evolve. Their evolution necessitates the re-engineering of their existing "business processes", with the objectives of reducing costs, delivering services on time, and enhancing profitability in a competitive market. This is generally true, and particularly so in domains such as manufacturing, pharmaceuticals and education. The central objective of workflow technologies is to separate business policies (which are normally encoded in business logic) from the underlying business applications. Such a separation is desirable as it improves the evolution of business processes and, more often than not, facilitates re-engineering at the organisation level without the need for detailed knowledge or analysis of the applications themselves. Workflow systems are currently used by many organisations with a wide range of interests and specialisations in many domains. These include, but are not limited to, office automation, the finance and banking sector, health-care, art, telecommunications, manufacturing and education. We take the view that a workflow is a set of "activities", each of which performs a piece of functionality within a given "context" and may be constrained by some security requirements. These activities are coordinated to collectively achieve a required business objective. The specification of such coordination is presented as a set of "execution constraints" which include parallelisation (concurrency/distribution), serialisation, restriction, alternation, compensation and so on. Activities within workflows could be carried out by humans, various software-based application programs, or processing entities according to the organisational rules, such as meeting deadlines or performance improvement. Workflow execution can involve a large number of different participants, services and devices which may cross the boundaries of various organisations and access a variety of data.
This raises the importance of
_ context variations and context-awareness and
_ security (e.g. access control and privacy).
The specification of precise rules, which prevent unauthorised participants from executing sensitive tasks and also prevent tasks from accessing unauthorised services or (commercially) sensitive information, is crucially important. For example, medical scenarios will require that:
_ only authorised doctors are permitted to perform certain tasks,
_ a patient's medical records may not be accessed by anyone without the patient's consent, and
_ only specific machines are used to perform given tasks at a given time.
If a workflow execution cannot guarantee these requirements, then the flow will be rejected. Furthermore, security requirements are characteristically temporal and/or event-related. However, most of the existing models are of a static nature; for example, it is hard, if not impossible, to express security requirements which are:
_ time-dependent (e.g. a customer is allowed to be overdrawn by 100 pounds only up to the first week of every month), and
_ event-dependent (e.g. a bank account can only be manipulated by its owner unless there is a change in the law or until six months after his/her death).
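The time-dependent rule above can be sketched as a simple guard (hypothetical names; this is not CS-Flow syntax, and the rule encoding is purely illustrative):

```python
from datetime import date

# Illustrative sketch of a time-dependent security rule: an overdraft
# of up to 100 pounds is permitted only during the first week of the
# month.  The same request is allowed or denied depending solely on
# when it is made.

OVERDRAFT_LIMIT = 100

def withdrawal_allowed(balance, amount, when):
    """Check a withdrawal against the time-dependent overdraft rule."""
    new_balance = balance - amount
    if new_balance >= 0:
        return True                   # no overdraft needed
    in_first_week = when.day <= 7     # the time-dependent condition
    return in_first_week and new_balance >= -OVERDRAFT_LIMIT

# Same request, different dates: only the timing differs.
withdrawal_allowed(50, 120, date(2024, 5, 3))   # within the first week
withdrawal_allowed(50, 120, date(2024, 5, 15))  # outside it: denied
```

A static access-control model has no place to express the `when.day <= 7` condition, which is exactly the gap the abstract identifies.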
Currently, there is no commonly accepted model for secure and context-aware workflows or even a common agreement on which features a workflow security model should support. We have developed a novel approach to design, analyse and validate workflows. The approach has the following components:
= A modelling/design language (known as CS-Flow).
The language has the following features:
– supports concurrency;
– context and context-awareness are first-class citizens;
– supports mobility, as activities can move from one context to another;
– has the ability to express timing constraints: delay, deadlines, priority and schedulability;
– allows the expression of security policies (e.g. access control and privacy) without the need for extra linguistic complexities; and
– enjoys a sound formal semantics that allows us to animate and compare various designs.
= An approach known as communication-closed layers is developed, which allows us to serialise a highly distributed workflow to produce a semantically equivalent quasi-sequential flow that is easier to understand and analyse. Such restructuring gives us a mechanism to design fault-tolerant workflows, as layers are atomic activities and various existing forward and backward error recovery techniques can be deployed.
= A reduction semantics is provided for CS-Flow that allows us to build tool support to animate specifications and designs. This has been evaluated on a healthcare scenario, namely the Context Aware Ward (CAW) system. Healthcare provides huge numbers of business workflows, which will benefit from workflow adaptation and support through pervasive computing systems. The evaluation takes two complementary strands:
– providing CS-Flow models and specifications, and
– formally verifying time-critical components of a workflow
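The communication-closed-layer idea of serialising a distributed workflow into a quasi-sequential flow can be sketched, in spirit only, as layering a dependency graph: activities within one layer are independent and may run concurrently, while the layers themselves execute one after another (hypothetical activity names; this is not CS-Flow):

```python
# Illustrative sketch: partition a workflow's activities into layers so
# that every activity in layer i depends only on activities in earlier
# layers.  Each layer can then be treated as an atomic unit, which is
# what makes layer-by-layer error recovery possible.

def layer_workflow(deps):
    """Partition activities into sequential layers.

    deps maps each activity to the set of activities it must wait for.
    Raises ValueError if the dependency graph is cyclic.
    """
    remaining = {a: set(d) for a, d in deps.items()}
    layers = []
    while remaining:
        # Activities whose dependencies have all been scheduled already.
        ready = sorted(a for a, d in remaining.items() if not d)
        if not ready:
            raise ValueError("cyclic workflow")
        layers.append(ready)
        for a in ready:
            del remaining[a]
        for d in remaining.values():
            d.difference_update(ready)
    return layers

# A toy ward workflow: sampling and record-keeping are independent and
# share a layer; the notification waits for both branches.
deps = {
    "admit_patient": set(),
    "take_sample":   {"admit_patient"},
    "update_record": {"admit_patient"},
    "run_lab_test":  {"take_sample"},
    "notify_doctor": {"run_lab_test", "update_record"},
}
```

The resulting quasi-sequential flow preserves the original ordering constraints while being far easier to inspect than the fully distributed execution.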
OPTIMIZATION OF NONSTANDARD REASONING SERVICES
The increasing adoption of semantic technologies and the corresponding increasing complexity of application requirements are motivating extensions to the standard reasoning paradigms and services supported by such technologies. This thesis focuses on two such extensions: nonmonotonic reasoning and inference-proof access control.
Expressing knowledge via general rules that admit exceptions is an approach that has been commonly adopted for centuries in areas such as law and science, and more recently in object-oriented programming and computer security. The experiences in developing complex biomedical knowledge bases reported in the literature show that direct support for defeasible properties and exceptions would be of great help.
On the other hand, there is ample evidence of the need for knowledge confidentiality measures. Ontology languages and Linked Open Data are increasingly being used to encode the private knowledge of companies and public organizations. Semantic Web techniques facilitate merging different sources of knowledge and extracting implicit information, thereby putting security and the privacy of individuals at risk. But the same reasoning capabilities can be exploited to protect the confidentiality of knowledge.
Both nonmonotonic inference and secure knowledge base access rely on nonstandard reasoning procedures. The design and realization of these algorithms in a scalable way (appropriate to the ever-increasing size of ontologies and knowledge bases) is carried out by means of a diversified range of optimization techniques such as appropriate module extraction and incremental reasoning. Extensive experimental evaluation shows the efficiency of the developed optimization techniques: (i) for the first time performance compatible with real-time reasoning is obtained for large nonmonotonic ontologies, while (ii) the secure ontology access control proves to be already compatible with practical use in the e-health application scenario.
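A toy sketch of the defeasible reasoning described above (illustrative only; not the algorithms or ontology languages developed in the thesis): a more specific rule overrides a more general one, so adding knowledge can retract an earlier conclusion, which is the hallmark of nonmonotonic inference:

```python
# Illustrative sketch of defeasible rules with exceptions.  A rule is a
# pair (premises, conclusion) and applies when all its premises hold.
# Specificity is measured here simply as the number of premises, so an
# exception (more premises) overrides the general rule it refines.

def infer(facts, rules):
    """Return the conclusion of the most specific applicable rule."""
    applicable = [(prem, concl) for prem, concl in rules
                  if prem <= facts]
    if not applicable:
        return None
    prem, concl = max(applicable, key=lambda r: len(r[0]))
    return concl

RULES = [
    (frozenset({"bird"}), "flies"),                    # general rule
    (frozenset({"bird", "penguin"}), "does not fly"),  # its exception
]
```

Learning that a bird is a penguin withdraws the earlier conclusion that it flies; classical monotonic logic cannot model this, which is why nonstandard reasoning procedures, and the optimizations the thesis develops for them, are needed.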
Exploring Adolescent Student Perceptions and Experiences of Educational Care
Despite the presence of teacher caring intentions, too many students in North American schools do not experience successfully communicated care from their teachers. This study explores adolescent student perceptions and experiences of their teachers’ intended communication of care, seeking to better understand and explain educational care. The results of this study provide insights that could help teachers more successfully communicate their intended care to their students, leading to the development of caring teacher-student relationships. This study is a qualitative research design that used a constructivist grounded theory research methodology (Charmaz, 2006, 2014). The study employed unstructured interviews, working with young adult participants who reflected on their perceptions and experiences of educational care while they were in middle school and high school. The study drew on constructivist grounded theory analysis approaches and processes in order to analyze the data, resulting in important descriptions and explanations. The study generated six primary results: (1) a rearticulation of the problem of care in education as the disconnect between teacher caring intentions and student experiences of educational care; (2) a recognition that the problem of educational care is the failure to differentiate between communicating intended care and the completion of successfully communicating care (a process that includes the response of the cared-for); (3) a description of the successful communication of care, which includes three distinct categories or dimensions and a number of related sub-categories, or elements; (4) a grounded theory of the intended communication of educational care; (5) a description of the student’s role in the development of a caring teacher-student relationship; and (6) a theoretical explanation of the development of a caring teacher-student relationship.
The results of the study provide important insights into how educational care is successfully communicated and how caring teacher-student relationships can be developed. These results have implications for in-service and pre-service teachers, providing them with knowledge about the nature and communication of educational care. The results also provide guidelines and resources that can help teachers to communicate care more effectively and successfully