52 research outputs found
Introduction to Runtime Verification
The aim of this chapter is to act as a primer for those wanting to learn about Runtime Verification (RV). We start by providing an overview of the main specification languages used for RV. We then introduce the standard terminology necessary to describe the monitoring problem, covering the pragmatic issues of monitoring and instrumentation, and discussing the monitorability problem extensively.
Optimising routing and trustworthiness of ad hoc networks using swarm intelligence
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. This thesis proposes different approaches to address the routing and security of MANETs using swarm technology. The mobility and infrastructure-less nature of MANETs, as well as node misbehaviour, pose great challenges to the routing and security protocols of such networks. The first approach addresses the problem of channel assignment in multichannel ad hoc networks with a limited number of interfaces, where stable routes are preferred for selection. The channel selection is based on the link quality between nodes. Geographical information is used with a mapping algorithm to estimate and predict link quality and route lifetime, and this is combined with the Ant Colony Optimization (ACO) algorithm to find the most stable route with a high data rate. As a result, better utilisation of the channels is achieved, with throughput increasing by up to 74% over the ASAR protocol. A new smart data packet routing protocol is then developed based on the River Formation Dynamics (RFD) algorithm. The RFD algorithm is a subset of swarm intelligence that mimics how rivers are created in nature. The protocol is a distributed swarm learning approach in which data packets are smart enough to guide themselves through the best available route in the network. The learning information is distributed throughout the nodes of the network and can be used and updated by successive data packets in order to maintain and find better routes. Data packets act like swarm agents (drops): they carry their own path information and update routing information without the need for backward agents, modifying the routing information based on different network metrics. As a result, data packets can guide themselves through better routes.
In the second approach, a hybrid ACO and RFD smart data packet routing protocol is developed, in which the protocol tries to find the shortest, least congested path to the destination. Simulation results show a throughput improvement of 30% over the AODV protocol and 13% over AntHocNet, while both delay and jitter improve by more than 96% over AODV. To overcome the source routing problem introduced by the use of the ACO algorithm, a solely RFD-based distance vector protocol is developed as a third approach. Moreover, this protocol separates reactively learned information from proactively learned information to add more reliability to data routing. To minimise the power consumption introduced by the hybrid nature of the RFD routing protocol, a fourth approach is developed; this protocol tackles the problem of power consumption and adds packet delivery power minimisation to the RFD-based protocol.
Finally, a security model based on reputation and trust is added to the smart data packet protocol in order to detect misbehaving nodes. A trust system is built on a property of the RFD algorithm, namely that drops always move from a higher altitude to a lower one. Moreover, the distributed and infrastructure-less nature of the ad hoc network forces nodes to commit to cooperative behaviour in order not to be exposed. This system can easily and quickly detect misbehaving nodes according to the altitude difference between active intermediate nodes.
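The RFD routing idea described above can be sketched in a few lines. This is a minimal illustration under a simplified model, not the thesis's protocol: every node holds an "altitude", the destination sits at altitude 0, and each data packet moves to its lowest-altitude neighbour while eroding (lowering) the altitudes along its path so that later packets are drawn toward proven routes. All names and constants here are hypothetical.

```python
EROSION = 0.1  # fraction of the downhill gradient removed per visit (assumed)

def route_packet(graph, altitude, src, dst, max_hops=50):
    """Forward one smart packet from src to dst, eroding altitudes as it goes."""
    path = [src]
    node = src
    for _ in range(max_hops):
        if node == dst:
            return path
        # steepest descent: pick the neighbour with the lowest altitude
        nxt = min(graph[node], key=lambda n: altitude[n])
        if altitude[nxt] >= altitude[node]:
            return None  # stuck in a local minimum; a real protocol would backtrack
        # erosion: deepen this link so future packets prefer it
        altitude[node] -= EROSION * (altitude[node] - altitude[nxt])
        node = nxt
        path.append(node)
    return None

# toy 4-node network; D is the destination (altitude 0)
graph = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
altitude = {"A": 3.0, "B": 2.0, "C": 2.5, "D": 0.0}
print(route_packet(graph, altitude, "A", "D"))  # → ['A', 'B', 'D']
```

Note how the packet itself carries the path and updates the shared `altitude` table, so no backward agents are needed; this mirrors the "smart data packet" behaviour the abstract describes, though real protocols must also handle mobility and route maintenance.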
Working within : the pedagogy and practice of technology professional development
Many researchers have been critical of teachers’ failure to implement computer use effectively in the classroom. In order to question the role that pedagogical issues may play in the success of the implementation process, this study looks at the beliefs of professional developers who are responsible for helping K-12 teachers learn to teach with computers. Five professional developers from Saskatchewan were asked to describe their professional practice by focusing on what they thought effective use of computers was and how they thought their beliefs affected their practice. The heart of the study was the story of the professional developers’ experiences and the way in which their practices evolved over time to meet the needs they saw. The professional developers were a diverse group of former teachers. They had taught in a wide variety of settings and for varied lengths of time. They were purposefully selected for their involvement in provincial initiatives and for providing professional development around computers in their home divisions. The participants shared their experiences through an informal semi-structured interview and follow-up questions. The transcripts of the conversations comprised the data, and their examples, statements of belief, and experiences formed the basis for the interpretation of the results. The findings revealed that the professional developers identified both first- and second-order barriers to the use of computers in classrooms. Each person described a transition from traditional professional development practice to a personal style with a deliberate added emphasis on pedagogy. They concluded that the current practice of teaching with computers generally did not meet their definition of effective, and emphasized the need to question why computers are being used the way they are. The findings from this study indicate that the professional developers believed their pedagogy and practice as professional developers to be intertwined.
They also confirmed Coopla’s (2004) argument that pedagogy is the critical first element for effective teaching with computers. From the perspective of the participants, pedagogy, not technology, defines how effective the process of integration is in K-12 classrooms.
Computer Science 2019 APR Self-Study & Documents
UNM Computer Science APR self-study report and review team report for Spring 2019, fulfilling the requirements of the Higher Learning Commission.
A system for describing and deciding properties of regular languages using input altering transducers
ii, 94 leaves : ill. ; 29 cm. Includes abstract. Includes bibliographical references (leaves 92-94). We present a formal method for describing and deciding code-related properties of regular languages using input altering transducers. We also provide an implementation of that method in the form of a web application. We introduce the concept of an input altering transducer. We show how to use such transducers to describe properties of languages, and present examples of transducers describing some well-known properties (such as suffix codes, prefix codes, infix codes, solid codes, and others). We discuss some limitations of our method. In particular, all properties that can be described using input altering transducers are 3-independence properties. We also give an example of a 3-independence property that cannot be represented using a transducer. We explain how our method is a specialisation of a more general method based on language in-equations. We also discuss the relation between our method and a method that uses sets of trajectories to describe properties. In particular, we show how, for any given set of trajectories describing some property, to build an input altering transducer describing the same property. We introduce the concept of a counterexample, which is a pair of words that, if a given language does not belong to a given property, illustrates that fact. We show how the extraction of such counterexamples can be incorporated into our method. Finally, we provide some details on the implementation and usage of the web application that was built as a part of this research.
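The transducer-based test described above can be illustrated on its simplest instance. This is a sketch under strong assumptions, not the thesis's implementation: for the prefix-code property, the input altering transducer maps each word to its proper prefixes, and a language L satisfies the property exactly when t(L) and L are disjoint. The sketch works on finite sets of words, whereas the thesis handles regular languages; function names are illustrative.

```python
def proper_prefixes(word):
    """Transducer output for one input word: all of its proper, non-empty prefixes."""
    return {word[:i] for i in range(1, len(word))}

def is_prefix_code(language):
    """L is a prefix code iff t(L) ∩ L = ∅, where t yields proper prefixes.

    Returns (verdict, witnesses); a non-empty witness set plays the role
    of the counterexamples mentioned in the abstract.
    """
    altered = set().union(*(proper_prefixes(w) for w in language)) if language else set()
    witnesses = altered & set(language)
    return (not witnesses, witnesses)

print(is_prefix_code({"00", "01", "10"}))  # → (True, set())
print(is_prefix_code({"0", "01"}))         # → (False, {'0'})
```

Other properties from the abstract fit the same pattern by swapping the transducer: e.g. a suffix-code test would emit proper suffixes instead of proper prefixes, with the emptiness-of-intersection check unchanged.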
Discovery Over Application: A Case Study of Misaligned Incentives in Software Engineering
In this thesis, we present evidence that there is an under-emphasis on the application of software systems in Software Engineering research, affecting the advancement of the field as a whole. Specifically, we perform a case study on KLEE, a tool with over 1000 citations. We made improvements that consisted of fixing performance bugs and implementing optimizations that have become common practice, increasing KLEE's performance by 2-11X. To understand how techniques proposed in the literature would be affected by these improvements, we analyzed 100 papers that cited the original KLEE paper. From this analysis we found two things. First, it is clear that adherence to the principles of replication is lacking; it was often very difficult to understand how a particular study used KLEE, and therefore to understand how our improvements would affect the study. Second, when conservatively estimating how the studies relied on KLEE, we believe that seven of the 21 papers that we investigated could have their conclusions significantly strengthened or weakened. Upon closer investigation, six of these seven papers involved studies that directly compared KLEE or a KLEE-dependent tool to some other tool. The potential for mis-application within these competing techniques makes it difficult to understand which observations are true, a situation that potentially leads to wasted effort and slowed progress. To conclude, we examine several recent proposals to address this under-emphasis, using KLEE as an exemplar to understand their likely effects.
Advisers: Matthew B. Dwyer and Sebastian Elbaum
Selected Analytical Techniques of Solid State, Structure Identification, and Dissolution Testing in Drug Life Cycle
The textbook provides an overview of the main techniques applied in the pharmaceutical industry, with a focus on solid-state analysis. It discusses spectral methods, thermal analysis, and dissolution testing, explains the theoretical background for each method, and shows practical examples from real-life drug-design and quality control applications. The textbook is thus intended for both pharmacy students and early-career professionals.
Interim research assessment 2003-2005 - Computer Science
This report primarily serves as a source of information for the 2007 Interim Research Assessment Committee for Computer Science at the three technical universities in the Netherlands. The report also provides information for others interested in our research activities
The evaluation of ontologies: quality, reuse and social factors
Finding a “good” or the “right” ontology is a growing challenge in the ontology domain, where one of the main aims is to share and reuse existing semantics and knowledge. Before reusing an ontology, knowledge engineers not only have to find a set of appropriate ontologies for their search query, but they should also be able to evaluate those ontologies according to different internal and external criteria. Therefore, ontology evaluation is at the heart of ontology selection and has received a considerable amount of attention in the literature. Despite the importance of ontology evaluation and selection and the widespread research on these topics, there are still many unanswered questions and challenges when it comes to evaluating and selecting ontologies for reuse. Most of the evaluation metrics and frameworks in the literature are based mainly on a limited set of internal characteristics, e.g., the content and structure of ontologies, and ignore how they are used and evaluated by communities. This thesis aimed to investigate the notion of quality and reusability in the ontology domain and to explore and identify the set of metrics that can affect the process of ontology evaluation and selection for reuse. [Continues.]