
    Collaborative Creation of Teaching-Learning Sequences and an Atlas of Knowledge

    The article is about a new online resource, a collaborative portal for teachers, which publishes a network of prerequisites for teaching/learning any concept or activity. A simple and effective method of collaboratively constructing teaching-learning sequences is presented. The special emergent properties of the dependency network and their didactic and epistemic implications are pointed out. The article ends with an appeal to the global teaching community to contribute prerequisites of any subject to complete the global roadmap for an atlas being built along similar lines to Wikipedia. The portal is launched and awaiting community participation at http://www.gnowledge.org.
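
    To make the idea concrete, here is a minimal sketch of how a prerequisite network can yield a teaching-learning sequence; the concepts and dependencies are invented for illustration and do not reflect the portal's actual data model.

```python
# A topological sort of a prerequisite network gives one valid
# teaching-learning sequence: every prerequisite is taught before
# the concepts that depend on it. The concepts here are hypothetical.
from graphlib import TopologicalSorter  # Python 3.9+

# Each concept maps to the set of concepts that must be taught first.
prerequisites = {
    "fractions": {"division"},
    "division": {"multiplication"},
    "multiplication": {"addition"},
    "addition": set(),
}

sequence = list(TopologicalSorter(prerequisites).static_order())
print(sequence)  # ['addition', 'multiplication', 'division', 'fractions']
```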

    Comparing Visual and Statistical Analysis in Single-Case Studies Using Published Studies

    Little is known about the extent to which interrupted time-series analysis (ITSA) can be applied to short, single-case study designs and whether those applications produce results consistent with visual analysis (VA). This paper examines the extent to which ITSA can be applied to single-case study designs and compares the results based on two methods, ITSA and VA, using papers published in the Journal of Applied Behavior Analysis in 2010. The study was made possible by the development of software called UnGraph®, which facilitates the recovery of raw data from graphs. ITSA was successfully applied to 94% of the examined graphs, with the number of observations ranging from 8 to 136. Moderate to high lag 1 autocorrelations (> .50) were found for 46% of the data series. Effect sizes similar to group-level Cohen’s d were identified based on the tertile distribution: effects ranging from 0.00 to 0.99 were classified as small, those ranging from 1.00 to 2.49 as medium, and effect sizes of 2.50 or greater as large. Comparison of the conclusions from VA and ITSA showed a low level of agreement (Kappa = .14, accounting for the agreement expected by chance). The results demonstrate that ITSA can be broadly implemented in applied behavior analysis research. These two methods should be viewed as complementary and used concurrently.
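
    As a rough illustration of two of the quantities reported above, the sketch below computes a lag-1 autocorrelation and maps an effect size onto the paper's tertile labels; the data series is made up, and the shifted-correlation estimator is a common simplification rather than the authors' exact procedure.

```python
# Hedged sketch: lag-1 autocorrelation via a shifted Pearson correlation,
# plus the paper's tertile effect-size labels (small < 1.00,
# medium < 2.50, large >= 2.50). The eight-point series is hypothetical.
import numpy as np

def lag1_autocorrelation(series):
    """Correlate the series with itself shifted by one observation."""
    x = np.asarray(series, dtype=float)
    return np.corrcoef(x[:-1], x[1:])[0, 1]

def classify_effect(d):
    """Map a Cohen's-d-like effect size onto the tertile-based labels."""
    if d < 1.00:
        return "small"
    if d < 2.50:
        return "medium"
    return "large"

data = [3, 4, 4, 5, 7, 8, 8, 9]
print(round(lag1_autocorrelation(data), 2))
print(classify_effect(1.7))  # 'medium'
```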

    Extending Morris Method: identification of the interaction graph using cycle-equitable designs

    The paper presents designs that allow detection of mixed effects when performing preliminary screening of the inputs of a scalar function of d input factors, in the spirit of Morris' Elementary Effects approach. We introduce the class of (d,c)-cycle equitable designs as those that enable computation of exactly c second-order effects on all possible pairs of input factors. Using these designs, we propose a fast Mixed Effects screening method that enables efficient identification of the interaction graph of the input variables. Design definition is formally supported by the establishment of an isometry between sub-graphs of the unit cube Q_d equipped with the Manhattan metric, and a set of polynomials in (X_1, …, X_d) on which a convenient inner product is defined. In the paper we present systems of equations that recursively define these (d,c)-cycle equitable designs for generic values of c ≥ 1, from which direct algorithmic implementations are derived. Application cases are presented, illustrating the use of the proposed designs for estimating the interaction graph of specific functions.
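
    For readers unfamiliar with the baseline being extended, the sketch below implements plain Morris Elementary Effects screening with random one-at-a-time steps; the test function is invented, and the paper's (d,c)-cycle equitable designs are not reproduced here.

```python
# Morris-style screening: average the absolute one-at-a-time finite
# differences of f over r random base points in the unit cube. Inputs
# with large mean effects are flagged as influential. The function f
# below is a hypothetical example with one interaction (x1 * x2).
import numpy as np

def elementary_effects(f, d, r=50, delta=0.1, seed=0):
    """Mean absolute elementary effect of each of the d input factors."""
    rng = np.random.default_rng(seed)
    effects = np.zeros(d)
    for _ in range(r):
        x = rng.uniform(0, 1 - delta, size=d)  # random base point
        fx = f(x)
        for i in range(d):
            x_step = x.copy()
            x_step[i] += delta                  # perturb one factor
            effects[i] += abs(f(x_step) - fx) / delta
    return effects / r

f = lambda x: 4 * x[0] + x[1] * x[2]
print(np.round(elementary_effects(f, d=3), 2))  # x0 dominates; x1, x2 interact
```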

    Recognising object-oriented software design quality: a practitioner-based questionnaire survey

    Design quality is vital if software is to be maintainable. What practices do developers actually use to achieve design quality in their day-to-day work, and which of these do they find most useful? To discover the extent to which practitioners concern themselves with object-oriented design quality and the approaches used when determining quality in practice, a questionnaire survey of 102 software practitioners was used; approximately half were from the UK, with the remainder from elsewhere around the world. Individual and peer experience are major contributors to design quality. Classic design guidelines, well-known lower-level practices, tools and metrics can all also contribute positively to design quality. There is a potential relationship between testing practices and design quality. Inexperience, time pressures, novel problems, novel technology, and imprecise or changing requirements may have a negative impact on quality. Respondents with most experience are more confident in their design decisions, place more value on reviews by team leads, and are more likely to rate design quality as very important. For practitioners, these results identify the techniques and tools that other practitioners find effective. For researchers, the results highlight a need for more work investigating the role of experience in the design process and the contribution experience makes to quality. There is also the potential for more in-depth studies of how practitioners are actually using design guidance, including Clean Code. Lastly, the potential relationship between testing practices and design quality merits further investigation.

    Evaluating practitioner cyber-security attack graph configuration preferences

    Attack graphs and attack trees are a popular method of mathematically and visually representing the sequence of events that leads to a successful cyber-attack. Despite their popularity, there is no standardised attack graph or attack tree visual syntax configuration, and more than seventy self-nominated attack graph and twenty attack tree configurations have been described in the literature, each of which presents attributes such as preconditions and exploits in a different way. This research proposes a practitioner-preferred attack graph visual syntax configuration which can be used to effectively present cyber-attacks. Comprehensive data on participant (n = 212) preferences was obtained through a choice-based conjoint design in which participants scored attack graph configurations based on their visual syntax preferences. Data was obtained from multiple participant groups, which included lecturers, students and industry practitioners with cyber-security-specific or general computer science backgrounds. The overall analysis recommends a winning representation with the following attributes: the flow of events is represented top-down as in a flow diagram (as opposed to a fault tree or attack tree, where it is presented bottom-up); preconditions, the conditions required for a successful exploit, are represented as ellipses; and exploits are represented as rectangles. These results were consistent across the multiple groups and across scenarios which differed according to their attack complexity. The research tested a number of bottom-up approaches similar to those used in attack trees. The bottom-up designs received the lowest practitioner preference score, indicating that attack trees, which also utilise the bottom-up method, are not a preferred design amongst practitioners when presented with an alternative top-down design. Practitioner preferences are important for any method or framework to become accepted, and this is the first time that an attack modelling technique has been developed and tested for practitioner preferences.
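
    The winning configuration is easy to reproduce with standard graph tooling; the sketch below renders a toy attack graph in that style using the Python graphviz package (the attack steps themselves are invented, and the Graphviz system binaries must be installed for rendering).

```python
# Practitioner-preferred syntax from the study: top-down flow,
# preconditions as ellipses, exploits as rectangles. The nodes and
# edges below are hypothetical.
from graphviz import Digraph

g = Digraph("attack", graph_attr={"rankdir": "TB"})  # TB = top-down flow

# Preconditions as ellipses.
g.node("p1", "network access", shape="ellipse")
g.node("p2", "unpatched service", shape="ellipse")
g.node("p3", "user credentials", shape="ellipse")

# Exploits as rectangles.
g.node("e1", "exploit service", shape="box")
g.node("e2", "escalate privileges", shape="box")

g.edges([("p1", "e1"), ("p2", "e1"), ("e1", "p3"), ("p3", "e2")])
g.render("attack_graph", format="png")  # writes attack_graph.png
```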

    An Ontological Representation of the Characteristic Problems of Real-Time Systems

    Software Architectural Assessment is becoming a key discipline for identifying, at early stages of system synthesis, the most important problems that may become relevant in operation. This matter is especially critical for systems with real-time constraints, and special emphasis must be placed on concurrency issues. Typical RTOS mechanisms supporting concurrency, such as semaphores or monitors, often lead to execution-time penalties that are hard to identify, reproduce and solve. For this reason it is crucial to understand the root causes of these problems and to provide support for identifying and mitigating them at early stages of the system lifecycle. The main objective of this paper is to propose a new classification of the most important problems related to real-time software systems and to provide mechanisms and guidelines to help engineers improve their architectural designs. The taxonomy has been applied to a particular architectural style (UML-PPOOA) and is used as a reference to create a new assessment module on the PPOOA-Visio CASE tool [15] to support the detection of concurrency problems.
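
    To make the class of problems concrete, the sketch below shows one classic hazard such a taxonomy would cover, a lock-ordering deadlock; the worker functions are invented, and the paper's UML-PPOOA notation is not reproduced.

```python
# Two threads acquiring the same pair of locks in opposite orders can
# deadlock: each holds one lock and waits forever for the other. This
# is exactly the kind of hard-to-reproduce concurrency problem an
# architectural assessment aims to flag early.
import threading

lock_a, lock_b = threading.Lock(), threading.Lock()

def worker_1():
    with lock_a:      # takes A, then B
        with lock_b:
            pass

def worker_2():
    with lock_b:      # takes B, then A: potential deadlock with worker_1
        with lock_a:
            pass

# The standard architectural remedy: impose a global lock order so that
# every thread acquires lock_a before lock_b.
```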

    Determining the Limits of Automated Program Recognition

    This working paper was submitted as a Ph.D. thesis proposal. Program recognition is a program understanding technique in which stereotypic computational structures are identified in a program. From this identification and the known relationships between the structures, a hierarchical description of the program's design is recovered. The feasibility of this technique for small programs has been shown by several researchers. However, it seems unlikely that the existing program recognition systems will scale up to realistic, full-sized programs without some guidance (e.g., from a person using the recognition system as an assistant). One reason is that there are limits to what can be recovered by a purely code-driven approach. Some of the information about the program that is useful to know for common software engineering tasks, particularly maintenance, is missing from the code. Another reason guidance must be provided is to reduce the cost of recognition. To determine what guidance is appropriate, therefore, we must know what information is recoverable from the code and where the complexity of program recognition lies. I propose to study the limits of program recognition, both empirically and analytically. First, I will build an experimental system that performs recognition on realistic programs on the order of thousands of lines. This will allow me to characterize the information that can be recovered by this code-driven technique. Second, I will formally analyze the complexity of the recognition process. This will help determine how guidance can be applied most profitably to improve the efficiency of program recognition.
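
    As a toy version of the code-driven technique described here, the sketch below recognises one stereotypic structure, a summing-accumulation loop, in Python source via the standard ast module; a real recognition system matches whole libraries of such plans, and this single pattern is only a stand-in.

```python
# Detect the "accumulate by addition inside a loop" cliché: a for-loop
# body that rebinds a name to itself plus something else. The SOURCE
# snippet is a hypothetical program fragment.
import ast

SOURCE = """
total = 0
for item in values:
    total = total + item
"""

def finds_accumulation(tree):
    """Return True if any for-loop body contains the accumulation cliché."""
    for loop in (n for n in ast.walk(tree) if isinstance(n, ast.For)):
        for stmt in loop.body:
            if (isinstance(stmt, ast.Assign)
                    and isinstance(stmt.targets[0], ast.Name)
                    and isinstance(stmt.value, ast.BinOp)
                    and isinstance(stmt.value.op, ast.Add)
                    and isinstance(stmt.value.left, ast.Name)
                    and stmt.value.left.id == stmt.targets[0].id):
                return True
    return False

print(finds_accumulation(ast.parse(SOURCE)))  # True
```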

    Implementation of Slide Lock Design Process at CIS

    A design process for linear slide mechanisms was developed and synthesized into a manual to be used as a reference by design engineers at Central Industrial Supply. The usage of conceptual design software was incorporated into the design process to enhance creativity and reduce lead time. The created design process manual was used to create and prototype a new rear lock mechanism that reduced push/pull force by 68%, while passing Central Industrial Supply's standardized durability test.

    A hybrid algorithm for Bayesian network structure learning with application to multi-label learning

    We present a novel hybrid algorithm for Bayesian network structure learning, called H2PC. It first reconstructs the skeleton of a Bayesian network and then performs a Bayesian-scoring greedy hill-climbing search to orient the edges. The algorithm is based on divide-and-conquer constraint-based subroutines to learn the local structure around a target variable. We conduct two series of experimental comparisons of H2PC against Max-Min Hill-Climbing (MMHC), which is currently the most powerful state-of-the-art algorithm for Bayesian network structure learning. First, we use eight well-known Bayesian network benchmarks with various data sizes to assess the quality of the learned structure returned by the algorithms. Our extensive experiments show that H2PC outperforms MMHC in terms of goodness of fit to new data and quality of the network structure with respect to the true dependence structure of the data. Second, we investigate H2PC's ability to solve the multi-label learning problem. We provide theoretical results to characterize and identify graphically the so-called minimal label powersets that appear as irreducible factors in the joint distribution under the faithfulness condition. The multi-label learning problem is then decomposed into a series of multi-class classification problems, where each multi-class variable encodes a label powerset. H2PC is shown to compare favorably to MMHC in terms of global classification accuracy over ten multi-label data sets covering different application domains. Overall, our experiments support the conclusion that local structural learning with H2PC, in the form of local neighborhood induction, is a theoretically well-motivated and empirically effective learning framework that is well suited to multi-label learning. The source code (in R) of H2PC as well as all data sets used for the empirical tests are publicly available.
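
    The hybrid recipe itself (constraint-based skeleton first, score-based orientation second) is easy to sketch; the following toy version uses a mutual-information threshold for the independence tests, a binary-only BIC score, and an exhaustive orientation search, all of which are simplifying stand-ins for H2PC's actual subroutines.

```python
# Phase 1 keeps only edges between variables that look dependent;
# phase 2 greedily scores orientations of the surviving edges.
# Data, threshold and score are illustrative assumptions.
import itertools
import numpy as np

def mutual_information(x, y):
    """Empirical mutual information (nats) of two discrete variables."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for a, b in zip(x, y):
        joint[a, b] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def node_bic(node, parents, data):
    """BIC contribution of one binary node given a candidate parent set."""
    n = len(data)
    keys = [tuple(row) for row in data[:, parents]] if parents else [()] * n
    ll, n_params = 0.0, 0
    for key in set(keys):
        rows = data[np.array([k == key for k in keys]), node]
        for v in (0, 1):
            count = int((rows == v).sum())
            if count:
                ll += count * np.log(count / len(rows))
        n_params += 1
    return ll - 0.5 * np.log(n) * n_params

# Toy data: b copies a 90% of the time, c is independent noise.
rng = np.random.default_rng(0)
a = rng.integers(0, 2, 500)
b = np.where(rng.random(500) < 0.9, a, 1 - a)
c = rng.integers(0, 2, 500)
data = np.column_stack([a, b, c])

# Phase 1 (constraint-based): the skeleton.
skeleton = [(i, j) for i, j in itertools.combinations(range(3), 2)
            if mutual_information(data[:, i], data[:, j]) > 0.05]

# Phase 2 (score-based): best orientation of the skeleton edges
# (acyclicity checking omitted for brevity).
candidates = itertools.product(*([(i, j), (j, i)] for (i, j) in skeleton))
best = max(candidates,
           key=lambda dag: sum(node_bic(v, [u for (u, w) in dag if w == v], data)
                               for v in range(3)))
print("skeleton:", skeleton)
print("oriented:", best)
```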