
    An exploration of the language within Ofsted reports and their influence on primary school performance in mathematics: a mixed methods critical discourse analysis

    This thesis contributes to the understanding of the language of Ofsted reports: their similarity to one another, and the associations between terms used within ‘areas for improvement’ sections and subsequent outcomes for pupils. The research responds to concerns from serving headteachers that Ofsted reports are overly similar, do not capture the unique story of their school, and are unhelpful for improvement. In seeking to answer ‘how similar are Ofsted reports?’, the study uses two tools, plagiarism detection software (Turnitin) and a discourse analysis tool (NVivo), to identify trends within and across a large corpus of reports. The approach is based on critical discourse analysis (Van Dijk, 2009; Fairclough, 1989) but shaped as practitioner enquiry, seeking power in the form of impact on pupils and practitioners rather than the more traditional, sociological application of the method. The research found that in 2017, primary school section 5 Ofsted reports had more than half of their content exactly duplicated within other primary school inspection reports published that same year. Discourse analysis showed that the quality assurance process overrode variables such as inspector designation, gender, or team size, leading to three distinct patterns of duplication: block duplication, self-referencing, and template writing. The most unique part of a report was found to be the ‘area for improvement’ section, which was tracked to externally verified outcomes for pupils using terms linked to ‘mathematics’. Schools required to improve mathematics in their areas for improvement improved progress and attainment in mathematics significantly more than national rates. These findings indicate a positive correlation between the inspection reporting process and a beneficial impact on pupil outcomes in mathematics, and that the significant similarity of one report to another had no bearing on the usefulness of the report for school improvement purposes within this corpus.
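
    The duplication finding above rests on measuring how much text one report shares verbatim with others in a corpus. As a rough illustration only (the thesis used Turnitin, not this code), overlap of word n-gram "shingles" is one standard way to estimate such exact duplication; everything below, including the function names, is an invented stand-in rather than the thesis's pipeline.

```python
# Minimal sketch: estimate how much of one report is exactly duplicated
# elsewhere in a corpus, using word n-gram shingles. Illustrative only.
from typing import List, Set

def shingles(text: str, n: int = 8) -> Set[str]:
    """Return the set of n-word shingles occurring in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def duplication_rate(report: str, corpus: List[str], n: int = 8) -> float:
    """Fraction of a report's shingles that appear verbatim in other reports."""
    own = shingles(report, n)
    if not own:
        return 0.0
    others: Set[str] = set()
    for other in corpus:
        if other is not report:          # skip the report itself
            others |= shingles(other, n)
    return len(own & others) / len(own)

# Hypothetical usage; real reports would be loaded from files.
corpus = ["...full text of report one...", "...full text of report two..."]
print(duplication_rate(corpus[0], corpus))
```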

    Technical Dimensions of Programming Systems

    Programming requires much more than just writing code in a programming language. It is usually done in the context of a stateful environment, by interacting with a system through a graphical user interface. Yet this wide space of possibilities lacks a common structure for navigation. Work on programming systems fails to form a coherent body of research, making it hard to improve on past work and advance the state of the art. In computer science, much has been said and done to allow comparison of programming languages, yet no similar theory exists for programming systems; we believe that programming systems deserve a theory too. We present a framework of technical dimensions which capture the underlying characteristics of programming systems and provide a means for conceptualizing and comparing them. We identify technical dimensions by examining past influential programming systems and reviewing their design principles, technical capabilities, and styles of user interaction. Technical dimensions capture characteristics that may be studied, compared, and advanced independently. This makes it possible to talk about programming systems in a way that can be shared and constructively debated, rather than relying solely on personal impressions. Our framework is derived using a qualitative analysis of past programming systems. We outline two concrete ways of using our framework. First, we show how it can be used to analyze a recently developed novel programming system. Then, we use it to identify an interesting unexplored point in the design space of programming systems. Much research effort focuses on building programming systems that are easier to use, accessible to non-experts, moldable, and/or powerful, but such efforts are disconnected. They are informal, guided by the personal vision of their authors, and thus are only evaluable and comparable on the basis of individual experience using them. By providing foundations for more systematic research, we can help programming systems researchers to stand, at last, on the shoulders of giants.
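
    One way to picture such a framework in use: a system's position along each dimension can be recorded as structured data and compared field by field. The sketch below is purely illustrative; the dimension names and values are invented placeholders, not the paper's actual catalogue of dimensions.

```python
# Illustrative sketch only: encoding "technical dimensions" as a record so
# two programming systems can be compared side by side. Dimension names are
# invented placeholders, not the paper's catalogue.
from dataclasses import dataclass, fields

@dataclass
class Dimensions:
    feedback_loop: str   # e.g. "live" vs "batch"
    notation: str        # e.g. "textual", "visual", "mixed"
    state_model: str     # e.g. "stateful image" vs "ephemeral process"

def compare(name_a: str, a: Dimensions, name_b: str, b: Dimensions) -> None:
    """Print where two systems agree and differ, dimension by dimension."""
    for f in fields(Dimensions):
        va, vb = getattr(a, f.name), getattr(b, f.name)
        marker = "same" if va == vb else "DIFFERS"
        print(f"{f.name:15} {name_a}: {va:20} {name_b}: {vb:20} [{marker}]")

compare("Smalltalk", Dimensions("live", "textual", "stateful image"),
        "Unix+C", Dimensions("batch", "textual", "ephemeral process"))
```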

    Self-Supervised Learning to Prove Equivalence Between Straight-Line Programs via Rewrite Rules

    We target the problem of automatically synthesizing proofs of semantic equivalence between two programs made of sequences of statements. We represent programs using abstract syntax trees (ASTs), where a given set of semantics-preserving rewrite rules can be applied to a specific AST pattern to generate a transformed and semantically equivalent program. In our system, two programs are equivalent if there exists a sequence of applications of these rewrite rules that rewrites one program into the other. We propose a neural network architecture based on a transformer model to generate proofs of equivalence between program pairs. The system outputs a sequence of rewrites, and the validity of the sequence is checked simply by verifying that it can be applied. If no valid sequence is produced by the neural network, the system reports the programs as non-equivalent, ensuring by design that no programs can be incorrectly reported as equivalent. Our system is fully implemented for a given grammar which can represent straight-line programs with function calls and multiple types. To efficiently train the system to generate such sequences, we develop an original incremental training technique, named self-supervised sample selection. We extensively study the effectiveness of this novel training approach on proofs of increasing complexity and length. Our system, S4Eq, achieves 97% proof success on a curated dataset of 10,000 pairs of equivalent programs.
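
    The safety guarantee described above comes from the checking step, not from the neural network: a proposed proof is accepted only if every rewrite in the sequence actually applies and the final term equals the target. The toy sketch below illustrates that check with an invented term encoding and a single rewrite rule applied at the root only; the real system matches rule patterns at arbitrary AST positions.

```python
# Toy sketch of the validity check: a proof is a sequence of rewrite rules,
# accepted only if each rule applies in turn and the final term equals the
# target. Term encoding and the rule set are invented for illustration.
from typing import Callable, List, Optional, Tuple

Term = Tuple  # tiny AST: ("add", x, y), ("var", name), ...

def comm_add(t: Term) -> Optional[Term]:
    """x + y -> y + x (one example of a semantics-preserving rewrite)."""
    if t[0] == "add":
        return ("add", t[2], t[1])
    return None  # rule does not apply to this term

def check_proof(src: Term, dst: Term,
                proof: List[Callable[[Term], Optional[Term]]]) -> bool:
    """Apply each proposed rewrite in order; reject if any fails to apply.
    By construction, no inequivalent pair can be accepted."""
    cur = src
    for rule in proof:
        nxt = rule(cur)
        if nxt is None:
            return False   # invalid proof: a step did not apply
        cur = nxt
    return cur == dst

a = ("add", ("var", "x"), ("var", "y"))
b = ("add", ("var", "y"), ("var", "x"))
print(check_proof(a, b, [comm_add]))   # True: one commutativity step
```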

    Constructing a Theological Framework That Revitalizes the Missional Nature of Churches of Christ in South Australia

    This thesis addresses the need for a theological framework that revitalizes the missional nature of Churches of Christ in South Australia. The problem identified within this ministry context was a lack of clear theological principles that informed a common understanding of identity for missional engagement. The purpose of the project was to create a study guide that informs common theological commitments and grounds congregations for missional vitality. A research and development team made up of seven Church of Christ ministers from different backgrounds was assembled to design a curriculum that addressed the problem. Through eight two-hour sessions over four months in the first half of 2022, the team discussed a theological framework that could revitalize mission. This was informed by a Trinitarian theological rationale introduced as perichoresis. The conceptual framework for discussions included (1) the historical and theological foundations of Churches of Christ, (2) a Trinitarian doctrine of God presented as perichoresis, (3) contemporary congregational practices, and (4) a theological proposal for re-imagining mission. The team developed a study guide that promotes a dynamic theological framework for practicing theology and revitalizing the missional nature of the church. The artifact, Movement & Identity: Participating in the Life of God’s Mission, was evaluated by the team and members of Church of Christ congregations in South Australia. The curriculum is designed to assist participants with practical theological interpretation through (1) discovering new ideas about God in the context of Churches of Christ traditions, (2) engaging with contextual theology in community, (3) participating in God’s mission, and (4) reflecting on how God’s agency transforms the church. The development of the study guide will stimulate a practical theological framework that promotes dynamic theological dialogue and missional vitality for Churches of Christ in South Australia.

    Strategies for Early Learners

    Welcome to learning about how to effectively plan curriculum for young children. This textbook will address:
    • Developing curriculum through the planning cycle
    • Theories that inform what we know about how children learn and the best ways for teachers to support learning
    • The three components of developmentally appropriate practice
    • The importance and value of play and intentional teaching
    • Different models of curriculum
    • The process of lesson planning (documenting planned experiences for children)
    • Physical, temporal, and social environments that set the stage for children’s learning
    • Appropriate guidance techniques to support children’s behaviors as their self-regulation abilities mature
    • Planning for preschool-aged children in specific domains, including:
        o Physical development
        o Language and literacy
        o Math
        o Science
        o Creative arts (the visual and performing arts)
        o Diversity (social science and history)
        o Health and safety
    • Making children’s learning visible through documentation and assessment

    Foundations for programming and implementing effect handlers

    First-class control operators provide programmers with an expressive and efficient means for manipulating control through reification of the current control state as a first-class object, enabling programmers to implement their own computational effects and control idioms as shareable libraries. Effect handlers provide a particularly structured approach to programming with first-class control by naming control-reifying operations and separating them from their handling. This thesis is composed of three strands of work in which I develop operational foundations for programming and implementing effect handlers, as well as exploring the expressive power of effect handlers. The first strand develops a fine-grain call-by-value core calculus of a statically typed programming language with a structural notion of effect types, as opposed to the nominal notion of effect types that dominates the literature. With the structural approach, effects need not be declared before use. The usual safety properties of statically typed programming are retained by making crucial use of row polymorphism to build and track effect signatures. The calculus features three forms of handlers: deep, shallow, and parameterised. They each offer a different approach to manipulating the control state of programs. Traditional deep handlers are defined by folds over computation trees, and are the original construct proposed by Plotkin and Pretnar. Shallow handlers are defined by case splits (rather than folds) over computation trees. Parameterised handlers are deep handlers extended with a state value that is threaded through the folds over computation trees. To demonstrate the usefulness of effects and handlers as a practical programming abstraction, I implement the essence of a small UNIX-style operating system complete with multi-user environment, time-sharing, and file I/O. The second strand studies continuation passing style (CPS) and abstract machine semantics, which are foundational techniques that admit a unified basis for implementing deep, shallow, and parameterised effect handlers in the same environment. The CPS translation is obtained through a series of refinements of a basic first-order CPS translation for a fine-grain call-by-value language into an untyped language. Each refinement moves toward a more intensional representation of continuations, eventually arriving at the notion of generalised continuation, which admits simultaneous support for deep, shallow, and parameterised handlers. The initial refinement adds support for deep handlers by representing stacks of continuations and handlers as a curried sequence of arguments. The image of the resulting translation is not properly tail-recursive, meaning some function application terms do not appear in tail position. To rectify this, the CPS translation is refined once more to obtain an uncurried representation of stacks of continuations and handlers. Finally, the translation is made higher-order in order to contract administrative redexes at translation time. The generalised continuation representation is used to construct an abstract machine that provides simultaneous support for all three kinds of effect handlers. The third strand explores the expressiveness of effect handlers. First, I show that the deep, shallow, and parameterised notions of handlers are interdefinable by way of typed macro-expressiveness, which provides a syntactic notion of expressiveness that affirms the existence of encodings between handlers but provides no information about the computational content of the encodings. Second, using a semantic notion of expressiveness, I show that for a class of programs a programming language with first-class control (e.g. effect handlers) admits asymptotically faster implementations than are possible in a language without first-class control.
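
    As a loose illustration of the deep/shallow distinction described above (not the thesis's fine-grain calculus or its CPS translation), computations can be modelled as trees that either return a value or perform an operation with a continuation. A deep handler folds over the whole tree, reinstalling itself around the resumption; a shallow handler case-splits on only the outermost node. All names and the encoding below are invented.

```python
# Hedged illustration: computations are Return(v) or Op(name, arg, k), where
# k maps an operation's result to the rest of the computation.
from typing import Any, Callable, Dict

class Return:
    def __init__(self, value): self.value = value

class Op:
    def __init__(self, name, arg, k): self.name, self.arg, self.k = name, arg, k

def deep_handle(comp, clauses: Dict[str, Callable], ret: Callable[[Any], Any]):
    """Deep handler: a fold -- the handler is reinstalled around resumptions."""
    if isinstance(comp, Return):
        return ret(comp.value)
    resume = lambda x: deep_handle(comp.k(x), clauses, ret)
    return clauses[comp.name](comp.arg, resume)

def shallow_handle(comp, clauses: Dict[str, Callable], ret: Callable[[Any], Any]):
    """Shallow handler: a case split -- the resumption stays unhandled."""
    if isinstance(comp, Return):
        return ret(comp.value)
    return clauses[comp.name](comp.arg, comp.k)

# A computation performing one "get" operation, then doubling its result.
prog = Op("get", None, lambda s: Return(s * 2))
print(deep_handle(prog, {"get": lambda _a, resume: resume(21)}, lambda v: v))  # 42
```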

    Global Winning Conditions in Synthesis of Distributed Systems with Causal Memory

    In the synthesis of distributed systems, we automate the development of distributed programs and hardware by automatically deriving correct implementations from formal specifications. For synchronous distributed systems, the synthesis problem is well known to be undecidable. For asynchronous systems, the boundary between decidable and undecidable synthesis problems is a long-standing open question. We study the problem in the setting of Petri games, a framework for distributed systems where asynchronous processes are equipped with causal memory. Petri games extend Petri nets with a distinction between system places and environment places. The components of a distributed system are the players of the game, represented as tokens that exchange information during each synchronization. Previous decidability results for this model are limited to local winning conditions, i.e., conditions that only refer to individual components. In this paper, we consider global winning conditions such as mutual exclusion, i.e., conditions that refer to the state of all components. We provide decidability and undecidability results for global winning conditions. First, we prove for winning conditions given as bad markings that it is decidable whether a winning strategy for the system players exists in Petri games with a bounded number of system players and one environment player. Second, we prove for winning conditions that refer to both good and bad markings that it is undecidable whether a winning strategy for the system players exists in Petri games with at least two system players and one environment player. Our results thus show that, on the one hand, it is indeed possible to use global safety specifications like mutual exclusion in the synthesis of distributed systems. However, on the other hand, adding global liveness specifications results in an undecidable synthesis problem for almost all Petri games.
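
    To make the "bad markings" condition concrete: in a 1-safe Petri net, mutual exclusion can be phrased as "no reachable marking puts two components in their critical places at once". The sketch below only enumerates reachable markings and tests that predicate; deciding whether the system players have a strategy avoiding all bad markings, as in the paper, is a much harder game-solving question. The net encoding is invented for illustration.

```python
# Simplified sketch: reachability in a 1-safe Petri net plus a bad-marking
# predicate for mutual exclusion. Not the paper's strategy-synthesis algorithm.
from typing import FrozenSet, List, Set, Tuple

Marking = FrozenSet[str]                            # set of marked places
Transition = Tuple[FrozenSet[str], FrozenSet[str]]  # (preset, postset)

def reachable(initial: Marking, transitions: List[Transition]) -> Set[Marking]:
    """Enumerate all markings reachable by firing enabled transitions."""
    seen, stack = {initial}, [initial]
    while stack:
        m = stack.pop()
        for pre, post in transitions:
            if pre <= m:                 # transition enabled: preset marked
                m2 = (m - pre) | post    # consume preset, produce postset
                if m2 not in seen:
                    seen.add(m2)
                    stack.append(m2)
    return seen

def violates_mutex(markings: Set[Marking], critical: FrozenSet[str]) -> bool:
    """Bad marking: two or more critical places marked at once."""
    return any(len(m & critical) >= 2 for m in markings)

init = frozenset({"idle1", "idle2"})
ts: List[Transition] = [
    (frozenset({"idle1"}), frozenset({"crit1"})),   # component 1 enters
    (frozenset({"idle2"}), frozenset({"crit2"})),   # component 2 enters
]
print(violates_mutex(reachable(init, ts), frozenset({"crit1", "crit2"})))  # True
```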

    A productive response to legacy system petrification

    Requirements change. The requirements of a legacy information system change, often in unanticipated ways, and at a more rapid pace than the rate at which the information system itself can be evolved to support them. The capabilities of a legacy system fall progressively further behind its evolving requirements, in a degrading process termed petrification. As systems petrify, they deliver diminishing business value, hamper business effectiveness, and drain organisational resources. To address legacy systems, the first challenge is to understand how to shed their resistance to tracking requirements change. The second challenge is to ensure that a newly adaptable system never again petrifies into a change-resistant legacy system. This thesis addresses both challenges. The approach outlined herein is underpinned by an agile migration process, termed Productive Migration, that homes in on the specific causes of petrification within each particular legacy system and provides guidance on how to address them. That guidance comes in part from a personalised catalogue of petrifying patterns, which capture recurring themes underlying petrification. These steer us to the problems actually present in a given legacy system and lead us to suitable antidote productive patterns via which we can deal with those problems one by one. To prevent newly adaptable systems from again degrading into legacy systems, we appeal to a follow-on process, termed Productive Evolution, which embraces and keeps pace with change rather than resisting and falling behind it. Productive Evolution teaches us to be vigilant against signs of system petrification and helps us to nip them in the bud. The aim is to nurture systems that remain supportive of the business, that are adaptable in step with ongoing requirements change, and that continue to retain their value as significant business assets.