
    An exploration of the language within Ofsted reports and their influence on primary school performance in mathematics: a mixed methods critical discourse analysis

    This thesis contributes to the understanding of the language of Ofsted reports, their similarity to one another, and associations between terms used within ‘areas for improvement’ sections and subsequent outcomes for pupils. The research responds to concerns from serving headteachers that Ofsted reports are overly similar, do not capture the unique story of their school, and are unhelpful for improvement. In seeking to answer ‘how similar are Ofsted reports?’, the study uses two tools, plagiarism detection software (Turnitin) and a discourse analysis tool (NVivo), to identify trends within and across a large corpus of reports. The approach is based on critical discourse analysis (Van Dijk, 2009; Fairclough, 1989) but shaped in the form of practitioner enquiry, seeking power in the form of impact on pupils and practitioners rather than a more traditional, sociological application of the method. The research found that in 2017, primary school section 5 Ofsted reports had more than half of their content exactly duplicated within other primary school inspection reports published that same year. Discourse analysis showed that the quality assurance process overrode variables such as inspector designation, gender, or team size, leading to three distinct patterns of duplication: block duplication, self-referencing, and template writing. The most unique part of a report was found to be the ‘area for improvement’ section, which was tracked to externally verified outcomes for pupils using terms linked to ‘mathematics’. Schools required to improve mathematics in their areas for improvement improved progress and attainment in mathematics significantly more than national rates. These findings indicate a positive correlation between the inspection reporting process and a beneficial impact on pupil outcomes in mathematics, and that the significant similarity of one report to another had no bearing on the usefulness of the report for school improvement purposes within this corpus.

    Technical Dimensions of Programming Systems

    Programming requires much more than just writing code in a programming language. It is usually done in the context of a stateful environment, by interacting with a system through a graphical user interface. Yet this wide space of possibilities lacks a common structure for navigation. Work on programming systems fails to form a coherent body of research, making it hard to improve on past work and advance the state of the art. In computer science, much has been said and done to allow comparison of programming languages, yet no similar theory exists for programming systems; we believe that programming systems deserve a theory too. We present a framework of technical dimensions which capture the underlying characteristics of programming systems and provide a means for conceptualizing and comparing them. We identify technical dimensions by examining past influential programming systems and reviewing their design principles, technical capabilities, and styles of user interaction. Technical dimensions capture characteristics that may be studied, compared and advanced independently. This makes it possible to talk about programming systems in a way that can be shared and constructively debated rather than relying solely on personal impressions. Our framework is derived using a qualitative analysis of past programming systems. We outline two concrete ways of using our framework. First, we show how it can be used to analyze a recently developed novel programming system. Then, we use it to identify an interesting unexplored point in the design space of programming systems. Much research effort focuses on building programming systems that are easier to use, accessible to non-experts, moldable and/or powerful, but such efforts are disconnected. They are informal, guided by the personal vision of their authors, and thus are evaluable and comparable only on the basis of individual experience using them. By providing foundations for more systematic research, we can help programming systems researchers to stand, at last, on the shoulders of giants.

    Composing games into complex institutions

    Game theory is used by all behavioral sciences, but its development has long centered around tools for relatively simple games and toy systems, such as the economic interpretation of equilibrium outcomes. Our contribution, compositional game theory, permits another approach of equally general appeal: the high-level design of large games for expressing complex architectures and representing real-world institutions faithfully. Compositional game theory, grounded in the mathematics underlying programming languages, and introduced here as a general computational framework, increases the parsimony of game representations with abstraction and modularity, accelerates search and design, and helps theorists across disciplines express real-world institutional complexity in well-defined ways. Relative to existing approaches in game theory, compositional game theory is especially promising for solving game systems with long-range dependencies, for comparing large numbers of structurally related games, and for nesting games into the larger logical or strategic flows typical of real world policy or institutional systems.
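    To make the compositionality claim concrete, the toy sketch below composes two sequential decision stages and solves the composite by backward induction. It is only a Python illustration of building a larger game from reusable parts; the names (Stage, play) and the single shared payoff are assumptions for this sketch, and it deliberately omits the coutility and multi-player equilibrium machinery of actual compositional (open) game theory.

```python
from dataclasses import dataclass
from typing import Any, Callable, List

# A drastically simplified "stage" of a sequential game: the moves available
# in a state, and how a chosen move transforms the state passed on to
# whatever stage is composed after it. (Names and shape are assumptions.)
@dataclass
class Stage:
    moves: Callable[[Any], List[Any]]   # state -> available moves
    step: Callable[[Any, Any], Any]     # (state, move) -> next state

def play(stages: List[Stage], state: Any, payoff: Callable[[Any], float]):
    """Solve a sequential composition of stages by backward induction,
    maximising a single shared payoff over the final state."""
    if not stages:
        return payoff(state), []
    head, *tail = stages
    best_value, best_moves = None, None
    for move in head.moves(state):
        value, moves = play(tail, head.step(state, move), payoff)
        if best_value is None or value > best_value:
            best_value, best_moves = value, [move] + moves
    return best_value, best_moves

# Reuse the same stage component twice: pick x, then y, from {0, 1, 2};
# the payoff rewards the sum but penalises choosing the same number twice.
stage = Stage(moves=lambda s: [0, 1, 2], step=lambda s, m: s + [m])
print(play([stage, stage], [], lambda s: s[0] + s[1] - 3 * (s[0] == s[1])))
```

    The point of the sketch is that the same Stage component can be reused and recombined without rewriting the solver, a pared-down analogue of the modularity the paper claims for full compositional game theory.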

    Text world creation in advertising discourse

    This article explores the way in which text worlds are created in advertising discourse by analysing linguistic choices and features of context which are crucial in the determination of specific relations between sender(s) and target audience(s), in particular, deixis and frame knowledge. The argument is that a text world model is particularly well suited to describing the way in which advertising discourse is processed in an active, dynamic, context-dependent way. In this process, addressees reconstruct the world projected in the discourse, according to their own cultural and personal knowledge, from the linguistic and visual clues provided in the advertisement.

    Deep Transfer Learning Applications in Intrusion Detection Systems: A Comprehensive Review

    Globally, the external Internet is increasingly being connected to contemporary industrial control systems. As a result, there is an immediate need to protect these networks from a range of threats. The key infrastructure of industrial activity can be protected from harm by using an intrusion detection system (IDS), a preventive mechanism that recognizes new kinds of dangerous threats and hostile activities. This study examines the most recent artificial intelligence (AI) techniques used to create IDS in many kinds of industrial control networks, with a particular emphasis on DTL-based IDS, where DTL denotes deep transfer learning. The latter can be seen as a type of information fusion that merges and/or adapts knowledge from multiple domains to enhance the performance of the target task, particularly when labeled data in the target domain is scarce. Publications issued after 2015 were taken into account. The selected publications were divided into three categories: DTL-only and IDS-only papers inform the introduction and background, while DTL-based IDS papers form the core of this review. By reading this review, researchers will gain a better grasp of the current state of DTL approaches used in IDS across many different types of networks. Other useful information is also covered, such as the datasets used, the type of DTL employed, the pre-trained network, IDS techniques, the evaluation metrics including accuracy/F-score and false alarm rate (FAR), and the improvement gained. The algorithms and methods used in several studies, which illustrate the principles of each DTL-based IDS subcategory deeply and clearly, are also presented to the reader.
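    As a rough sketch of the kind of transfer the review surveys, the PyTorch code below pre-trains a small classifier on a label-rich source domain and then fine-tunes only its classification head on a scarcely labeled target domain. It is not a method from any reviewed paper; the feature width, class count, layer sizes, and the random tensors standing in for real IDS flow records are all assumptions for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical source/target data standing in for real IDS flow features
# (shapes and label counts are assumptions, not a real dataset).
torch.manual_seed(0)
X_src, y_src = torch.randn(2000, 40), torch.randint(0, 5, (2000,))
X_tgt, y_tgt = torch.randn(100, 40), torch.randint(0, 5, (100,))  # scarce target labels

model = nn.Sequential(
    nn.Linear(40, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),   # shared feature extractor
    nn.Linear(64, 5),               # classification head
)

def fit(model, X, y, params, epochs=20):
    """Train only the given parameters with a cross-entropy objective."""
    opt = torch.optim.Adam(params, lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()

# 1. Pre-train the whole network on the label-rich source domain.
fit(model, X_src, y_src, model.parameters())

# 2. Transfer: freeze the feature extractor, fine-tune only the head
#    on the scarce target-domain labels.
for layer in list(model.children())[:-1]:
    for p in layer.parameters():
        p.requires_grad = False
fit(model, X_tgt, y_tgt, model[-1].parameters())
```

    Freezing the shared layers is just one way of adapting knowledge from a source domain; fully fine-tuning a pre-trained network is another common variant of the same skeleton.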

    Strategies for Early Learners

    Welcome to learning about how to effectively plan curriculum for young children. This textbook will address:
    • Developing curriculum through the planning cycle
    • Theories that inform what we know about how children learn and the best ways for teachers to support learning
    • The three components of developmentally appropriate practice
    • Importance and value of play and intentional teaching
    • Different models of curriculum
    • Process of lesson planning (documenting planned experiences for children)
    • Physical, temporal, and social environments that set the stage for children’s learning
    • Appropriate guidance techniques to support children’s behaviors as the self-regulation abilities mature
    • Planning for preschool-aged children in specific domains, including:
      o Physical development
      o Language and literacy
      o Math
      o Science
      o Creative (the visual and performing arts)
      o Diversity (social science and history)
      o Health and safety
    • Making children’s learning visible through documentation and assessment

    Supernatural crossing in Republican Chinese fiction, 1920s–1940s

    This dissertation studies supernatural narratives in Chinese fiction from the mid-1920s to the 1940s. The literary works present phenomena or elements that are or appear to be supernatural, many of which remain marginal or overlooked in Sinophone and Anglophone academia. These sources are situated in the May Fourth/New Culture ideological context, where supernatural narratives had to make way for the progressive intellectuals’ literary realism and their allegorical application of supernatural motifs. In the face of realism, supernatural narratives paled, dismissed as impractical fantasies that distract one from facing and tackling real life. Nevertheless, I argue that the supernatural narratives do not probe into another mystical dimension that might co-exist alongside the empirical world. Rather, they imagine various cases of the characters’ crossing to voice their discontent with contemporary society or to reflect on the notion of reality. “Crossing” relates to characters’ acts or processes of trespassing the boundary that separates the supernatural from the conventional natural world, thus entailing encounters and interaction between the natural and the supernatural. The dissertation examines how crossing, as a narrative device, disturbs accustomed and mundane situations, releases hidden tensions, and discloses repressed truths in Republican fiction. There are five types of crossing in the supernatural narratives. Type 1 is the crossing into “haunted” houses. This includes (intangible) human agency crossing into domestic spaces and revealing secrets and truths concealed by the scary, feigned ‘haunting’, thus exposing the hidden evil and the other house occupiers’ silenced, suffocated state. Type 2 is men crossing into female ghosts’ apparitional residences. The female ghosts allude to heart-breaking, traumatic experiences in socio-historical reality, evoking sympathetic concern for suffering individuals who are caught in social upheavals. Type 3 is the crossing from reality into the characters’ delusional/hallucinatory realities. While they physically remain in the empirical world, the characters’ abnormal perceptions lead them to exclusive, delirious, and quasi-supernatural experiences of reality. Their crossings blur the concrete boundaries between the real and the unreal on the mental level: their abnormal perceptions construct a significant, meaningful reality for them, which may be as real as the commonly regarded objective reality. Type 4 is the crossing into a netherworld modelled on the real world as the authors observed it, bearing a spectrum of satirised objects of Republican society. The last type is immortal visitors crossing into the human world; this type satirises humanity’s vices and destructive potential. The primary sources demonstrate their writers’ witty passion to play with supernatural notions and imagery (such as ghosts, demons, and immortals) and stitch them into vivid, engaging scenes using techniques such as the gothic, the grotesque, and the satirical, in order to evoke sentiments such as terror, horror, disgust, disorientation, or awe, all in service of their insights into realist issues. The works also creatively tailor traditional Chinese modes and motifs, which exemplifies the revival of Republican interest in traditional cultural heritage. The supernatural narratives may amaze or disturb the reader at first, but what is more shocking, unpleasantly nudging, or thought-provoking is the problematic society and people’s lives that the supernatural (misunderstandings) eventually reveals. They present a more comprehensive treatment of reality than Republican literature with its revolutionary consciousness surrounding class struggle. The critical perspectives of the supernatural narratives include domestic space, unacknowledged history and marginal individuals, abnormal mentality, and pervasive weaknesses in humanity. The crossing and supernatural narratives function as a means of better understanding the lived reality. This study gathers diverse primary sources written by Republican writers from various educational and political backgrounds and interprets them from a rare perspective, thus filling a research gap. It promotes a fuller view of supernatural narratives in twentieth-century Chinese literature. In terms of reflecting the social and personal reality of the Republican era, the supernatural narratives supplement the realist fiction of the time.

    Foundations for programming and implementing effect handlers

    First-class control operators provide programmers with an expressive and efficient means for manipulating control through reification of the current control state as a first-class object, enabling programmers to implement their own computational effects and control idioms as shareable libraries. Effect handlers provide a particularly structured approach to programming with first-class control by naming control-reifying operations and separating them from their handling. This thesis is composed of three strands of work in which I develop operational foundations for programming and implementing effect handlers as well as exploring the expressive power of effect handlers. The first strand develops a fine-grain call-by-value core calculus of a statically typed programming language with a structural notion of effect types, as opposed to the nominal notion of effect types that dominates the literature. With the structural approach, effects need not be declared before use. The usual safety properties of statically typed programming are retained by making crucial use of row polymorphism to build and track effect signatures. The calculus features three forms of handlers: deep, shallow, and parameterised. They each offer a different approach to manipulating the control state of programs. Traditional deep handlers are defined by folds over computation trees, and are the original construct proposed by Plotkin and Pretnar. Shallow handlers are defined by case splits (rather than folds) over computation trees. Parameterised handlers are deep handlers extended with a state value that is threaded through the folds over computation trees. To demonstrate the usefulness of effects and handlers as a practical programming abstraction, I implement the essence of a small UNIX-style operating system complete with multi-user environment, time-sharing, and file I/O. The second strand studies continuation passing style (CPS) and abstract machine semantics, which are foundational techniques that admit a unified basis for implementing deep, shallow, and parameterised effect handlers in the same environment. The CPS translation is obtained through a series of refinements of a basic first-order CPS translation for a fine-grain call-by-value language into an untyped language. Each refinement moves toward a more intensional representation of continuations, eventually arriving at the notion of generalised continuation, which admits simultaneous support for deep, shallow, and parameterised handlers. The initial refinement adds support for deep handlers by representing stacks of continuations and handlers as a curried sequence of arguments. The image of the resulting translation is not properly tail-recursive, meaning some function application terms do not appear in tail position. To rectify this, the CPS translation is refined once more to obtain an uncurried representation of stacks of continuations and handlers. Finally, the translation is made higher-order in order to contract administrative redexes at translation time. The generalised continuation representation is used to construct an abstract machine that provides simultaneous support for deep, shallow, and parameterised effect handlers. The third strand explores the expressiveness of effect handlers. First, I show that the deep, shallow, and parameterised notions of handlers are interdefinable by way of typed macro-expressiveness, which provides a syntactic notion of expressiveness that affirms the existence of encodings between handlers but provides no information about the computational content of the encodings. Second, using a semantic notion of expressiveness, I show that for a class of programs a programming language with first-class control (e.g. effect handlers) admits asymptotically faster implementations than are possible in a language without first-class control.
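    The deep/shallow distinction described above can be made concrete with a small sketch. The Python below is not the thesis’s calculus; it merely models a computation tree (the Return/Op names are assumptions) and contrasts a deep handler, a fold that re-installs itself on the continuation, with a shallow handler, a single case split that leaves the rest of the tree unhandled.

```python
from dataclasses import dataclass
from typing import Any, Callable

# A computation is either a returned value or an operation request paired
# with the continuation awaiting that operation's result.
@dataclass
class Return:
    value: Any

@dataclass
class Op:
    name: str
    arg: Any
    cont: Callable[[Any], Any]   # resumes the computation with the op's result

def deep_handle(comp, clauses, ret):
    """Deep handler: a fold over the whole computation tree; the handler is
    re-applied whenever the continuation is resumed."""
    if isinstance(comp, Return):
        return ret(comp.value)
    resume = lambda x: deep_handle(comp.cont(x), clauses, ret)
    return clauses[comp.name](comp.arg, resume)

def shallow_handle(comp, clauses, ret):
    """Shallow handler: a case split that interprets only the first operation
    and hands back the raw, unhandled continuation."""
    if isinstance(comp, Return):
        return ret(comp.value)
    return clauses[comp.name](comp.arg, comp.cont)

# A computation that performs 'ask' twice and returns the sum of the answers.
prog = Op("ask", None, lambda x: Op("ask", None, lambda y: Return(x + y)))

# The deep handler answers every 'ask' with 21, so both requests are handled: 42.
print(deep_handle(prog, {"ask": lambda _, k: k(21)}, lambda v: v))

# The shallow handler answers only the first 'ask'; what comes back is an
# unhandled Op node that must be handled again explicitly.
rest = shallow_handle(prog, {"ask": lambda _, k: k(21)}, lambda v: v)
print(type(rest).__name__)   # Op
```

    A parameterised handler would additionally thread a handler-local state value through resume, mirroring the description above; it is omitted here to keep the sketch short.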

    JESUS AND THE VISIBILITY OF GOD: SIGHT AND BELIEF IN THE FOURTH GOSPEL

    This thesis establishes the value of the physical incarnation of God for belief. It asserts that the theological nature of belief derives from a God who can make himself physically visible in the world. While scholars have often debated the relationship between the empirical senses and belief in John, few have queried the presuppositions about God’s invisibility that inform their positions. In response, this thesis argues across six chapters that unless God becomes physically visible in Jesus, belief does not obtain. Chapter 1 shows that God himself is ultimately the cause, content, and consequence of the belief that John 20:30-31 describes as the purpose of the Gospel. It establishes the theological nature of belief and thus the fact that the Gospel endeavours to draw humanity close to God via faith in Jesus. The remaining five chapters argue that seeing God in Jesus is both possible and desirable. Chapter 2 re-evaluates the metaphysics of divine visibility in Early Judaism and in John and concludes that God can be physically visible in Jesus’s body. John does not regard divinity as invisible in itself; rather, he claims that seeing Jesus is seeing God. Two long chapters follow and substantiate the claims of Chapter 2. They point up the entwined nature of divine presence and material reality by arguing that Jesus’s body is a divine place. This fact – coupled with John’s depiction of Jesus as a man in divine places – stresses his divinity on earth even as it reveals his localized humanity. Chapter 5 argues that sight itself is the primary catalyst for belief in John. Although human hearts occlude proper vision, seeing remains key to human apprehension of God and belief in him. Chapter 6 draws the foregoing together by arguing that seeing Jesus is seeing God across the Johannine narratives, both despite and because of their deeply counterintuitive climax in the crucifixion.