    Close reading and distance: between invariance and a rhetoric of embodiment

    ‘Close reading’ of texts has become a central activity of humanities pedagogy and is carried out across different levels of education and in a number of disciplines. The analysis of texts as part of educational practice is sometimes claimed to be a very recent phenomenon, attendant on the formulation of the idea of the text in the early 1960s (Lotman, 1964; Barthes, 1977 [1964]) and, slightly earlier, in the English tradition, on exercises in ‘practical criticism’ (Richards, 1929; Empson, 1930). On the other hand, close reading is associated with a much older tradition dating back to the inception of scriptural exegesis. While educators attempt to inculcate practices of active interpretation, close reading's adherents and advocates often recognize that its procedures can ossify into routine acts of identifying invariants of textual functioning, at the expense of enabling students to intensify and articulate a more engaged relation with the text. In an age of Big Data, statistical analysis and the instrumentalization of Higher Education, the intimacy of close reading as a practice is in question. Through survey methods, the research presented here asked what methods of analysis are applied to texts in different disciplines, what practices are identified as close reading, what procedures are followed and whether they are common across disciplines, what theoretical, methodological and historiographical frameworks sustain these practices, and what educational ethos might be in play. This article discusses some of the results, not least the finding that the commitment to close reading as a central feature of humanities education does not seem to have waned in the last century, but neither has it reconceptualised reading as anything other than a cerebral exercise in apprehending ‘meaning’ or in developing a disembodied skill. The article briefly contrasts these findings, suggesting a rhetoric of embodiment, mediating the demands for both distance and proximity in reading, as an area for future inquiry.

    Low intensity ethnic cleansing in The Netherlands

    Simulation model of wax diffusion and cleaning in printer belts

    A belt that transports toner is one of the vital components of a printer. Since toner is fused to the paper at a high temperature, wax is released from the paper and penetrates into the rubber top layer of the belt. When the rubber becomes saturated with wax, the wax remains on top of the belt. The resulting layer of wax has a negative impact on the image-forming unit, leading to poor printing quality; a wax cleaner is therefore installed. Determining the optimal functioning of the cleaner requires time-consuming and inefficient experiments, so an efficient simulation tool that predicts wax build-up and cleaning may replace them. The simulation is based on a mathematical model that describes the influx of wax as a convection/diffusion process. Standard numerical time-stepping methods for computing the evolution of the wax concentration are not applicable, since saturation is only reached after tens of thousands of belt rounds. In this article, we propose a combination of an analytical and a numerical method to tackle the problem, in which we discretize the second-order differential operator that generates the evolution of the wax concentration. The simulations show an adequate fit with measurement results, and the wax build-up in the belt up to saturation is described realistically. Our study reveals that the contact resistance between belt and cleaner is the most important parameter influencing the effectiveness of the cleaner. Keywords: (wax) diffusion – eigenvalue – multi-layered medium – printing belt – printing system
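    The idea of diagonalising the discretised second-order operator, so that the concentration at a very late time (tens of thousands of belt rounds) is obtained in a single step rather than by many small time steps, can be illustrated with a minimal sketch. This is not the article's multi-layered model: it assumes a single 1D layer, zero-flux boundaries, and illustrative values for the grid size, diffusion coefficient, and time.

    ```python
    import numpy as np

    def diffusion_operator(n, dx, D):
        """Central-difference Laplacian with zero-flux (Neumann) ends."""
        A = np.zeros((n, n))
        for i in range(n):
            A[i, i] = -2.0
            if i > 0:
                A[i, i - 1] = 1.0
            if i < n - 1:
                A[i, i + 1] = 1.0
        A[0, 0] = A[-1, -1] = -1.0  # zero-flux boundaries
        return D / dx**2 * A

    def evolve(c0, A, t):
        """c(t) = exp(t A) c0 via the eigendecomposition of the symmetric A."""
        w, V = np.linalg.eigh(A)
        return V @ (np.exp(w * t) * (V.T @ c0))

    n, L, D = 50, 1.0, 1e-3          # illustrative grid and diffusivity
    dx = L / (n - 1)
    A = diffusion_operator(n, dx, D)
    c0 = np.zeros(n)
    c0[0] = 1.0                      # wax enters at the top surface
    c_late = evolve(c0, A, 1e4)      # one jump to a very late time
    # With zero-flux ends, the total wax mass is conserved while the
    # profile flattens towards the uniform (saturated) state.
    ```

    The eigendecomposition is computed once; evaluating the state at any later time then costs only two matrix-vector products, which is what makes very long horizons cheap.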

    Using unsupervised learning to partition 3D city scenes for distributed building energy microsimulation

    Microsimulation is a class of Urban Building Energy Modeling techniques in which energetic interactions between buildings are explicitly resolved. Examples include SUNtool and CitySim+, both of which employ a sophisticated radiosity-based algorithm to solve for radiation exchange. The computational cost of this algorithm increases in proportion to the square of the number of surfaces comprising an urban scene. To simulate large scenes, of the order of 10,000 to 1,000,000 surfaces, it is desirable to divide the scene so as to distribute the simulation task. However, this partitioning is not trivial, as the energy-related interactions create uneven inter-dependencies between computing nodes. To this end, we describe in this paper two approaches (K-means and Greedy Community Detection algorithms) for partitioning urban scenes and subsequently performing building energy microsimulation using CitySim+ on a distributed-memory High-Performance Computing cluster. To compare the performance of these partitioning techniques, we propose two measures evaluating the extent to which the obtained clusters exploit data locality. We show that our approach using Greedy Community Detection performs well in terms of exploiting data locality and reducing inter-dependencies among sub-scenes, but at the expense of a higher data preparation cost and algorithm run-time.
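    The K-means option can be sketched in a few lines: cluster building centroids so that each cluster becomes a sub-scene assigned to one compute node. This is a hedged illustration only; the coordinates, the number of clusters, and the farthest-point initialisation are all assumptions, and the actual coupling to CitySim+ is not shown.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def farthest_point_init(points, k):
        """Pick k mutually distant points as starting centroids."""
        cents = [points[0]]
        for _ in range(k - 1):
            dists = np.min([np.linalg.norm(points - c, axis=1) for c in cents],
                           axis=0)
            cents.append(points[dists.argmax()])
        return np.array(cents)

    def kmeans(points, k, iters=50):
        """Plain Lloyd iterations: assign to nearest centroid, then re-centre."""
        centroids = farthest_point_init(points, k)
        for _ in range(iters):
            d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
            labels = d.argmin(axis=1)
            for j in range(k):
                members = points[labels == j]
                if len(members):
                    centroids[j] = members.mean(axis=0)
        return labels, centroids

    # Three well-separated "districts" of 100 building centroids each.
    districts = [rng.normal(loc=c, scale=5.0, size=(100, 2))
                 for c in ([0.0, 0.0], [100.0, 0.0], [0.0, 100.0])]
    points = np.vstack(districts)
    labels, centroids = kmeans(points, k=3)
    # Each label value identifies the sub-scene (compute node) for a building.
    ```

    A locality measure such as the fraction of a building's nearest neighbours assigned to the same cluster could then be computed over `labels` to compare partitioning schemes, in the spirit of the two measures proposed in the article.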

    An innovative approach to multi-method integrated assessment modelling of global climate change

    © 2020, University of Surrey. All rights reserved. Modelling and simulation play an increasingly significant role in exploratory studies for informing policy makers on climate change mitigation strategies. There is considerable research being done in creating Integrated Assessment Models (IAMs), which focus on examining the human impacts on climate change. Many popular IAMs are created as steady-state optimisation models. They typically employ a nested structure of neoclassical production functions to represent the energy-economy system, holding aggregate views on variables, and hence are unable to capture finer details of the underlying system components. An alternative approach that allows modelling populations as collections of individual and unevenly distributed entities is Agent-Based Modelling, often used in the field of Social Simulation. But simulating huge numbers of individual entities can quickly become an issue, as it requires large amounts of computational resources. The goal of this paper is to introduce a conceptual framework for developing hybrid IAMs. This novel modelling approach allows us to reuse existing rigid but well-established IAMs, and adds flexibility by replacing aggregate stocks with a community of interacting entities. We provide a proof-of-concept of the application of this conceptual framework in the form of an illustrative example. Our test case takes the US as its setting. It is created solely to demonstrate our hybrid modelling approach; we do not claim that it has predictive powers.
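    The core hybrid move, replacing an aggregate stock with a population of heterogeneous agents, can be sketched minimally. Everything here is an assumption for illustration: the `Household` agent, its linear response to a carbon price, and all numeric values are hypothetical and are not taken from the article's US test case.

    ```python
    import random

    class Household:
        """A hypothetical agent replacing one slice of an aggregate demand stock."""

        def __init__(self, base_demand, price_sensitivity):
            self.base_demand = base_demand
            self.price_sensitivity = price_sensitivity

        def demand(self, carbon_price):
            # each agent reduces its demand linearly with the carbon price
            return max(0.0,
                       self.base_demand * (1 - self.price_sensitivity * carbon_price))

    random.seed(1)
    # An unevenly distributed population instead of a single aggregate variable.
    agents = [Household(random.uniform(0.5, 1.5), random.uniform(0.0, 0.02))
              for _ in range(1000)]

    def total_demand(carbon_price):
        """Disaggregated replacement for the IAM's aggregate demand stock."""
        return sum(a.demand(carbon_price) for a in agents)
    ```

    The aggregate quantity the IAM expects is recovered by summing over agents, so the rest of the model can stay unchanged while the response to policy now emerges from heterogeneous individual behaviour.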

    Laboratory Focus on Improving the Culture of Biosafety: Statewide Risk Assessment of Clinical Laboratories That Process Specimens for Microbiologic Analysis

    The Wisconsin State Laboratory of Hygiene challenged Wisconsin laboratories to examine their biosafety practices and improve their culture of biosafety. One hundred three clinical and public health laboratories completed a questionnaire-based, microbiology-focused biosafety risk assessment. More than 96% of the respondents performed activities related to specimen processing, direct microscopic examination, and rapid nonmolecular testing, while approximately 60% performed culture interpretation. Although they are important to the assessment of risk, data specific to patient occupation, symptoms, and travel history were often unavailable to the laboratory and, therefore, less contributory to a microbiology-focused biosafety risk assessment than information on the specimen source and test requisition. Over 88% of the respondents complied with more than three-quarters of the mitigation control measures listed in the survey. Facility assessment revealed that subsets of laboratories claiming biosafety level 1, 2, or 3 status did not possess all of the biosafety elements considered minimally standard for their respective classifications. Many laboratories reported being able to quickly correct the minor deficiencies identified. Task assessment identified deficiencies that trended higher within the general (not microbiology-specific) laboratory for core activities such as packaging and shipping, direct microscopic examination, and culture modalities solely involving screens for organism growth. For traditional microbiology departments, opportunities for improvement were revealed in the cultivation and management of highly infectious agents, such as acid-fast bacilli and systemic fungi. These results, derived from a survey of a large cohort of small- and large-scale laboratories, suggest the necessity for continued microbiology-based understanding of biosafety practices, vigilance toward biosafety, and enforcement of biosafety practices throughout the laboratory setting.