
    Modeling Epidemic Spread in Synthetic Populations - Virtual Plagues in Massively Multiplayer Online Games

    A virtual plague is a process in which a behavior-affecting property spreads among characters in a Massively Multiplayer Online Game (MMOG). The MMOG individuals constitute a synthetic population, and the game can be seen as a form of interactive executable model for studying disease spread, albeit of a very special kind. To a game developer maintaining an MMOG, recognizing, monitoring, and ultimately controlling a virtual plague is important, regardless of how it was initiated. The prospect of using tools, methods, and theory from the field of epidemiology to do this seems natural and appealing. We address the feasibility of such a prospect, first by considering some basic measures used in epidemiology, then by pointing out the differences between real-world epidemics and virtual plagues. We also suggest directions for MMOG developer control through epidemiological modeling. Our aim is to understand the properties of virtual plagues, rather than to eliminate them or mitigate their effects, as would be the case with a real infectious disease. Comment: Accepted for presentation at the Digital Games Research Association (DiGRA) conference in Tokyo in September 2007. All comments to the authors (mail addresses are in the paper) are welcome.
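    As a hedged illustration of the kind of basic epidemiological measure the paper refers to (not code from the paper itself), the sketch below runs a minimal discrete-time SIR model over a synthetic population; the population size, transmission rate beta, and recovery rate gamma are assumptions chosen for illustration, and the basic reproduction number follows as beta/gamma.

        # Minimal SIR-style sketch of spread in a synthetic population.
        # All parameters (population size, beta, gamma) are illustrative
        # assumptions, not values taken from the paper.

        def simulate_sir(n=10_000, i0=10, beta=0.3, gamma=0.1, days=120):
            """Discrete-time SIR model; returns (S, I, R) counts per day."""
            s, i, r = n - i0, i0, 0
            history = [(s, i, r)]
            for _ in range(days):
                new_infections = beta * s * i / n   # well-mixed contact assumption
                new_recoveries = gamma * i
                s -= new_infections
                i += new_infections - new_recoveries
                r += new_recoveries
                history.append((s, i, r))
            return history

        if __name__ == "__main__":
            peak = max(i for _, i, _ in simulate_sir())
            print(f"peak simultaneously infected (illustrative run): {peak:.0f}")

    Tracking such an infection curve against game telemetry is one concrete form the "recognizing and monitoring" task above could take.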

    Scheduling policies and system software architectures for mixed-criticality computing

    The mixed-criticality model of computation is being increasingly adopted in timing-sensitive systems. The model not only ensures that the most critical tasks in a system never fail, but also aims for better utilization of system resources under normal conditions. In this report, we describe the widely used mixed-criticality task model and fixed-priority scheduling algorithms for that model on uniprocessors. Because the mixed-criticality task model and its scheduling policies demand it, isolation among tasks, both temporal and spatial, is one of the main requirements from the system design point of view. Different virtualization techniques have been used to design system software architectures with isolation as the goal. We discuss a few such system software architectures that are being used, or could be used, for the mixed-criticality model of computation.
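    As a hedged sketch of the dual-criticality task model commonly used in this line of work (in the style of Vestal's model, not code from the report), each task carries one WCET budget per criticality level, and in HI mode the LO-criticality tasks are dropped; the task names and numbers below are assumptions.

        # Minimal sketch of a dual-criticality (LO/HI) task set.
        # Task parameters are illustrative assumptions, not from the report.

        from dataclasses import dataclass

        @dataclass
        class Task:
            name: str
            period: float   # implicit deadline equal to the period
            crit: str       # "LO" or "HI"
            wcet_lo: float  # optimistic budget charged in LO mode
            wcet_hi: float  # pessimistic bound charged in HI mode

        def budget(task: Task, mode: str) -> float:
            """WCET budget charged to a task in the current system mode."""
            return task.wcet_hi if mode == "HI" and task.crit == "HI" else task.wcet_lo

        def utilization(tasks, mode):
            """Utilization of the tasks still admitted in the given mode;
            in HI mode, LO-criticality tasks are dropped (the usual simplification)."""
            active = [t for t in tasks if mode == "LO" or t.crit == "HI"]
            return sum(budget(t, mode) / t.period for t in active)

        taskset = [
            Task("control", period=10, crit="HI", wcet_lo=2, wcet_hi=4),
            Task("logging", period=20, crit="LO", wcet_lo=5, wcet_hi=5),
            Task("sensing", period=5,  crit="HI", wcet_lo=1, wcet_hi=2),
        ]

        for mode in ("LO", "HI"):
            print(mode, "mode utilization:", round(utilization(taskset, mode), 2))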

    Evaluating Digital Math Tools in the Field

    Many school districts have adopted digital tools to supplement or replace teacher-led instruction, usually on the premise that these tools can provide more personalized or individualized experiences for students at lower cost. Rigorously evaluating whether such initiatives promote better student outcomes in the field is difficult, as most schools and teachers are unwilling to enforce rigorous study designs such as randomized controlled trials. We used study designs that were feasible in practice to assess whether two digital math tools, eSpark and IXL, were associated with improvements in 3rd–6th grade students' math test scores. We also investigated the resource requirements and costs of implementing eSpark and IXL to assess whether these tools represent a valuable use of resources. We find that while IXL is substantially less costly to implement than eSpark, its use is not significantly associated with students' math performance.
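    To make the kind of feasible observational design concrete (a minimal sketch on synthetic data, not the authors' actual analysis), one option regresses post-test scores on an indicator of tool usage while controlling for prior achievement; all variable names and data below are assumptions.

        # Sketch of a covariate-adjusted association between tool usage and
        # math scores. Synthetic data stand in for district records; nothing
        # here reproduces the study's data or results.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        prior = rng.normal(0, 1, n)        # standardized prior test score
        usage = rng.integers(0, 2, n)      # 1 = student used the digital tool
        post = 0.8 * prior + 0.05 * usage + rng.normal(0, 1, n)  # assumed small effect

        # Ordinary least squares with an intercept, prior score, and usage indicator.
        X = np.column_stack([np.ones(n), prior, usage])
        coef, *_ = np.linalg.lstsq(X, post, rcond=None)
        print("estimated usage association (in score SD units):", round(coef[2], 3))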

    Resisting the Temptation of Perfection

    With the advance of CRISPR technology, parents will be tempted to create superior offspring who are healthier, smarter, and stronger. In addition to the fact that many of these procedures are considered immoral for Catholics, they could change human nature in radical and possibly disastrous ways. This article focuses on the question of human perfectionism. First, by considering the relationship between human nature and technology, it analyzes whether such advances can improve human nature in addition to curing diseases. Next, it looks at the moral and spiritual dimensions of perfection by analyzing the cardinal virtues. It argues that seeking perfection in the physical sense alone may not be prudent or wise and may produce greater injustices and weaken the human spirit in the long run. Understanding our true calling to perfection can help us resist the temptation of hubris to enhance the human race through technology.

    Housing professionalism in the United Kingdom: the final curtain or a new age?

    The unusually large, predominantly municipal, housing sector in the UK has provided the context for a large occupational grouping of "housing managers" that has claimed professional status. However, within the post-1945 British welfare state this professional project enjoyed limited success and social housing remained a fragile professional domain. This article explores the consequences for housing professionalism of the recent displacement of the bureau-professional "organisational settlement" by that characterising an emerging "managerial state". Managerialism constitutes a clear challenge to established forms of "professionalism", especially a weak profession such as housing management. However, professionalism is temporally and culturally plastic. Hence, the demands of managerialism, within the specific context of New Labour's quest for "community" cohesion, may be providing opportunities for a new urban network professionalism founded on claims to both generic and specific skills and also a knowledge base combining abstraction with local concreteness. The prominence in these networks of erstwhile "housing" practitioners may become the basis for a new, quite different, professional project. This argument is developed through both conceptual exploration and reference to empirical research. The latter draws on recent work by the authors on, first, housing employers' perceptions of the changing nature and demands of "housing" work and its consequences for professionalism and, secondly, the professional project implications of the increasing prominence of neighbourhood management.

    How to Find More Supernovae with Less Work: Object Classification Techniques for Difference Imaging

    We present the results of applying new object classification techniques to difference images in the context of the Nearby Supernova Factory supernova search. Most current supernova searches subtract reference images from new images, identify objects in these difference images, and apply simple threshold cuts on parameters such as statistical significance, shape, and motion to reject objects such as cosmic rays, asteroids, and subtraction artifacts. Although most static objects subtract cleanly, even a very low false positive detection rate can lead to hundreds of non-supernova candidates that must be vetted by human inspection before triggering additional followup. In comparison to simple threshold cuts, more sophisticated methods such as Boosted Decision Trees, Random Forests, and Support Vector Machines provide dramatically better object discrimination. At the Nearby Supernova Factory, we reduced the number of non-supernova candidates by a factor of 10 while increasing our supernova identification efficiency. Methods such as these will be crucial for maintaining a reasonable false positive rate in the automated transient alert pipelines of upcoming projects such as PanSTARRS and LSST. Comment: 25 pages; 6 figures; submitted to Ap
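    As a hedged sketch of the general approach (a tree-ensemble score on per-candidate features in place of hand-tuned threshold cuts), the example below trains a Random Forest on a few illustrative features; the feature choices, synthetic labels, and score threshold are assumptions, not the Nearby Supernova Factory pipeline.

        # Sketch: score difference-image candidates with a Random Forest and cut
        # on the score instead of fixed thresholds. Data are synthetic placeholders.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n = 5000
        # Illustrative per-candidate features: detection significance, a shape
        # measure, and inter-epoch motion; a real pipeline would use many more.
        X = np.column_stack([
            rng.normal(6, 3, n),
            rng.normal(1.2, 0.4, n),
            rng.exponential(0.3, n),
        ])
        y = ((X[:, 0] > 7) & (X[:, 2] < 0.4)).astype(int)  # synthetic "real" label

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

        # The score threshold trades identification efficiency against the
        # number of candidates sent to human vetting.
        scores = clf.predict_proba(X_te)[:, 1]
        print("candidates kept for vetting:", int((scores > 0.5).sum()), "of", len(scores))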

    What a Difference a DV Makes ... The Impact of Conceptualizing the Dependent Variable in Innovation Success Factor Studies

    The quest for the "success factors" that drive a company's innovation performance has attracted a great deal of attention among both practitioners and academics. The underlying assumption is that certain critical activities impact the innovation performance of the company or the project. However, the findings of success factor studies lack convergence. It has been speculated that this may be because extant studies have used many different measures of the dependent variable "innovation performance". Our study is the first to analyze this issue systematically and empirically: we analyze the extent to which different conceptualizations of the dependent variable (a firm's innovation performance) lead to different innovation success factor patterns. To do so, we collected data from 234 German firms, covering well-established success factors and six alternative measures of innovation performance. This allowed us to test whether success factors are robust to changes in the measurement of the dependent variable. We find that they are not: the choice of the dependent variable makes a huge difference. From this, we draw important conclusions for future studies aiming to identify the success factors behind companies' innovation performance.
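    To make the robustness check at the core of the argument concrete (a sketch on synthetic data, not the study's German firm sample), the snippet below fits the same success-factor regression against several alternative innovation-performance measures and prints how the coefficient pattern shifts; all variable names and loadings are assumptions.

        # Sketch: one set of "success factor" predictors, several alternative
        # dependent variables. Data and loadings are synthetic and illustrative.

        import numpy as np

        rng = np.random.default_rng(2)
        n, k = 234, 3                        # firms, success factors
        factors = rng.normal(size=(n, k))    # e.g. strategy, process, climate
        X = np.column_stack([np.ones(n), factors])

        # A few alternative DV measures (the study compares six), each loading
        # differently on the factors.
        dvs = {
            "new_product_share": factors @ np.array([0.5, 0.1, 0.0]),
            "patent_count":      factors @ np.array([0.0, 0.4, 0.1]),
            "subjective_rating": factors @ np.array([0.2, 0.2, 0.2]),
        }

        for name, signal in dvs.items():
            y = signal + rng.normal(0, 1, n)
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            print(name, "factor coefficients:", np.round(coef[1:], 2))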

    Breadboard model of the LISA phasemeter

    An elegant breadboard model of the LISA phasemeter is currently under development by a Danish-German consortium. The breadboard is being built within the framework of an ESA technology development activity to demonstrate the feasibility and readiness of the LISA metrology baseline architecture. This article gives an overview of the breadboard design and its components, including the distribution of key functionalities. Comment: 5 pages, 3 figures, published in ASP Conference Series, Vol. 467, 9th LISA Symposium (2012), pp 271-27
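    The abstract does not spell out the measurement principle, so purely as a generic, hedged illustration of heterodyne phase readout of the kind a phasemeter performs (IQ demodulation followed by an arctangent estimate, not the breadboard's actual signal-processing chain), here is a minimal sketch; the sample rate, beat frequency, and injected phase are assumptions.

        # Generic sketch of heterodyne phase readout: demodulate the beat note
        # with in-phase/quadrature references and take the arctangent.
        # All numbers are illustrative assumptions.

        import numpy as np

        fs = 1_000_000        # sample rate [Hz]
        f_beat = 10_000       # heterodyne beat frequency [Hz]
        phi_true = 0.3        # injected phase [rad]

        t = np.arange(100_000) / fs
        signal = np.sin(2 * np.pi * f_beat * t + phi_true)

        # Averaging the mixer outputs acts as a crude low-pass filter.
        i_comp = np.mean(signal * np.sin(2 * np.pi * f_beat * t))
        q_comp = np.mean(signal * np.cos(2 * np.pi * f_beat * t))

        phi_est = np.arctan2(q_comp, i_comp)
        print(f"recovered phase: {phi_est:.4f} rad (true {phi_true} rad)")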

    3D virtual worlds as environments for literacy learning

    Background: Although much has been written about the ways in which new technology might transform educational practice, particularly in the area of literacy learning, there is relatively little empirical work that explores the possibilities and problems - or even what such a transformation might look like in the classroom. 3D virtual worlds offer a range of opportunities for children to use digital literacies in school, and suggest one way in which we might explore changing literacy practices in a playful, yet meaningful context. Purpose: This paper identifies some of the key issues that emerged in designing and implementing virtual world work in a small number of primary schools in the UK. It examines the tensions between different discourses about literacy and literacy learning and shows how these were played out by teachers and pupils in classroom settings. Sources of evidence: Case study data are used as a basis for exploring and illustrating key aspects of design and implementation. The case study material includes views from a number of perspectives, including classroom observations, chatlogs, in-world avatar interviews with teachers and pupils, as well as the author's field notes of the planning process with accompanying minutes and meeting documents. Main argument: From a Foucauldian perspective, the article suggests that social control of pedagogical practice through the regulation of curriculum time, the normalisation of teaching routines and the regimes of individual assessment restricts teachers' and pupils' conceptions of what constitutes literacy. The counternarrative, found in recent work on new literacies (Lankshear & Knobel, 2006), provides an attractive alternative, but a movement in this direction requires a fundamental shift of emphasis and a re-conceptualisation of what counts as learning. Conclusions: This work on 3D virtual worlds questions the notion of how transformative practice can be achieved with the use of new technologies. It suggests that changes in teacher preparation and continuing professional development, as well as wider educational reform, may be needed.