
    Meaning-filled metaphors enabling schools to create enhanced learning cultures

    Get PDF
    It is interesting to speculate on metaphor as an instrument capable of facilitating actions that lead to powerful consequences. Metaphors remain in the consciousness longer than facts, and therefore actions based on specific facts in one context become transferable to another context through the use of metaphoric symbolism. Current research in schools that have undertaken the Innovative Designs for Enhancing Achievements in Schools (IDEAS) improvement process indicates that collectively developed metaphor use has the dynamic power to facilitate cognitive connections across whole school communities. In so doing, schools engaged in the IDEAS process are developing and utilising significant new knowledge for whole-school achievement through cultures of collaboration and commitment. This chapter recognises that when schools are constantly bombarded with the need to undertake substantial changes in practice, the utilisation of a contextual unifying metaphor is capable of assisting widespread and aligned change processes to unfold.

    Guidelines for testing and release procedures

    Get PDF
    Guidelines and procedures are recommended for the testing and release of the types of computer software efforts commonly performed at NASA/Ames Research Center. All recommendations are based on the premise that testing and release activities must be specifically selected for the environment, size, and purpose of each individual software project. Guidelines are presented for building a Test Plan and for using formal Test Plan and Test Case Inspections on it. Frequent references are made to NASA/Ames Guidelines for Software Inspections. Guidelines are presented for selecting an Overall Test Approach and for each of the four main phases of testing: (1) Unit Testing of Components, (2) Integration Testing of Components, (3) System Integration Testing, and (4) Acceptance Testing. Tools used for testing are listed, including those available from operating systems used at Ames, specialized tools which can be developed, unit test drivers, stub module generators, and formal test reporting schemes.
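    As a rough, hypothetical illustration of two of the tools listed above, the Python sketch below shows a unit test driver exercising a small component against a stubbed dependency; the component and its `fuel_estimate` dependency are invented names for this example, not part of the NASA/Ames guidelines.

```python
# Illustrative unit test driver with a stubbed dependency (hypothetical names).
import unittest
from unittest import mock

# Hypothetical unit under test: sums per-leg fuel estimates supplied by an external component.
def total_fuel(legs, fuel_estimate):
    """Return the total fuel for a mission; fuel_estimate is an injected dependency."""
    return sum(fuel_estimate(distance) for distance in legs)

class TotalFuelTest(unittest.TestCase):
    def test_sums_per_leg_estimates(self):
        # The stub stands in for the real fuel-estimation component during unit testing.
        stub = mock.Mock(side_effect=lambda d: 2.0 * d)
        self.assertAlmostEqual(total_fuel([10, 20, 30], stub), 120.0)
        self.assertEqual(stub.call_count, 3)

    def test_empty_mission_needs_no_fuel(self):
        stub = mock.Mock()
        self.assertEqual(total_fuel([], stub), 0)
        stub.assert_not_called()

if __name__ == "__main__":
    unittest.main()  # acts as the test driver for this unit
```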

    Principles of Antifragile Software

    Full text link
    The goal of this paper is to study and define the concept of "antifragile software". For this, I start from Taleb's statement that antifragile systems love errors, and discuss whether traditional software dependability fits into this class. The answer is somewhat negative, although adaptive fault tolerance is antifragile: the system learns something when an error happens, and always improves. Automatic runtime bug fixing changes the code in response to errors; fault injection in production means injecting errors into business-critical software. I claim that both correspond to antifragility. Finally, I hypothesize that antifragile development processes are better at producing antifragile software systems. Comment: see https://refuses.github.io
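    As a rough illustration of the fault-injection idea (my own sketch, not the paper's implementation; all names are hypothetical), the Python snippet below wraps a service call and injects errors with a small probability, while the caller tolerates them and every injected error is recorded, so each failure yields information the system can act on.

```python
# Sketch of fault injection plus error-driven bookkeeping (hypothetical names, not the paper's code).
import random

class FaultInjector:
    """Wraps a callable and injects a RuntimeError with probability p."""
    def __init__(self, func, p=0.2, seed=None):
        self.func = func
        self.p = p
        self.rng = random.Random(seed)
        self.observed_failures = 0  # every injected error is recorded, so the system learns from it

    def __call__(self, *args, **kwargs):
        if self.rng.random() < self.p:
            self.observed_failures += 1
            raise RuntimeError("injected fault")
        return self.func(*args, **kwargs)

def resilient_call(wrapped, *args, retries=4, default=None, **kwargs):
    """Caller that tolerates injected faults by retrying, falling back to a default value."""
    for _ in range(retries):
        try:
            return wrapped(*args, **kwargs)
        except RuntimeError:
            continue
    return default

if __name__ == "__main__":
    service = FaultInjector(lambda x: x * 2, p=0.2, seed=1)
    results = [resilient_call(service, i) for i in range(10)]
    print(results, "| injected failures observed:", service.observed_failures)
```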

    Similar Sublattices and Coincidence Rotations of the Root Lattice A4 and its Dual

    Get PDF
    A natural way to describe the Penrose tiling employs the projection method on the basis of the root lattice A4 or its dual. Properties of these lattices are thus related to properties of the Penrose tiling. Moreover, the root lattice A4 appears in various other contexts such as sphere packings, efficient coding schemes and lattice quantizers. Here, the lattice A4 is considered within the icosian ring, whose rich arithmetic structure leads to parametrisations of the similar sublattices and the coincidence rotations of A4 and its dual lattice. These parametrisations, both in terms of a single icosian, imply an index formula for the corresponding sublattices. The results are encapsulated in Dirichlet series generating functions. For every index, they provide the number of distinct similar sublattices as well as the number of coincidence rotations of A4 and its dual. Comment: 8 pages, paper presented at ICQ10 (Zurich, Switzerland)
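    Schematically, the generating functions mentioned above take the standard Dirichlet-series form; the specific coefficients for A4 and its dual are derived in the paper itself.

```latex
% Schematic Dirichlet series generating function: a(m) denotes the number of
% distinct similar sublattices of A4 of index m (an analogous series, with its
% own coefficients, counts the coincidence rotations).
\Phi(s) \;=\; \sum_{m \ge 1} \frac{a(m)}{m^{s}}
```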

    Entanglement Distillation Protocols and Number Theory

    Full text link
    We show that the analysis of entanglement distillation protocols for qudits of arbitrary dimension D benefits from applying basic concepts from number theory, since the set Z_D^n associated to Bell diagonal states is a module rather than a vector space. We find that a partition of Z_D^n into divisor classes characterizes the invariant properties of mixed Bell diagonal states under local permutations. We construct a very general class of recursion protocols by means of unitary operations implementing these local permutations. We study these distillation protocols depending on whether we use twirling operations in the intermediate steps or not, and we study them both analytically and numerically with Monte Carlo methods. In the absence of twirling operations, we construct extensions of the quantum privacy algorithms valid for secure communications with qudits of any dimension D. When D is a prime number, we show that distillation protocols are optimal both qualitatively and quantitatively. Comment: REVTEX4 file, 7 color figures, 2 tables
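    As a toy illustration of the divisor-class idea (my own sketch, not the paper's construction), the residues of Z_D can be grouped by their greatest common divisor with D; for prime D only the class containing 0 and the class of units remain, which is consistent with the special role of prime D noted above.

```python
# Toy illustration: partition Z_D into divisor classes by gcd with D (not the paper's code).
from math import gcd
from collections import defaultdict

def divisor_classes(D):
    """Group the residues 0..D-1 by gcd(residue, D); one class per divisor of D."""
    classes = defaultdict(list)
    for x in range(D):
        classes[gcd(x, D)].append(x)
    return dict(classes)

if __name__ == "__main__":
    print(divisor_classes(6))  # classes for the divisors 1, 2, 3 and 6 of D = 6
    print(divisor_classes(5))  # prime D: only the class of 0 and the class of units remain
```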

    Space Laser Power Transmission System Studies

    Get PDF
    Power transmission by laser techniques is addressed. Space-to-Earth and space-to-space configurations are considered.

    Fluid-solid transition in hard hyper-sphere systems

    Full text link
    In this work we present a numerical study, based on molecular dynamics simulations, to estimate the freezing point of hard-sphere and hard-hypersphere systems in dimensions D = 4, 5, 6 and 7. We have studied the changes of the Radial Distribution Function (RDF) as a function of density in the coexistence region. We started our simulations from crystalline states with densities above the melting point, and moved down to densities in the liquid state below the freezing point. For all the examined dimensions (including D = 3) it was observed that the height of the first minimum of the RDF changes in an almost continuous way around the freezing density and resembles a second-order phase transition. With these results we propose a numerical method to estimate the freezing point as a function of the dimension D using numerical fits and semiempirical approaches. We find that the estimated values of the freezing point are very close to previously reported values from simulations and theoretical approaches up to D = 6, reinforcing the validity of the proposed method. The method was also applied to numerical simulations for D = 7, giving new estimates of the freezing point for this dimensionality. Comment: 13 pages, 10 figures
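    A small Python sketch of the analysis step described above (an illustration, not the authors' code): given a sampled RDF g(r), locate the first minimum after the main peak and report its height, the quantity whose change across densities is used to bracket the freezing point. The synthetic g(r) below merely stands in for simulation output.

```python
# Illustrative analysis step: height of the first RDF minimum (not the authors' code).
import numpy as np

def first_minimum_height(r, g):
    """Locate the first local minimum of g(r) after the main peak; return (r_min, g_min)."""
    peak = np.argmax(g)                      # index of the main peak
    tail = g[peak:]
    rises = np.where(np.diff(tail) > 0)[0]   # first index after the peak where g starts rising
    i_min = peak + (rises[0] if rises.size else len(tail) - 1)
    return r[i_min], g[i_min]

if __name__ == "__main__":
    # Synthetic g(r) with a damped-oscillation shape, standing in for simulation output.
    r = np.linspace(0.9, 4.0, 400)
    g = 1.0 + 0.9 * np.exp(-(r - 1.1)) * np.cos(2 * np.pi * (r - 1.1))
    r_min, g_min = first_minimum_height(r, g)
    print(f"first minimum at r = {r_min:.2f}, g = {g_min:.2f}")
```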

    Professional boundaries: research report

    Get PDF
    In 2008 the General Social Care Council (GSCC) published Raising standards: Social work conduct in England 2003-2008. This constituted the GSCC’s first report covering the work undertaken to uphold standards and protect people who use social care services. The GSCC’s analysis revealed that a considerable proportion of conduct cases, some 40%, involved allegations of 'inappropriate relations'. In the light of this finding, and the release by the Council for Healthcare Regulatory Excellence (CHRE) of sexual boundaries guidance for healthcare workers at the beginning of this year (Halter et al, 2009), the GSCC committed itself to exploring the possibility of producing professional boundaries guidance for social workers. To begin this exploration, the GSCC commissioned a study in early 2009. This is the report of that study. There were two main purposes: first, to establish what professional boundaries guidance currently exists for social workers, or for sections of the workforce that include social workers in the United Kingdom, and the content of any such guidance; second, to identify and discuss a number of other examples of professional boundaries guidance to act as points of reference for the GSCC’s project. The aim was to identify and discuss examples relevant to the GSCC’s project.

    The RAG Model: a new paradigm for genetic risk stratification in multiple myeloma

    Get PDF
    Molecular studies have shown that multiple myeloma is a highly genetically heterogeneous disease which may manifest itself as any number of diverse subtypes, each with variable clinicopathological features and outcomes. Given this genetic heterogeneity, a universal approach to treatment of myeloma is unlikely to be successful for all patients, and instead we should strive for the goal of personalised therapy using rationally informed targeted strategies. Current DNA sequencing technologies allow for whole-genome and exome analysis of patient myeloma samples that yields vast amounts of genetic data and provides a mutational overview of the disease. However, the clinical utility of this information currently lags far behind the sequencing technology, which is increasingly being incorporated into clinical practice. This paper attempts to address this shortcoming by proposing a novel genetically based “traffic-light” risk stratification system for myeloma, termed the RAG (Red, Amber, Green) model, which represents a simplified concept of how complex genetic data may be compressed into an aggregate risk score. The model aims to incorporate all known clinically important trisomies, translocations, and mutations in myeloma and utilise these to produce a score between 1.0 and 3.0 that can be incorporated into diagnostic, prognostic, and treatment algorithms for the patient.
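    As a purely illustrative sketch of the aggregation concept (the lesion weights and thresholds below are invented for demonstration and carry no clinical meaning; the actual lesion list and scoring rules are those proposed in the paper), a 1.0-3.0 score could be computed by averaging per-lesion weights and mapping the result to a traffic-light category.

```python
# Toy illustration of a traffic-light aggregate score; weights and cut-offs are invented, not clinical guidance.
# Hypothetical per-lesion risk weights on a 1.0 (green) .. 3.0 (red) scale.
LESION_WEIGHTS = {
    "t(4;14)": 3.0,
    "del(17p)": 3.0,
    "gain(1q)": 2.0,
    "t(11;14)": 1.0,
    "hyperdiploidy": 1.0,
}

def rag_score(lesions):
    """Average the weights of the detected lesions, clamped to the 1.0-3.0 range."""
    if not lesions:
        return 1.0  # no adverse lesion detected: green by default in this toy model
    weights = [LESION_WEIGHTS.get(l, 2.0) for l in lesions]  # unknown lesions default to amber
    return max(1.0, min(3.0, sum(weights) / len(weights)))

def traffic_light(score):
    """Map the aggregate score to a category using invented cut-offs."""
    return "Green" if score < 1.7 else ("Amber" if score < 2.4 else "Red")

if __name__ == "__main__":
    s = rag_score(["t(4;14)", "gain(1q)"])
    print(f"score = {s:.1f}, category = {traffic_light(s)}")
```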